【pytorch】nn.AdaptiveMaxPool2d
2022-07-01 09:08:00 【Enzo tried to smash the computer】
output_size – the target output size of the image of the form H x W. Can be a tuple (H, W) or a single H for a square output H x H. H and W can each be either an int, or None, which means that dimension keeps the same size as the input.
return_indices – if True, will return the indices along with the outputs. Useful to pass to nn.MaxUnpool2d. Default: False
import torch
import torch.nn as nn

# target output size of 5x7
m = nn.AdaptiveMaxPool2d((5, 7))
input = torch.randn(1, 64, 8, 9)
output = m(input)
print(output.shape)  # torch.Size([1, 64, 5, 7])
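The return_indices option described above is mainly useful together with nn.MaxUnpool2d. A minimal sketch (the shapes here are made up for illustration and chosen so that the 4x4 input splits evenly into the 2x2 output, i.e. each window is an ordinary 2x2 block):

import torch
import torch.nn as nn

# return_indices=True also returns the flattened position of each max in the
# input feature map, which nn.MaxUnpool2d can use to scatter the values back.
pool = nn.AdaptiveMaxPool2d((2, 2), return_indices=True)
unpool = nn.MaxUnpool2d(kernel_size=2, stride=2)

x = torch.randn(1, 1, 4, 4)
out, idx = pool(x)                                 # out: [1, 1, 2, 2]
restored = unpool(out, idx, output_size=x.size())
print(restored.shape)  # torch.Size([1, 1, 4, 4]); maxima kept, other positions zero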
# target output size of 7x7 (square)
m = nn.AdaptiveAvgPool2d(7)
input = torch.randn(1, 64, 10, 9)
print(m(input).shape) # [1, 64, 7, 7]
# target output size of 10x7
m = nn.AdaptiveAvgPool2d((None, 7))
input = torch.randn(1, 64, 10, 9)
output = m(input)
print(output.shape)  # torch.Size([1, 64, 10, 7])
import torch
import torch.nn as nn

input = torch.tensor([[[[ 2,  4,  8, 15],
                        [ 3,  6,  9, 19],
                        [ 7, 22,  5, 12],
                        [ 1, 66,  1, 77]]]], dtype=torch.float64)
m = nn.AdaptiveMaxPool2d((3, 3))
output = m(input)
print(input)
print(output)
# The zero border below was added by hand purely to make the layout easier to
# read; it is not produced by the pooling layer (print(input) shows the
# original 4x4 tensor):
# tensor([[[[ 0.,  0.,  0.,  0.,  0.,  0.],
#           [ 0.,  2.,  4.,  8., 15.,  0.],
#           [ 0.,  3.,  6.,  9., 19.,  0.],
#           [ 0.,  7., 22.,  5., 12.,  0.],
#           [ 0.,  1., 66.,  1., 77.,  0.],
#           [ 0.,  0.,  0.,  0.,  0.,  0.]]]], dtype=torch.float64)
#
# print(output):
# tensor([[[[ 6.,  9., 19.],
#           [22., 22., 19.],
#           [66., 66., 77.]]]], dtype=torch.float64)
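To see where these values come from (and why 22 and 66 each appear twice), note that adaptive pooling derives the window for output index i from the input/output sizes, roughly start = floor(i * in_size / out_size) and end = ceil((i + 1) * in_size / out_size) per dimension, so neighbouring windows can overlap. Below is a minimal sketch that reproduces the 3x3 result by hand under that assumption (the helper name is made up; it reuses the input tensor defined above):

import math
import torch

def adaptive_max_pool2d_manual(x, out_h, out_w):
    # Hand-rolled adaptive max pooling: each output cell takes the max over a
    # window whose bounds are derived from the input/output size ratio.
    n, c, in_h, in_w = x.shape
    out = x.new_zeros((n, c, out_h, out_w))
    for i in range(out_h):
        h0, h1 = (i * in_h) // out_h, math.ceil((i + 1) * in_h / out_h)
        for j in range(out_w):
            w0, w1 = (j * in_w) // out_w, math.ceil((j + 1) * in_w / out_w)
            out[:, :, i, j] = x[:, :, h0:h1, w0:w1].amax(dim=(-2, -1))
    return out

print(adaptive_max_pool2d_manual(input, 3, 3))
# tensor([[[[ 6.,  9., 19.],
#           [22., 22., 19.],
#           [66., 66., 77.]]]], dtype=torch.float64)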
The advantages of global average pooling (GAP, Global Average Pooling) are:
The connection between categories and feature maps is more intuitive (compared with the black box of a fully connected layer), and the feature maps are easier to convert into classification probabilities. Since GAP has no parameters to learn, it also avoids overfitting.
GAP aggregates spatial information, so it is more robust to spatial transformations of the input. For this reason, the fully connected layers at the end of convolutional networks are now mostly replaced by GAP.
Keras has dedicated global pooling layers, such as GlobalMaxPooling2D for global max pooling. PyTorch has no corresponding layer, but the same effect can be achieved with its adaptive pooling layers, i.e. AdaptiveMaxPool2d(1) or AdaptiveAvgPool2d(1).
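For example, a minimal sketch of global pooling in PyTorch (the shapes below are made up; a typical use is to flatten the result before a classifier head):

import torch
import torch.nn as nn

x = torch.randn(8, 512, 7, 7)      # e.g. a backbone feature map (N, C, H, W)

gap = nn.AdaptiveAvgPool2d(1)      # global average pooling -> (N, C, 1, 1)
gmp = nn.AdaptiveMaxPool2d(1)      # global max pooling     -> (N, C, 1, 1)

print(gap(x).shape)                # torch.Size([8, 512, 1, 1])
print(gmp(x).shape)                # torch.Size([8, 512, 1, 1])

# Flatten to (N, C) before a classification head such as nn.Linear(512, num_classes)
feat = torch.flatten(gap(x), 1)
print(feat.shape)                  # torch.Size([8, 512])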