【pytorch】nn.AdaptiveMaxPool2d
2022-07-01 09:08:00 【Enzo tried to smash the computer】
output_size – the target output size of the image, of the form H x W. Can be a tuple (H, W) or a single H for a square image H x H. H and W can be either an int, or None, which means the size will be the same as that of the input.
return_indices – if True, will return the indices along with the outputs. Useful to pass to nn.MaxUnpool2d. Default: False
import torch
import torch.nn as nn

m = nn.AdaptiveMaxPool2d((5, 7))
input = torch.randn(1, 64, 8, 9)
output = m(input)
print(output.shape)  # torch.Size([1, 64, 5, 7])
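return_indices is useful when the pooling has to be undone later. A minimal sketch, assuming a 4x4 input pooled to 2x2 so that the adaptive windows coincide with the kernel_size=2, stride=2 grid that nn.MaxUnpool2d inverts:

pool = nn.AdaptiveMaxPool2d((2, 2), return_indices=True)
unpool = nn.MaxUnpool2d(kernel_size=2, stride=2)
x = torch.randn(1, 1, 4, 4)
out, idx = pool(x)                                # idx holds the flat position of each max
restored = unpool(out, idx, output_size=x.shape)  # zeros everywhere except the max positions
print(restored.shape)  # torch.Size([1, 1, 4, 4])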
# target output size of 7x7 (square)
m = nn.AdaptiveMaxPool2d(7)
input = torch.randn(1, 64, 10, 9)
print(m(input).shape)  # torch.Size([1, 64, 7, 7])
# target output size of 10x7 (None keeps the input size in that dimension)
m = nn.AdaptiveMaxPool2d((None, 7))
input = torch.randn(1, 64, 10, 9)
output = m(input)
print(output.shape)  # torch.Size([1, 64, 10, 7])
import torch
import torch.nn as nn

input = torch.tensor([[[[ 2,  4,  8, 15],
                        [ 3,  6,  9, 19],
                        [ 7, 22,  5, 12],
                        [ 1, 66,  1, 77]]]], dtype=torch.float64)
m = nn.AdaptiveMaxPool2d((3, 3))
output = m(input)
print(input)
print(output)
# No padding is involved here. For an input of size S pooled down to size T,
# the i-th output window covers input indices floor(i*S/T) up to ceil((i+1)*S/T).
# For 4 -> 3 this yields the overlapping windows (0,1), (1,2), (2,3) along each
# dimension, and taking the max over each 2x2 window gives:
# tensor([[[[ 6.,  9., 19.],
#           [22., 22., 19.],
#           [66., 66., 77.]]]], dtype=torch.float64)
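A small sketch of that window arithmetic (the helper below is mine, not part of PyTorch, but the floor/ceil formula matches how adaptive pooling places its windows):

import math

def adaptive_windows(in_size, out_size):
    # (start inclusive, end exclusive) of each pooling window
    return [(math.floor(i * in_size / out_size),
             math.ceil((i + 1) * in_size / out_size))
            for i in range(out_size)]

print(adaptive_windows(4, 3))  # [(0, 2), (1, 3), (2, 4)]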
The advantages of global average pooling (GAP, Global Average Pooling) are:
The connection between categories and feature maps is more direct (compared with the black box of a fully connected layer), and a feature map is easier to convert into a classification probability. Since GAP has no parameters to tune, it also avoids overfitting.
GAP aggregates spatial information, so it is more robust to spatial transformations of the input. For these reasons, the trailing fully connected layers of convolutional networks are now mostly replaced by GAP.
Keras has dedicated layers for global pooling, such as GlobalMaxPooling2D for global max pooling. PyTorch has no corresponding pooling layer, but the same effect can be achieved with its adaptive pooling layers: AdaptiveMaxPool2d(1) or AdaptiveAvgPool2d(1).
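A minimal sketch of global pooling in PyTorch (the feature-map shape here is made up for illustration):

import torch
import torch.nn as nn

x = torch.randn(8, 512, 7, 7)  # e.g. a backbone feature map

gap = nn.AdaptiveAvgPool2d(1)  # global average pooling
gmp = nn.AdaptiveMaxPool2d(1)  # global max pooling

print(gap(x).shape)  # torch.Size([8, 512, 1, 1])
print(gmp(x).shape)  # torch.Size([8, 512, 1, 1])

# flatten before a classifier head
features = gap(x).flatten(1)  # torch.Size([8, 512])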