6. Dimension transformation and broadcasting
2022-07-27 05:59:00 【Pie star's favorite spongebob】
Dimensional transformation
One. .view()
The new shape must keep a sensible physical meaning.
Commonly used before fully connected layers, which take flattened data.
The new tensor returned by view must contain the same total number of elements as the original.
import torch

a=torch.rand(4,1,28,28)
print(a.shape)
b=a.view(4,28*28)
c=a.view(4*1,28*28)
d=a.view(4*1*28,28)
print(b)
print(b.shape)
print(c.shape)
print(d.shape)

The fatal flaw
The original dimension information is lost: [b,c,h,w] -> [b,a]. The flattened tensor cannot be directly restored, nor can the original dimension information be read back from it; the original shape must be stored separately.
b=a.view(4,28*28)
print(b.view(4,28,28,1))
We can call view(4,28,28,1) on b, but without knowing a's original shape there is no way to recover a.
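Since view discards the original shape, recovery is possible only if that shape is stored separately. A minimal sketch of this bookkeeping, reusing the same a as above:

```python
import torch

a = torch.rand(4, 1, 28, 28)
orig_shape = a.shape            # store the original shape before flattening
b = a.view(4, 28 * 28)          # flatten: [4,1,28,28] -> [4,784]
recovered = b.view(orig_shape)  # only possible because orig_shape was kept

print(recovered.shape)          # torch.Size([4, 1, 28, 28])
```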
Two. .squeeze()/.unsqueeze()
1. squeeze: reduce dimensions
Removes dimensions of size 1.
The valid index range is [-a.dim(), a.dim()); both positive and negative indices are accepted, and both refer to the same dimension. Only a dimension whose size is 1 can be squeezed; if the size is not 1, no error is raised and the tensor is returned unchanged.
With an argument
a=torch.rand([1,32,1,1])
print(a.shape)
b=a.squeeze(0)
c=a.squeeze(-1)
d=a.squeeze(1)
print(b.shape)
print(c.shape)
print(d.shape)
Dimension 1 has size 32, not 1, so squeeze(1) returns the tensor unchanged without raising an error.
Without an argument
With no argument, every dimension of size 1 is squeezed.
a=torch.rand([1,32,1,1])
print(a.shape)
b=a.squeeze()
print(b.shape)

2. unsqueeze: add a dimension
Inserts a dimension of size 1 without adding any data; the meaning of the new dimension is up to you.
The valid index range is [-a.dim()-1, a.dim()+1). Both positive and negative indices are accepted: a positive index i inserts the new size-1 dimension before position i, while a negative index inserts it after the corresponding position.
A size-1 dimension added at the front can be read as a batch (group) dimension: no data is added, only a concept.
A size-1 dimension added at the back is created out of nothing and can be given any meaning.
Positive indices cover every case; negative indices are not recommended.
Index within range
a=torch.rand(4,1,28,28)
print(a.shape)
print(a.unsqueeze(0).shape)
print(a.unsqueeze(1).shape)
print(a.unsqueeze(4).shape)
print(a.unsqueeze(-1).shape)
print(a.unsqueeze(-4).shape)
print(a.unsqueeze(-5).shape)
For this case, (4,1,28,28), the valid index range is [-5,5). unsqueeze(4) and unsqueeze(-1) have the same effect: both insert a size-1 dimension at the end. Likewise, unsqueeze(0) and unsqueeze(-5) both insert one at the front.
a=torch.tensor([1.2,3.2])
print(a.shape)
print(a)
b=a.unsqueeze(0)
c=a.unsqueeze(-1)
print(b.shape)
print(b)
print(c.shape)
print(c)

Out of range
print(a.unsqueeze(5).shape)  # IndexError: dimension out of range

Three. .expand()/.repeat()
1. expand: expansion (recommended)
Only changes how the data is interpreted; no data is added.
The arguments give the target shape after expansion.
expand and repeat produce the same logical result, but expand does not eagerly copy data; copies happen only when actually needed, so the copying step is otherwise skipped.
a=torch.rand([1,32,1,1])
b=a.expand(4,32,14,14)
print(b.shape)
The number of dimensions must match before and after expansion. A dimension whose size is 1 can be expanded to n; a dimension whose size is x (x > 1) cannot be expanded to a different size y.
a=torch.rand([1,32,1,1])
c=a.expand(-1,32,-1,-1)
print(c.shape)
print(a.expand(-1,32,-1,-4).shape)
-1 means keep that dimension's size unchanged. Passing -4 used to produce a size of -4 in the resulting shape; this was a bug, and it has been fixed upstream in newer PyTorch releases (it now raises an error).
2. repeat: repetition
Also an expansion, but the data is actually copied, so the underlying storage changes.
Each argument gives the number of times to copy along that dimension (not the target size).
a=torch.rand([1,32,1,1])
b=a.repeat(4,1,4,4)
print(b.shape)
The storage grows accordingly.
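The storage difference can be observed directly: expand returns a view whose expanded dimensions have stride 0 and which shares the original memory, while repeat allocates new storage. A small sketch:

```python
import torch

a = torch.rand(1, 32, 1, 1)

# expand: a view, no copy -- expanded dims get stride 0
b = a.expand(4, 32, 14, 14)
print(b.stride())                    # (0, 1, 0, 0): no real data along those dims
print(b.data_ptr() == a.data_ptr())  # True: same underlying memory

# repeat: an actual copy into new, larger storage
c = a.repeat(4, 1, 14, 14)
print(c.shape)                       # torch.Size([4, 32, 14, 14])
print(c.data_ptr() == a.data_ptr())  # False: separate memory
```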

Four. .transpose()/.t()/.permute()
1. .t: transpose
Transpose; only applies to 2D tensors, i.e. matrices.
a=torch.rand(2,3)
print(a)
print(a.t())

Only suitable for 2D matrices; calling .t() on a higher-dimensional tensor raises an error.
a=torch.rand(2,3,3)
print(a)
print(a.t())  # RuntimeError: t() expects a tensor with <= 2 dimensions

2. transpose
Swaps two dimensions: transpose(1,3) maps [b,c,h,w] -> [b,w,h,c]. It is as if the swapped part were flattened together and then unfolded in the new order.
Tip: after a transpose, the dimension order no longer matches the storage order, so you must keep track of which layout the data is in.
a=torch.rand(1,2,3,4)
b=a.transpose(1,2)
print(b.shape)

contiguous: make the memory layout contiguous again. A transposed tensor is no longer contiguous, so view cannot be applied to it directly; call .contiguous() first to copy the data into the new order. You can check with torch.all(torch.eq(a, b)) that all contents are unchanged after such a round trip.
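A short sketch of why contiguous matters: after transpose the memory order no longer matches the shape, so view fails until the data is copied into the new order.

```python
import torch

a = torch.rand(4, 3, 32, 32)
b = a.transpose(1, 3)            # [4, 32, 32, 3]; memory layout unchanged
print(b.is_contiguous())         # False: shape order != storage order

# view needs contiguous memory; contiguous() copies the data into the new order
c = b.contiguous().view(4, 32 * 32 * 3)
print(c.shape)                   # torch.Size([4, 3072])
```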
3. permute
The arguments are indices into the original tensor's dimensions; it is equivalent to any number of transposes.
Tip: [b,h,w,c] is the layout numpy uses to store images, so this step is needed before exporting to numpy.
a=torch.rand(1,2,3,4)
b=a.transpose(1,2)
c=a.permute(0,2,1,3)
print(b.shape)
print(c.shape)
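As an illustration of the numpy export mentioned above (a sketch, assuming a batch of 3-channel images):

```python
import torch

a = torch.rand(4, 3, 28, 28)    # [b, c, h, w], PyTorch image layout
imgs = a.permute(0, 2, 3, 1)    # [b, h, w, c], numpy image layout
np_imgs = imgs.numpy()          # shares memory; numpy handles the strides

print(np_imgs.shape)            # (4, 28, 28, 3)
```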

Broadcasting: automatic expansion
Works like expand, but is applied automatically.
No data needs to be copied.
It can be understood as an implicit unsqueeze that inserts missing dimensions, followed by an implicit expand that (logically) copies the data.
Key points: first, missing leading dimensions are inserted at the front with size 1; second, every dimension of size 1 is then expanded to the corresponding size of A.
feature maps: A [4,32,14,14]
bias: B [32,1,1] (the two trailing 1s must be inserted manually)
Broadcasting inserts a size-1 dimension at the front: B becomes [1,32,1,1]
The size-1 dimensions then expand to match A: [4,32,14,14]
why broadcasting
1. Practical need
Take a tensor [class, student, score], e.g. A of shape [4,32,8].
To add 5 points to every score, i.e. bias = 5:
5 starts with shape [1]; unsqueeze(0).unsqueeze(0) turns it into [1,1,1], and expand_as(A) turns that into [4,32,8].
Automatic expansion saves you from writing this code by hand; worse, hand-written versions often use repeat instead of expand, which greatly increases the required storage.
2. Storage cost
Storing the single value 5 takes one element; repeating it to [4,32,8] would take 4*32*8 = 1024 elements.
When to use
Example
Shapes are matched starting from the last dimension: the small tensor specifies its trailing dimensions, and they must line up with the large tensor's.
If leading dimensions are missing, broadcasting inserts size-1 dimensions at the front and expands them to the same size; every other dimension must already match or have size 1.
For example, with A of shape [4,32,8]:
bonus scores (0,0,0,5) have length 4, i.e. shape [4]. But A's last dimension holds 8 courses, and a bias covering only 4 exam results does not line up, so it cannot be expanded.
For example, [4,32,14,14]:
against [2,32,14,14], the leading sizes 4 and 2 differ and neither is 1, so broadcasting fails.
Manual handling is required in such cases.
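The matching rule above can be sketched in plain Python (broadcast_shape is a hypothetical helper for illustration, not a PyTorch API): align shapes from the last dimension, insert leading 1s, then each pair of sizes must be equal or contain a 1.

```python
def broadcast_shape(big, small):
    """Sketch of the broadcasting rule: return the result shape or raise."""
    big, small = list(big), list(small)
    # Step 1: insert size-1 dimensions at the front of the smaller shape
    while len(small) < len(big):
        small.insert(0, 1)
    out = []
    # Step 2: a size-1 dimension expands to match; unequal non-1 sizes fail
    for d1, d2 in zip(big, small):
        if d1 == d2 or d1 == 1 or d2 == 1:
            out.append(max(d1, d2))
        else:
            raise ValueError(f"cannot broadcast sizes {d1} and {d2}")
    return tuple(out)

print(broadcast_shape((4, 32, 14, 14), (32, 1, 1)))  # (4, 32, 14, 14)
print(broadcast_shape((4, 32, 8), (1,)))             # (4, 32, 8)
```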