"Torch" tensor multiplication: matmul, einsum
2022-08-01 20:01:00 【panbaoran913】
Reference post: 《张量相乘matmul函数》 (Tensor multiplication: the matmul function)
1. torch.matmul
matmul(input, other, out=None) computes the matrix product of the two tensors input and other. torch.matmul has several overloaded behaviors, chosen according to the dimensionality of the tensors passed in.
When multiplying tensors, the operation is therefore not restricted to the standard $(m,n) \times (n,l) = (m,l)$ form.
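To make the dispatch rules concrete, here is a minimal sketch (using torch.randn so the tensors are initialized, unlike the uninitialized FloatTensor buffers used in the examples below):
import torch

a = torch.randn(4)            # 1D
m = torch.randn(3, 4)         # 2D
batch = torch.randn(5, 3, 4)  # 3D (a batch of five 3x4 matrices)

torch.matmul(a, a).shape        # torch.Size([]): 1D x 1D is a dot product (0-dim result)
torch.matmul(m, a).shape        # (3,): 2D x 1D is a matrix-vector product
torch.matmul(m, m.T).shape      # (3, 3): 2D x 2D is ordinary matrix multiplication
torch.matmul(batch, m.T).shape  # (5, 3, 3): the 2D operand is broadcast across the batch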
3. Multiplying 1D and 2D tensors
3.1 1D times 2D: $(m) \times (m,n) = (n)$
A1 = torch.FloatTensor(size=(4,))   # 1D tensor, shape (4,)
A2 = torch.FloatTensor(size=(4,3))  # 2D tensor, shape (4, 3)
A12 = torch.matmul(A1, A2)
A12.shape  # (3,)
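For completeness, the same 1D × 2D product can be written with einsum (a small sketch; the comparison count assumes the uninitialized tensors happen to contain no nan values):
A12_ = torch.einsum("i,ij->j", A1, A2)
torch.sum(A12 == A12_)  # 3 if no nan values are present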
3.2 2D times 1D: $(m,n) \times (n) = (m)$
A3 = torch.FloatTensor(size=(3,4))  # 2D tensor, shape (3, 4)
A31 = torch.matmul(A3, A1)          # A1 has shape (4,)
A31.shape  # (3,)
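The matching einsum form for the 2D × 1D case (again a sketch, with the same nan caveat):
A31_ = torch.einsum("ij,j->i", A3, A1)
torch.sum(A31 == A31_)  # 3 if no nan values are present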
4. Multiplying 2D and 3D tensors
4.1 2D times 3D: $(m,n) \times (b,n,l) = (b,m,l)$. The expanded (broadcast) scheme is $(b,m,n) \times (b,n,l) = (b,m,l)$.
B1 = torch.FloatTensor(size=(2,3))    # 2D tensor
B2 = torch.FloatTensor(size=(5,3,4))  # 3D tensor (batch of 5 matrices)
B12 = torch.matmul(B1, B2)
B12.shape  # (5, 2, 4)
Equivalent einsum form:
B12_ = torch.einsum("ij,bjk->bik", B1, B2)
torch.sum(B12 == B12_)  # 40 = 2*4*5
4.2 3D times 2D: $(b,m,n) \times (n,l) = (b,m,l)$
B2 = torch.FloatTensor(size=(5,3,4))
B3 = torch.FloatTensor(size=(4,2))
B23 = torch.matmul(B2, B3)
B23.shape  # (5, 3, 2)
Equivalent einsum form:
BB23_ = torch.einsum("bij,jk->bik", [B2, B3])
BB23_.shape  # (5, 3, 2)
torch.sum(B23 == BB23_)  # 30 = 5*3*2
4.3 Expanding the 2D tensor to 3D
Approach 1: expand the first tensor from 2D to 3D, i.e. B1 (2,3) → B11 (5,2,3).
B1 = torch.FloatTensor(size=(2,3))
B1_ = torch.unsqueeze(B1, dim=0)  # add a leading batch dimension
print(B1_.shape)  # torch.Size([1, 2, 3])
B11 = torch.cat([B1_, B1_, B1_, B1_, B1_], dim=0)  # repeat along the batch dimension
print(B11.shape)  # torch.Size([5, 2, 3])
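The same replication can be written more compactly with expand, which returns a broadcasted view without copying data (a sketch; B11_alt is a name introduced here for illustration):
B11_alt = B1.unsqueeze(0).expand(5, -1, -1)  # shape (5, 2, 3), no data copy
torch.equal(B11_alt, B11)  # True, assuming B1 contains no nan values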
Compare the result of $B1(2,3) \times B2(5,3,4)$ with that of $B11(5,2,3) \times B2(5,3,4)$:
B112 = torch.matmul(B11, B2)  # (5,2,3) x (5,3,4)
torch.sum(B112 == B12)  # 40 = 5*2*4
This shows that the two results are exactly equal. To dig a little further into how the multiplication works, we multiply B1 (2,3) with the first matrix in B2 (5,3,4) and check whether it equals the first matrix in B112. The following confirms they are the same:
B12_0 = torch.matmul(B1, B2[0])
B112[0] == B12_0
out:
tensor([[True, True, True, True],
[True, True, True, True]])
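The same slice-by-slice check can be run over every batch index (a small sketch; with no nan values every slice should match):
all(torch.equal(torch.matmul(B1, B2[i]), B112[i]) for i in range(5))  # True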
[Figure: diagram of 2D multiplied by 3D matrix multiplication]
Approach 2: expand the second tensor from 2D to 3D, i.e. B3 (4,2) → B33 (5,4,2).
B3_ = torch.unsqueeze(B3, dim=0)
print(B3_.shape)  # (1, 4, 2)
B33 = torch.cat([B3_, B3_, B3_, B3_, B3_], dim=0)
print(B33.shape)  # (5, 4, 2)
B233 = torch.matmul(B2, B33)
print(B233.shape)  # (5, 3, 2)
Compare the results of the two multiplications:
print(torch.sum(B233 == BB23_))  # 30
print(torch.sum(B233 == B23))    # 30
Reminder: if nan values show up in the uninitialized torch.FloatTensor buffers, the elementwise comparison reports those positions as unequal, because nan is never equal to nan.
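A short illustration of this pitfall, together with a nan-tolerant comparison (x is a small tensor introduced here for illustration):
x = torch.tensor([1.0, float("nan")])
print(x == x)                                # tensor([ True, False]): nan is never equal to nan
print(torch.allclose(x, x, equal_nan=True))  # True: treat nan positions as equal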
5. Multiplying 2D and 4D tensors
5.1 2D times 4D: $(m,n) \times (b,c,n,l) = (b,c,m,l)$
B1 = torch.FloatTensor(size=(2,3))
B4 = torch.FloatTensor(size=(7,5,3,4))
B14 = torch.matmul(B1, B4)
print(B14.shape)  # (7, 5, 2, 4)
Equivalent einsum form:
B14_ = torch.einsum("mn,bcnl->bcml", [B1, B4])
print(torch.sum(B14 == B14_))  # 280 = 7*5*2*4
Manual dimension expansion (broadcast B1 up to 4D by hand):
B11 = torch.unsqueeze(B1, dim=0)
B11 = torch.concat([B11, B11, B11, B11, B11], dim=0)
print(B11.shape)  # (5, 2, 3)
B111 = torch.unsqueeze(B11, dim=0)
B111 = torch.concat([B111, B111, B111, B111, B111, B111, B111], dim=0)
print(B111.shape)  # (7, 5, 2, 3)
4D times 4D after broadcasting:
B1114 = torch.matmul(B111, B4)
print(B1114.shape)  # (7, 5, 2, 4)
print(torch.sum(B1114 == B14))  # 280
5.2 4D times 2D: $(b,c,n,l) \times (l,p) = (b,c,n,p)$
4D times 2D:
B43 = torch.matmul(B4, B3)
print("B43 shape", B43.shape)  # (7, 5, 3, 2)
Equivalent einsum form:
B43_ = torch.einsum("bcnl,lp->bcnp", [B4, B3])
print("B4 is nan", torch.sum(B4.isnan()))  # 0
print(torch.sum(B43 == B43_))  # 210 = 7*5*3*2
Manual dimension expansion:
B33 = torch.unsqueeze(B3, dim=0)
B33 = torch.concat([B33, B33, B33, B33, B33], dim=0)
B333 = torch.unsqueeze(B33, dim=0)
B333 = torch.concat([B333, B333, B333, B333, B333, B333, B333], dim=0)
print("B333 shape is", B333.shape)  # (7, 5, 4, 2)
4D times 4D after broadcasting:
B4333 = torch.matmul(B4, B333)
print("B4333 shape is", B4333.shape)  # (7, 5, 3, 2)