[DL] Introduction and understanding of tensors
2022-07-29 06:00:00 【Machines don't learn I learn】
1. Preface
In deep learning we inevitably run into the term tensor. One-dimensional and two-dimensional tensors are easy to understand, but how should we think about three-dimensional, four-dimensional, ..., n-dimensional ones? Below we use the PyTorch deep learning framework as an example and walk through it in detail.
2. One-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.tensor([1,2,3,4])
print(a)
print(a.shape)
Output:
tensor([1, 2, 3, 4])
torch.Size([4])
The shape has only one dimension, so this is a one-dimensional tensor.
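As a quick check, you can also ask the tensor for its dimensionality directly instead of reading it off the printed shape; a minimal sketch using the standard dim() method and ndim attribute:
import torch

a = torch.tensor([1, 2, 3, 4])
print(a.dim())  # 1 -- the number of dimensions
print(a.ndim)   # 1 -- the same value, exposed as an attribute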
3. Two-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.tensor([[1,2,3,4]])
print(a)
print(a.shape)
Output:
tensor([[1, 2, 3, 4]])
torch.Size([1, 4])
The output above has both rows and columns, so it is a two-dimensional tensor; in effect it is just a two-dimensional matrix.
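Note that this 1×4 tensor holds the same four numbers as the one-dimensional example; only the shape differs. A minimal sketch of moving between the two forms, using the standard unsqueeze and squeeze calls:
import torch

a = torch.tensor([1, 2, 3, 4])  # shape [4]
b = a.unsqueeze(0)              # shape [1, 4]: same data, plus a size-1 leading dimension
print(b.shape)                  # torch.Size([1, 4])
print(b.squeeze(0).shape)       # torch.Size([4]): back to one dimension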
4. Three-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.ones(1,3,3)
print(a)
print(a.shape)
Output:
tensor([[[1., 1., 1.],
[1., 1., 1.],
[1., 1., 1.]]])
torch.Size([1, 3, 3])
From the output we can see that this tensor has three dimensions, with an extra leading dimension of size 1. It is hard to see from the printout alone what this 1 actually contributes.
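One way to convince yourself that a size-1 leading dimension adds no data is to remove it; a minimal sketch using the standard squeeze call:
import torch

a = torch.ones(1, 3, 3)
b = a.squeeze(0)  # drop the size-1 leading dimension; the nine values are unchanged
print(b.shape)    # torch.Size([3, 3])
To see intuitively what a leading dimension represents when it is larger than 1, let's look at another tensor: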
import torch  # version: 1.8.0+cpu
a = torch.ones(3,4,5)
print(a)
print(a.shape)
Output:
tensor([[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]]])
torch.Size([3, 4, 5])
From the output above you can see intuitively what the leading dimension (the number 3) represents:
The first number 3: the tensor is split into 3 big blocks
The second number 4: each block contains 4 rows
The third number 5: each row contains 5 columns
So the shape of the data is 3×4×5, where the last number is always the column dimension; we can also read it as 3 matrices of 4 rows and 5 columns each.
If we compare this tensor to an RGB image, the number 3 represents the 3 channels, and each channel is 4 rows by 5 columns. Indexing along the leading dimension picks out one channel, as the sketch below shows.
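A minimal sketch of slicing along the leading dimension, using plain integer indexing:
import torch

a = torch.ones(3, 4, 5)
print(a[0].shape)     # torch.Size([4, 5]): one "channel", a 4-row, 5-column matrix
print(a[0, 1].shape)  # torch.Size([5]):    row 1 of that channel
print(a[0, 1, 2])     # tensor(1.):         a single element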
5. Four-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.ones(2,3,4,5)
print(a)
print(a.shape)
Output:
tensor([[[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]]],
[[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]]]])
torch.Size([2, 3, 4, 5])
If you draw a dividing line through the middle of the output above, the part above the line is one 3×4×5 block and the part below it is another, so the leading dimension here is 2.
Put more plainly: tensor a has 2 big blocks; each block is split into 3 sub-blocks, each sub-block into 4 rows, and each row into 5 columns.
Tensor a can also be understood in terms of an everyday image dataset (a usage sketch follows this list):
The first number 2: this is the batchsize, i.e. the tensor holds 2 input images
The second number 3: each image has 3 channels
The third number 4: the image height is 4
The fourth number 5: the image width is 5
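This batch × channels × height × width layout is exactly what PyTorch's 2-D convolution layers consume. A minimal sketch (the layer sizes here are made-up illustration values, not from the original article):
import torch
import torch.nn as nn

batch = torch.ones(2, 3, 4, 5)  # 2 images, 3 channels, height 4, width 5
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)  # hypothetical layer for illustration
out = conv(batch)
print(out.shape)  # torch.Size([2, 8, 2, 3]): batch size preserved, 8 feature maps, spatially shrunk by the 3x3 kernel
The batch dimension passes through untouched, which is why the same network code works for any batchsize.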