[DL] An introduction to and understanding of tensors
2022-07-29 06:00:00 【Machines don't learn I learn】
1. Preface
In deep learning we inevitably run into the term *tensor*. One- and two-dimensional tensors are easy to picture, but how should we understand three-, four-, ..., n-dimensional ones? This article walks through the idea in detail, using the PyTorch framework as the example.
2. One-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.tensor([1, 2, 3, 4])
print(a)
print(a.shape)
Output:
tensor([1, 2, 3, 4])
torch.Size([4])
The shape has only one entry, so this is a one-dimensional tensor with four elements.
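Besides reading the shape, PyTorch can report the number of axes directly. A minimal sketch using the standard `ndim` attribute:

```python
import torch

a = torch.tensor([1, 2, 3, 4])
# ndim counts the axes; a 1-D tensor has exactly one
print(a.ndim)        # 1
# len(shape) carries the same information as ndim
print(len(a.shape))  # 1
```

This check is handy later: the higher-dimensional tensors below differ only in how many entries their shape has.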
3. Two-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.tensor([[1, 2, 3, 4]])
print(a)
print(a.shape)
Output:
tensor([[1, 2, 3, 4]])
torch.Size([1, 4])
The output above has both rows and columns, so it is a two-dimensional tensor; in effect, a 1×4 matrix.
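Note that this 2-D tensor holds the same four numbers as the 1-D one; only an extra axis of size 1 was added. A small sketch showing that `unsqueeze` turns the first example into the second:

```python
import torch

v = torch.tensor([1, 2, 3, 4])   # shape: (4,), one dimension
m = v.unsqueeze(0)               # insert a new axis at position 0
print(m.shape)                   # torch.Size([1, 4]), now two dimensions
# identical values to the literal [[1, 2, 3, 4]]
print(torch.equal(m, torch.tensor([[1, 2, 3, 4]])))  # True
```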
4. Three-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.ones(1, 3, 3)
print(a)
print(a.shape)
Output:
tensor([[[1., 1., 1.],
[1., 1., 1.],
[1., 1., 1.]]])
torch.Size([1, 3, 3])
The shape shows that this tensor has three dimensions, with an extra leading 1. Because that dimension has size 1, it is hard to see where it shows up in the printout. Let's look at a tensor whose leading dimension is larger, so its effect is easier to see:
import torch  # version: 1.8.0+cpu
a = torch.ones(3, 4, 5)
print(a)
print(a.shape)
Output:
tensor([[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]]])
torch.Size([3, 4, 5])
From the output above you can now see directly what the leading dimension (the number 3) corresponds to:
The first number, 3: the tensor is split into 3 blocks
The second number, 4: each block has 4 rows
The third number, 5: each row has 5 columns
So the shape is 3×4×5, with the last number giving the column dimension. You can read it as 3 matrices of 4 rows and 5 columns each.
If we compare this tensor to an RGB image, the number 3 is the number of channels, and each channel is 4 rows by 5 columns.
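Following the RGB analogy, indexing along the first axis pulls out one channel. A minimal sketch:

```python
import torch

a = torch.ones(3, 4, 5)    # 3 "channels", each 4 rows by 5 columns
channel0 = a[0]            # first slab along dim 0
print(channel0.shape)      # torch.Size([4, 5])
# reducing over dim 0 collapses the 3 channels into a single 4x5 map
print(a.sum(dim=0).shape)  # torch.Size([4, 5])
```

Each index into the leading dimension removes one level of brackets from the printout, which is exactly the "block" structure described above.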
5. Four-dimensional tensors
import torch  # version: 1.8.0+cpu
a = torch.ones(2, 3, 4, 5)
print(a)
print(a.shape)
Output:
tensor([[[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]]],
[[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]],
[[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.],
[1., 1., 1., 1., 1.]]]])
torch.Size([2, 3, 4, 5])
Looking at the nesting of brackets in the printed output, it splits into two large blocks (the original post marked the boundary with a red line). Each block is one slice along the first dimension, so there are two of them.
Put more plainly: tensor a has 2 blocks, each block contains 3 sub-blocks, each sub-block has 4 rows, and each row has 5 columns.
Tensor a can be understood in terms of an everyday image dataset:
The first number, 2: the batch size; this tensor holds 2 images
The second number, 3: each image has 3 channels
The third number, 4: the image height is 4
The fourth number, 5: the image width is 5
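This (batch, channels, height, width) reading can be checked by indexing, in a short sketch:

```python
import torch

batch = torch.ones(2, 3, 4, 5)  # (N, C, H, W): 2 images, 3 channels, height 4, width 5
img = batch[0]                  # one image from the batch
print(img.shape)                # torch.Size([3, 4, 5])
# some libraries expect channel-last (N, H, W, C); permute reorders the axes
nhwc = batch.permute(0, 2, 3, 1)
print(nhwc.shape)               # torch.Size([2, 4, 5, 3])
```

Note that `permute` only changes how the axes are ordered, not the underlying values.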
