Pytorch summary learning series - operation
2022-06-29 09:21:00 【TJMtaotao】
Arithmetic operations
In PyTorch, the same operation can take many forms. Below, addition is used as an example.
Addition, form 1
import torch

x = torch.rand(5, 3)  # x and y need compatible shapes for elementwise addition
y = torch.rand(5, 3)
print(x + y)
Addition, form 2
print(torch.add(x, y))
You can also specify an output tensor:
result = torch.empty(5, 3)
torch.add(x, y, out=result)
print(result)
Addition, form 3: in-place
# adds x to y
y.add_(x)
print(y)
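The trailing underscore is a general naming convention: methods ending in "_" modify the tensor in place rather than allocating a new one. A minimal sketch of the convention (sub_ and mul_ are further examples from the same family):

```python
import torch

x = torch.rand(5, 3)
y = torch.rand(5, 3)
y_before = y.clone()

# Methods suffixed with "_" modify the tensor in place, without allocating a new tensor.
y.add_(x)                     # y becomes y + x
assert torch.allclose(y, y_before + x)

y.sub_(x)                     # in-place subtraction restores the original y
y.mul_(2)                     # in-place multiplication by a scalar
```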
Indexing
We can also use NumPy-like indexing to access part of a Tensor. Note that the indexed result shares memory with the original data: modifying one also modifies the other.
y = x[0, :]
y += 1
print(y)
print(x[0, :])  # the source tensor has also changed
Besides the common indexing for selecting data, PyTorch also provides some advanced selection functions:
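The table of these functions is not reproduced here; commonly listed entries include index_select, masked_select, and gather. A minimal sketch of their use:

```python
import torch

x = torch.arange(15, dtype=torch.float32).view(5, 3)

# index_select: pick rows 0 and 2 along dimension 0
rows = torch.index_select(x, 0, torch.tensor([0, 2]))

# masked_select: gather all elements matching a boolean mask into a 1-D tensor
big = torch.masked_select(x, x > 7)

# gather: for each row, take the element at the given column index
picked = torch.gather(x, 1, torch.tensor([[0], [2], [1], [0], [2]]))

print(rows.shape, big.shape, picked.shape)
```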

Change shape
Use view() to change the shape of a Tensor:
y = x.view(15)
z = x.view(-1, 5)  # -1 means this dimension is inferred from the other dimensions
print(x.size(), y.size(), z.size())
Output
torch.Size([5, 3]) torch.Size([15]) torch.Size([3, 5])
Note that the new tensor returned by view() shares memory with the source tensor (in fact, they are the same tensor), so changing one also changes the other. (As the name suggests, view() only changes the way this tensor is observed.)
x += 1
print(x)
print(y)  # y also had 1 added
So what if we want a truly new copy (i.e., one that does not share memory)? PyTorch also provides reshape(), which can change the shape, but this function does not guarantee that it returns a copy, so it is not recommended. Instead, first use clone to create a copy and then use view:
x_cp = x.clone().view(15)
x -= 1
print(x)
print(x_cp)
Another advantage of using clone is that the operation is recorded in the computation graph, so gradients flowing back to the copy are also propagated to the source Tensor.
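A quick sketch of that gradient behavior, using a toy tensor with requires_grad=True:

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
x_cp = x.clone()            # the clone op is recorded in the computation graph

y = (x_cp * 3).sum()
y.backward()                # gradients flow through the copy back to x

print(x.grad)               # same shape as x, filled with 3s
```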
Another commonly used function is item(), which converts a scalar Tensor into a Python number:
x = torch.randn(1)
print(x)
print(x.item())
Output
tensor([2.3466])
2.3466382026672363
Linear algebra
In addition, PyTorch supports a number of linear algebra functions, so there is no need to reinvent the wheel; for specific usage, refer to the official documentation. As shown in the following table:
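The table itself is not reproduced here; a few of the commonly listed functions (trace, diag, mm, t, inverse), as a minimal sketch:

```python
import torch

a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
b = torch.tensor([[5.0, 6.0], [7.0, 8.0]])

print(torch.trace(a))       # sum of the diagonal elements
print(torch.diag(a))        # the diagonal as a 1-D tensor
print(torch.mm(a, b))       # matrix-matrix product
print(a.t())                # transpose
print(torch.inverse(a))     # matrix inverse
```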
