"Hands on learning in depth" Chapter 2 - preparatory knowledge_ 2.1 data operation_ Learning thinking and exercise answers
2022-07-08 02:10:00 【coder_sure】
2.1 Data manipulation
I. Summary of key content
1. Reassignment of variables
This concept was vague to me before; Teacher Li Mu mentions it here, so I am noting it down.
Consider an example:
Y = torch.tensor([[2.0, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])
X = torch.arange(12, dtype=torch.float32).reshape((3,4))
before = id(Y)
Y = Y + X
id(Y) == before
Output:
False
This example shows that running some operations may allocate new memory for the result. We first record the address of Y in the variable before, then assign Y + X to Y. Although the name is still Y on the surface, it no longer refers to the same tensor as before: the address of Y has changed.
2. Performing operations in place
Continuing the example above, we can perform the operation in place like this:
Z = torch.zeros_like(Y)
print('id(Z):', id(Z))
Z[:] = X + Y
print('id(Z):', id(Z))
Output:
id(Z): 140040758378960
id(Z): 140040758378960
We construct a tensor Z with the same shape as Y, with all elements set to 0. The slice assignment Z[:] = X + Y then overwrites the elements of Z without changing its address, so the operation is performed in place.
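Besides slice assignment, augmented assignment also keeps the address stable in PyTorch. The following sketch (assuming PyTorch is installed; the tensor values are the same as in the example above) contrasts the two assignment styles:

```python
import torch

X = torch.arange(12, dtype=torch.float32).reshape((3, 4))
Y = torch.tensor([[2.0, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])

# Plain assignment rebinds the name Y to a freshly allocated tensor.
before = id(Y)
Y = Y + X
print(id(Y) == before)  # False: new memory was allocated

# Augmented assignment updates the existing tensor in place.
before = id(Y)
Y += X
print(id(Y) == before)  # True: same address, no new allocation
```

Avoiding unnecessary allocations like this matters when tensors are large or when other references point to the same tensor.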
II. Exercise answers
1. Run the code in this section. Change the conditional statement X == Y to X < Y or X > Y, and see what kind of tensor you get.
X = torch.arange(12, dtype=torch.float32).reshape((3,4))
Y = torch.tensor([[2.0, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])
X, Y
Output:
(tensor([[ 0., 1., 2., 3.],
[ 4., 5., 6., 7.],
[ 8., 9., 10., 11.]]), tensor([[2., 1., 4., 3.],
[1., 2., 3., 4.],
[4., 3., 2., 1.]]))
X > Y
Output:
tensor([[False, False, False, False],
[ True, True, True, True],
[ True, True, True, True]])
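The exercise also asks about X < Y. For completeness, a sketch with the same tensors (assuming PyTorch is installed):

```python
import torch

X = torch.arange(12, dtype=torch.float32).reshape((3, 4))
Y = torch.tensor([[2.0, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])

# Element-wise comparison yields a boolean tensor of the same shape.
print(X < Y)
# tensor([[ True, False,  True, False],
#         [False, False, False, False],
#         [False, False, False, False]])
```

Only the entries where X is strictly smaller than Y are True, so X < Y is not simply the negation of X > Y: positions where the elements are equal are False in both.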
2. Replace the two tensors in the broadcasting example with tensors of other shapes (for example, three-dimensional tensors). Is the result what you expected?
A = torch.tensor([[[1, 2, 3], [4, 5, 6]]])
B = torch.tensor([[[10, 20, 30]], [[40, 50, 60]]])
C = A + B
A.shape,B.shape, C, C.shape
Output:
(torch.Size([1, 2, 3]), torch.Size([2, 1, 3]), tensor([[[11, 22, 33],
[14, 25, 36]],
[[41, 52, 63],
[44, 55, 66]]]), torch.Size([2, 2, 3]))
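The result matches expectation: in each dimension, a size of 1 is stretched to the other operand's size, so shapes (1, 2, 3) and (2, 1, 3) broadcast to (2, 2, 3). The following sketch (assuming PyTorch is installed) repeats the computation and also shows that broadcasting fails when a non-unit dimension disagrees:

```python
import torch

A = torch.tensor([[[1, 2, 3], [4, 5, 6]]])          # shape (1, 2, 3)
B = torch.tensor([[[10, 20, 30]], [[40, 50, 60]]])  # shape (2, 1, 3)

# Size-1 dimensions are stretched: (1, 2, 3) + (2, 1, 3) -> (2, 2, 3).
C = A + B
print(C.shape)  # torch.Size([2, 2, 3])

# Broadcasting raises an error when a non-1 dimension disagrees.
try:
    torch.ones(2, 3) + torch.ones(3, 2)
except RuntimeError as err:
    print("incompatible shapes:", err)
```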