
Understanding PyTorch's leaf nodes

2022-06-13 01:04:00 kaims

https://blog.csdn.net/Azahaxia/article/details/117234505

https://blog.csdn.net/byron123456sfsfsfa/article/details/92210253

Tensors with requires_grad=True have gradients computed for them, but not every such tensor has its gradient saved. To reduce GPU/CPU memory usage, PyTorch does not save gradients for tensors produced during intermediate computation. This is where the concept of a leaf node comes in: a computed gradient is saved (for the backpropagation update) if and only if the tensor is a leaf node and requires a gradient.
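A minimal sketch of this behavior: after backward(), the user-created leaf tensor keeps its gradient, while the intermediate tensor's gradient is discarded (the variable names here are just for illustration).

```python
import torch

# x is created directly by the user and requires a gradient -> leaf node.
x = torch.ones(2, 2, requires_grad=True)

# y is produced by an operation on x -> intermediate (non-leaf) tensor.
y = x * 2
z = y.sum()
z.backward()

print(x.is_leaf, x.grad)  # True, the saved gradient (all 2s)
print(y.is_leaf, y.grad)  # False, None: not saved, to reduce memory use
```

Note that y's gradient is still computed internally (the chain rule needs it to reach x); it is only not *saved* to y.grad.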

Leaf nodes: a tensor with requires_grad=False, or with grad_fn=None, is treated as a leaf node, i.e. is_leaf=True.

Analysis: tensors with requires_grad=False are classified as leaf tensors by convention. In practice this classification has no effect, because the is_leaf attribute only matters for tensors that need gradients. When requires_grad=True, a tensor is a leaf node if it was created directly by the user; if it was produced by an operation on other tensors, it is not a leaf node.
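The three cases above can be checked directly via the is_leaf and grad_fn attributes (the tensors here are arbitrary examples):

```python
import torch

a = torch.randn(3)                      # requires_grad=False: leaf by convention
b = torch.randn(3, requires_grad=True)  # user-created, requires grad: leaf
c = b * 2                               # produced by an operation: NOT a leaf

print(a.is_leaf, a.grad_fn)  # True, None
print(b.is_leaf, b.grad_fn)  # True, None
print(c.is_leaf, c.grad_fn)  # False, a MulBackward0 object
```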

Why leaf nodes are needed: as mentioned above, to reduce GPU/CPU memory usage we do not want to save gradients for tensors generated during intermediate computation, even though they have requires_grad=True (i.e., their gradients still need to be computed as part of the chain rule).

Summary: a tensor that is user-created (grad_fn=None) or does not need a gradient (requires_grad=False) is a leaf node. Tensors with requires_grad=True have their gradients computed, but a gradient is saved only when the tensor is also a leaf node. In other words, a gradient is saved exactly when grad_fn=None and requires_grad=True.
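As an aside not covered above: when you do want the gradient of an intermediate tensor, PyTorch lets you opt in with retain_grad(). A small sketch (the names w, h, loss are just for illustration):

```python
import torch

w = torch.tensor([1.0, 2.0], requires_grad=True)  # leaf: grad saved automatically
h = w * 3                                         # non-leaf: grad discarded by default
h.retain_grad()                                   # ask PyTorch to keep h.grad anyway
loss = (h ** 2).sum()                             # loss = sum((3w)^2)
loss.backward()

print(w.grad)  # 18 * w -> tensor([18., 36.])
print(h.grad)  # 2 * h  -> tensor([ 6., 12.])
```

This trades extra memory for access to the intermediate gradient, which is mainly useful for debugging or inspecting a network.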


Copyright notice
This article was created by [kaims]; please include a link to the original when reprinting. Thanks.
https://yzsam.com/2022/02/202202280556359493.html