Pytorch - autograd automatic differentiation
2022-07-28 15:27:00 【SpikeKing】
Paper: Automatic Differentiation in Machine Learning: a Survey
Reference: PyTorch tutorial AUTOMATIC DIFFERENTIATION WITH TORCH.AUTOGRAD

The loss is a scalar. For non-scalar outputs, autograd computes Jacobian-vector products (JVP). The computation graph is composed of primitive operations.
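As a quick illustration (not from the original post; the variable names are mine): for a non-scalar output y, y.backward(v) computes the product of the vector v with the Jacobian of y with respect to x and accumulates the result into x.grad. A minimal sketch:

import torch

x = torch.randn(3, requires_grad=True)
y = x * 2                  # non-scalar output; the Jacobian is 2 * I
v = torch.ones_like(y)     # the "vector" in the Jacobian product
y.backward(v)              # computes v^T @ J and accumulates into x.grad
print(x.grad)              # tensor([2., 2., 2.])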

torch.autograd computes the gradients:
import torch

x = torch.ones(5)  # input tensor
print(f"x: {x}")
y = torch.zeros(3)  # expected output (label)
print(f"y: {y}")
w = torch.randn(5, 3, requires_grad=True)  # requires_grad=True enables gradient tracking
print(f"w: {w}")
b = torch.randn(3, requires_grad=True)
print(f"b: {b}")
z = torch.matmul(x, w) + b
loss = torch.nn.functional.binary_cross_entropy_with_logits(z, y)
Gradients are computed with the backpropagation algorithm. backward() is a method of the Tensor class: when loss is a scalar, call backward() directly; when loss is a non-scalar tensor, backward() must be passed a gradient tensor of matching shape.
print(f"Gradient function for z = {
z.grad_fn}")
print(f"Gradient function for loss = {
loss.grad_fn}")
loss.backward()
print(w.grad)
print(b.grad)
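To see that these values match the chain rule, here is a hedged check (not part of the original post; the analytic gradient below is my addition): with the default mean reduction, dloss/dz = (sigmoid(z) - y) / z.numel(), and since z = x @ w + b, w.grad is the outer product of x with dloss/dz.

with torch.no_grad():
    dz = (torch.sigmoid(z) - y) / z.numel()  # analytic dloss/dz (mean reduction)
    manual_w_grad = torch.outer(x, dz)       # dloss/dw = outer(x, dloss/dz)
print(torch.allclose(w.grad, manual_w_grad))  # True
print(torch.allclose(b.grad, dz))             # True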
retain_graph=True keeps the computation graph. Without it, a second backward() call raises: RuntimeError: Trying to backward through the graph a second time
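A minimal reproduction of that error (illustrative, not from the original post):

a = torch.tensor(2.0, requires_grad=True)
out2 = a ** 2
out2.backward(retain_graph=True)  # graph is kept, so a second backward is allowed
out2.backward()                   # fine; the graph is freed after this call
try:
    out2.backward()               # graph already freed
except RuntimeError as e:
    print(e)                      # Trying to backward through the graph a second time ...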
torch.no_grad() disables gradient tracking:
z = torch.matmul(x, w) + b
print(z.requires_grad)  # True

with torch.no_grad():
    z = torch.matmul(x, w) + b
print(z.requires_grad)  # False
After calling z = z.detach(), z.requires_grad is False:

z = z.detach()
print(z.requires_grad)  # False
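One point worth noting (my addition, easy to verify): detach() returns a tensor that shares storage with the original but is cut out of the graph:

w2 = torch.randn(3, requires_grad=True)
d = w2.detach()
print(d.requires_grad)                # False
print(d.data_ptr() == w2.data_ptr())  # True: the storage is shared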
DAG: autograd records operations in a directed acyclic graph.
When the loss is a tensor, pass a gradient argument such as torch.ones_like(inp) to backward() for backpropagation.
retain_graph=True keeps the graph so that backward() can be called consecutively.
Gradients accumulate across calls; zero them with inp.grad.zero_():
inp = torch.eye(5, requires_grad=True)
out = (inp + 1).pow(2)
print(out)
out.backward(torch.ones_like(inp), retain_graph=True)
print(f"First call\n{inp.grad}")
out.backward(torch.ones_like(inp), retain_graph=True)
print(f"\nSecond call\n{inp.grad}")
inp.grad.zero_()
out.backward(torch.ones_like(inp), retain_graph=True)
print(f"\nCall after zeroing gradients\n{inp.grad}")