【PYTORCH】RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation

2022-06-13 07:44:00 neu_eddata_yjzhang

This experiment ran on PyTorch 1.4.0. The error comes from PyTorch's automatic differentiation (autograd) mechanism. When computing the loss, I had written:

loss = loss_0 + loss_1
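For context: summing losses is itself legal; this RuntimeError is raised when a tensor that autograd saved for the backward pass is later modified in place. A minimal sketch of my own (toy tensors, not the original code) that reproduces the message:

import torch

x = torch.randn(3, requires_grad=True)
y = torch.sigmoid(x)  # sigmoid saves its output y for the backward pass
y.mul_(2)             # in-place op changes y after it was saved
y.sum().backward()    # RuntimeError: one of the variables needed for gradient
                      # computation has been modified by an inplace operation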

This may have confused PyTorch's autograd mechanism, so I replaced that line with the code below. The program then runs normally, though I am not yet sure whether the effect is the same as the original.

loss_0.backward(retain_graph=True)  # backward() returns None, so its results cannot be summed; retain_graph keeps shared buffers alive for the second call
loss_1.backward()
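Independently of how backward() is arranged, PyTorch can point at the operation that actually triggered the message. A short sketch (not from the original post) using autograd's anomaly mode:

import torch

# Anomaly mode records the forward-pass stack trace of each op, so when
# backward() hits the "modified by an inplace operation" error, PyTorch
# also prints which forward operation produced the tensor that was changed.
torch.autograd.set_detect_anomaly(True)

This usually makes it easier to find the offending in-place op (often an x += ..., a tensor.add_(...), or an nn.ReLU(inplace=True)) than to restructure the loss.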

Another approach: I changed the original loss.backward() line to
loss1 = loss.detach_().requires_grad_(True)  # detach_() cuts loss out of the computation graph
loss1.backward()                             # runs, but no gradients reach the model parameters
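To see what that workaround actually does, here is a small check (a sketch with a toy parameter w, an assumption of mine): after detach_(), nothing upstream of loss receives a gradient.

import torch

w = torch.randn(3, requires_grad=True)
loss = torch.sigmoid(w).sum()
loss1 = loss.detach_().requires_grad_(True)
loss1.backward()
print(w.grad)  # None: detach_() cut the graph, so training would never update w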

With this change, the error no longer appears.
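For reference, the usual fix for this error is neither of the two workarounds above but removing the in-place operation itself, e.g. swapping it for its out-of-place form (continuing the toy example from earlier):

import torch

x = torch.randn(3, requires_grad=True)
y = torch.sigmoid(x)
y = y * 2             # out-of-place version of y.mul_(2); the saved tensor stays intact
y.sum().backward()    # succeeds
print(x.grad)         # gradients are populated as expected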


Copyright notice
This article was written by [neu_eddata_yjzhang]. Please include a link to the original when reposting. Thanks.
https://yzsam.com/2022/02/202202270547027673.html