Random seeds in deep learning: torch.manual_seed(number) and torch.cuda.manual_seed(number)
2022-07-01 03:33:00 【It's seventh uncle】
When training a model you will encounter many sources of randomness. Randomized settings averaged over multiple experiments make results more convincing, but papers increasingly require that models be reproducible, so we have to control the randomness in the code.
Fixing the seed also makes the initial weights identical on every run, which helps when comparing and improving experiments.
Note: the random seed has no direct relationship to how the neural network trains; its role is only to generate the random numbers used as initial weights. The network's performance depends on factors such as the learning rate and the number of iterations.
Put simply, the process by which a computer generates random numbers is deterministic, but it starts from an initial number (the seed). In deep learning (for example, in deep neural networks) we often need to set initial values for parameters such as the weights, using functions that generate random numbers. These functions are usually seeded manually: if the seed is set to the same value, the initial weights will be the same.
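To illustrate this principle, here is a minimal sketch using Python's standard library random module (an analogy of my own, not PyTorch itself): re-seeding the generator with the same value reproduces the exact same sequence of "random" numbers.

```python
import random

random.seed(42)
first = [random.random() for _ in range(3)]

random.seed(42)  # re-seed with the same value
second = [random.random() for _ in range(3)]

print(first == second)  # → True: identical sequences from identical seeds
```

The same holds for PyTorch's generators: the sequence is fully determined by the seed.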
There is no "best" random seed worth searching for; randomness exists only to evaluate the robustness of the model. An excellent model should not fail to find a good solution merely because its random initial position differs slightly. That is a problem the model itself should handle, not one to solve by cherry-picking a seed.

Related functions:
- torch.manual_seed(number): sets the seed for the CPU random number generator;
- torch.cuda.manual_seed(number): sets the seed for the current GPU;
- torch.cuda.manual_seed_all(number): sets the seed for all GPUs.
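In practice these three calls are often combined with seeding Python's and NumPy's generators into a single helper. The sketch below is an assumption of mine (the helper name seed_everything is not from the article), but every call in it is a real API:

```python
import random

import numpy as np
import torch


def seed_everything(seed: int) -> None:
    """Seed every RNG commonly used in a PyTorch project."""
    random.seed(seed)                 # Python's built-in RNG
    np.random.seed(seed)              # NumPy's global RNG
    torch.manual_seed(seed)           # CPU RNG
    torch.cuda.manual_seed(seed)      # current GPU (no-op without CUDA)
    torch.cuda.manual_seed_all(seed)  # all GPUs (no-op without CUDA)


seed_everything(0)
print(torch.rand(1))  # same tensor on every run
```

Note that fixing the seeds alone does not guarantee bitwise reproducibility on GPU; some cuDNN kernels are non-deterministic unless additionally configured.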
torch.manual_seed(1) fixes the CPU random number stream, so that the rand() calls (random functions) in a .py file produce the same fixed yet random-looking values on every run.
Note that after setting the random seed, it is each run of test.py that produces identical output, not each individual call to a random function:
# test.py
import torch
torch.manual_seed(0)
print(torch.rand(1))
print(torch.rand(1))
Output:
tensor([0.4963])
tensor([0.7682])
If you want every call to a random function to return exactly the same result, you can set the same random seed immediately before each call:
# test.py
import torch
torch.manual_seed(0)
print(torch.rand(1))
torch.manual_seed(0)
print(torch.rand(1))
Output:
tensor([0.4963])
tensor([0.4963])
Reference: [PyTorch] torch.manual_seed() detailed explanation