
Learning notes on solver.prototxt file parameters

2022-07-05 07:47:00 Fall in love with wx

Reference: https://blog.csdn.net/xygl2009/article/details/77484522 (many thanks to the original author!)

# -*- coding: UTF-8 -*-
train_net: "examples/YOLO_Helmet/train.prototxt" # network definition used for training
test_net: "examples/YOLO_Helmet/test.prototxt" # network definition used for testing
test_iter: 4952 # number of forward passes per test phase; e.g. with a test-stage batch size of 100 and 10000 test images, test_iter = 10000/100 = 100
test_interval: 2000 # run a full test phase every 2000 training iterations; a common choice is roughly one training epoch between tests
base_lr: 0.0005 # base learning rate; during optimization it is adjusted according to the policy selected with lr_policy
display: 10 # print loss and learning-rate information every 10 iterations
max_iter: 120000 # maximum number of training iterations
lr_policy: "multistep" # with "multistep" you must also provide stepvalue entries; it behaves like "step", except that "step" drops the learning rate at a fixed, uniform interval, while "multistep" drops it at the iterations listed in stepvalue (see the worked-out schedule below the stepvalue entries)
gamma: 0.5 # factor by which the learning rate is multiplied at each stepvalue milestone
weight_decay: 0.00005 # weight decay (L2 regularization) to help prevent overfitting
snapshot: 2000 # save a model snapshot every 2000 iterations
snapshot_prefix: "models/MobileNet/mobilenet_yolov3_deploy" # path prefix for saved snapshots
solver_mode: GPU # run the solver on the GPU (set to CPU otherwise)
debug_info: false # do not print extra per-layer debugging information
snapshot_after_train: true # save a final snapshot when training finishes
test_initialization: false # skip the test phase that would otherwise run before the first training iteration
average_loss: 10 # display the loss averaged over the last 10 iterations rather than the raw per-iteration value
stepvalue: 25000 # multiply the learning rate by gamma each time one of these iterations is reached
stepvalue: 50000
stepvalue: 75000
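# Worked example (not part of the original file): with base_lr = 0.0005 and gamma = 0.5,
# the schedule above works out to
#   iterations     0 - 24999: lr = 0.0005
#   iterations 25000 - 49999: lr = 0.0005 * 0.5   = 0.00025
#   iterations 50000 - 74999: lr = 0.0005 * 0.5^2 = 0.000125
#   iterations 75000 -   end: lr = 0.0005 * 0.5^3 = 0.0000625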
iter_size: 4 # accumulate gradients over iter_size forward/backward passes, so each weight update (ApplyUpdate) effectively sees batch_size * iter_size images; the update itself uses the current learning rate and the chosen solver type (SGD, RMSProp, etc.)
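# Example with hypothetical numbers (the actual batch size lives in train.prototxt, which is not shown here):
# if the training batch size were 8, each parameter update would effectively use 8 * 4 = 32 images.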
type: "RMSProp"
eval_type: "detection"
ap_version: "11point"
show_per_class_result: true
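
To make the multistep schedule easier to reason about, here is a minimal Python sketch (an illustration only, not part of Caffe) of the rule the solver applies: lr = base_lr * gamma^n, where n is the number of stepvalue milestones already passed. The values are copied from the solver above.

# Minimal sketch of the "multistep" learning-rate rule.
base_lr = 0.0005
gamma = 0.5
stepvalues = [25000, 50000, 75000]

def multistep_lr(iteration):
    """Learning rate at a given training iteration: base_lr * gamma ** (milestones passed)."""
    passed = sum(1 for s in stepvalues if iteration >= s)
    return base_lr * gamma ** passed

for it in (0, 24999, 25000, 50000, 75000, 119999):
    print(f"iter {it:>6}: lr = {multistep_lr(it):.7f}")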

 
