
keras model.compile Loss function and optimizer

2020-11-06 01:22:00, by Elementary school students in IT field

Loss function

Overview

The loss function is the objective that the model optimizes, so it is also called the objective function or optimization scoring function. In Keras, the loss parameter of model.compile specifies which loss function to use. There are two ways to specify it:

model.compile(loss='mean_squared_error', optimizer='sgd')

or

from keras import losses
model.compile(loss=losses.mean_squared_error, optimizer='sgd')

Available loss functions

The available loss (objective) functions are:

mean_squared_error or mse
mean_absolute_error or mae
mean_absolute_percentage_error or mape
mean_squared_logarithmic_error or msle
squared_hinge
hinge
categorical_hinge
binary_crossentropy (also called logarithmic loss, logloss)
logcosh

categorical_crossentropy: also known as multi-class logarithmic loss. Note that when using this objective function, the labels need to be converted into one-hot binary arrays of shape (nb_samples, nb_classes).
sparse_categorical_crossentropy: same as above, but accepts sparse (integer) labels. Note that when using this function the labels still need the same number of dimensions as the model output, so you may need to add a dimension to the label data: np.expand_dims(y, -1). A short sketch contrasting the two label formats follows this list.

kullback_leibler_divergence: the information gain from the predicted probability distribution Q to the true probability distribution P, used to measure the difference between the two distributions.

poisson: the mean of (predictions - targets * log(predictions))

cosine_proximity: the negative of the mean cosine similarity between the predictions and the true labels
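
As a minimal sketch of the label-shape difference between categorical_crossentropy and sparse_categorical_crossentropy (the model layout and data below are made up purely for illustration):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical

# Integer labels for a 3-class problem
y = np.array([0, 2, 1, 2])

# categorical_crossentropy expects one-hot labels of shape (nb_samples, nb_classes)
y_onehot = to_categorical(y, num_classes=3)   # shape (4, 3)

# sparse_categorical_crossentropy accepts the integer labels directly;
# if a shape mismatch is reported, add a trailing dimension as noted above
y_sparse = np.expand_dims(y, -1)              # shape (4, 1)

model = Sequential([Dense(3, activation='softmax', input_shape=(10,))])

model.compile(loss='categorical_crossentropy', optimizer='sgd')           # train with y_onehot
# model.compile(loss='sparse_categorical_crossentropy', optimizer='sgd')  # train with y_sparse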

Loss function formulas

https://zhuanlan.zhihu.com/p/34667893
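
For reference, the usual textbook definitions of a few of the losses listed above (written from their standard forms, not copied from the Keras source):

\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2
\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\lvert y_i - \hat{y}_i\rvert
\text{binary crossentropy} = -\frac{1}{n}\sum_{i=1}^{n}\bigl[y_i\log\hat{y}_i + (1 - y_i)\log(1 - \hat{y}_i)\bigr]
D_{\mathrm{KL}}(P\,\|\,Q) = \sum_i P(i)\log\frac{P(i)}{Q(i)}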

Binary classification: an error report

An error related to the loss function: when using Keras for text classification, I kept getting this error. My labels were 0 or 1, but the error said the label could not be 1.

See: Received a label value of 1 which is outside the valid range of [0, 1) - Python, Keras

The problem was the loss function. I had been using sparse_categorical_crossentropy; changing it to binary_crossentropy solved the problem.
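
A hypothetical reproduction of the situation (the model and data below are invented for illustration): with a single sigmoid output unit, sparse_categorical_crossentropy treats the output as having only one valid class and rejects the label 1, while binary_crossentropy matches the 0/1 labels.

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

x = np.random.rand(8, 20)
y = np.array([0, 1, 0, 1, 1, 0, 1, 0])   # binary labels

model = Sequential([Dense(1, activation='sigmoid', input_shape=(20,))])

# Raises "Received a label value of 1 which is outside the valid range of [0, 1)",
# because the single output unit is interpreted as only one class:
# model.compile(loss='sparse_categorical_crossentropy', optimizer='sgd')

# binary_crossentropy matches the single sigmoid output and the 0/1 labels:
model.compile(loss='binary_crossentropy', optimizer='sgd', metrics=['accuracy'])
model.fit(x, y, epochs=1, verbose=0)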

Optimizer

https://www.cnblogs.com/xiaobingqianrui/p/10756046.html
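
The link above covers the individual optimizers in detail. As a minimal sketch (the learning-rate and momentum values below are placeholders, and model is assumed to be an already defined Keras model), an optimizer can be passed to model.compile either by name or as a configured instance:

from keras import optimizers

# By name, with default settings
model.compile(loss='mean_squared_error', optimizer='sgd')

# As a configured instance
sgd = optimizers.SGD(lr=0.01, momentum=0.9, nesterov=True)
model.compile(loss='mean_squared_error', optimizer=sgd)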


Copyright notice
This article was written by [Elementary school students in IT field]. Please include a link to the original when reposting. Thank you.