
[loss function] entropy / relative entropy / cross entropy

2022-07-08 01:21:00 Ice cream and Mousse Cake

An easy-to-understand (but not rigorous) summary:
Entropy: the total expected amount of information contained in a probability distribution.
Relative entropy (KL divergence): the difference in information content between the true distribution and the predicted distribution (true vs. predicted); the smaller the value, the closer the prediction is to the truth.
Cross entropy: obtained by rearranging the KL divergence formula. The first half of the rearranged formula is the (negative) entropy of the true distribution and the second half is the cross entropy. Because that entropy term is constant, it is more convenient to use the cross entropy directly as the loss when optimizing (see the sketch below).
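The rearrangement referred to above can be written out explicitly. A minimal sketch, with p the true distribution and q the predicted one:

$$
D_{\mathrm{KL}}(p \,\|\, q) = \sum_i p(x_i)\log\frac{p(x_i)}{q(x_i)}
= \sum_i p(x_i)\log p(x_i) - \sum_i p(x_i)\log q(x_i)
= -H(p) + H(p, q)
$$

Since H(p) is determined by the data alone and does not depend on q, minimizing D_KL(p‖q) over q is equivalent to minimizing the cross entropy H(p, q).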

The rearranged KL divergence is written out above. I keep these blog notes for easy review and organization; if there is any mistake, please point it out, thank you!
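As a quick numerical sanity check of the identity above, here is a minimal NumPy sketch (my own illustration, not from the original post; the distributions p and q are arbitrary example values):

```python
import numpy as np

def entropy(p):
    """H(p) = -sum_i p_i * log(p_i), using the natural log."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

# Example distributions (arbitrary values, chosen only for illustration)
p = np.array([0.7, 0.2, 0.1])   # "true" distribution
q = np.array([0.5, 0.3, 0.2])   # predicted distribution

# Cross entropy equals the (constant) entropy of p plus the KL divergence,
# so minimizing either one picks the same q.
print(cross_entropy(p, q))                # ~0.887
print(entropy(p) + kl_divergence(p, q))   # same value
```

Both prints give the same number: during training the true distribution p is fixed, so H(p) is just a constant shift, which is exactly why the loss is computed as cross entropy rather than as the full KL divergence.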

Reference

https://blog.csdn.net/tsyccnh/article/details/79163834
