Loss function~
2022-07-02 23:03:00 【Miss chenshen】
Concept:
A loss function measures the difference between the label (ground-truth) value and the predicted value. In machine learning there are many loss functions to choose from, for example ones based on distances or on absolute differences.
The figure above is a schematic of automatically learning a linear equation. The thick line is the true linear equation, and the dotted lines illustrate the iterative process: w1 is the weight after the first iteration, w2 after the second, and w3 after the third. As the number of iterations grows, the goal is for wn to get arbitrarily close to the true value.
The labels 1/2/3 in the figure mark, for each of the three iterations, the difference between the predicted Y value and the true Y value. That difference is the loss; in the figure it is drawn as an absolute difference, but in practice there are many ways to compute it, such as the squared difference or the mean squared error in higher-dimensional spaces. These formulas are the loss functions.
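To make the idea in the figure concrete, here is a minimal sketch: a one-parameter linear model y = w * x is fitted for three iterations, and the loss shrinks as w moves toward the true value. The data, learning rate, and number of iterations are illustrative assumptions, not values from the original article.
import torch

x = torch.tensor([1., 2., 3., 4.])
y_true = 2.0 * x                      # the "real" linear equation: y = 2x

w = torch.tensor(0.0, requires_grad=True)       # plays the role of w1, w2, w3 in the figure
for step in range(1, 4):
    y_pred = w * x
    loss = torch.mean(torch.abs(y_pred - y_true))   # L1-style difference between prediction and truth
    loss.backward()
    with torch.no_grad():
        w -= 0.3 * w.grad             # move w toward the true value
        w.grad.zero_()
    print(f"iteration {step}: w = {w.item():.3f}, loss = {loss.item():.3f}")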
Commonly used loss functions and how they are computed:
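All of the examples below share the same prediction tensor sample and label tensor target. Their definitions are not shown in the original code, but the values can be read off the worked L1Loss calculation (|0-1| + |1-1| + |2-1| + |3-1|), so a minimal setup, assuming float tensors, would be:
import torch
import torch.nn as nn

# Predictions and labels used by all examples below;
# the values are inferred from the L1Loss calculation |0-1|+|1-1|+|2-1|+|3-1|.
sample = torch.tensor([0., 1., 2., 3.])   # predicted values
target = torch.tensor([1., 1., 1., 1.])   # true (label) values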
1. nn.L1Loss
L1Loss is simple to compute: it is the mean of the absolute errors between the predicted values and the true values.
criterion = nn.L1Loss()
loss = criterion(sample, target)
print(loss)
# 1
The final output is 1. The calculation works as follows:
- First sum the absolute differences: |0-1| + |1-1| + |2-1| + |3-1| = 4
- Then take the average: 4 / 4 = 1
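As a quick check, the same value can be computed by hand with elementwise operations (a sketch using the sample and target tensors defined above):
# Mean absolute error computed manually: mean(|sample - target|)
manual_l1 = torch.mean(torch.abs(sample - target))
print(manual_l1)   # tensor(1.)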
2. nn.SmoothL1Loss
SmoothL1Loss is also called Huber Loss. When the error is within (-1, 1) it uses a squared term, 0.5 * error²; outside that range it behaves like the L1 loss, |error| - 0.5.
criterion = nn.SmoothL1Loss()
loss = criterion(sample, target)
print(loss)
# 0.625
The final output is 0.625. The calculation works as follows:
- The errors are -1, 0, 1, 2. Applying the piecewise rule to each and summing gives 0.5 + 0 + 0.5 + 1.5 = 2.5
- Then take the average: 2.5 / 4 = 0.625
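The piecewise rule can also be written out directly; this sketch reproduces the 0.625 above with torch.where, using the tensors defined earlier:
# SmoothL1 (beta = 1): 0.5 * d^2 where |d| < 1, otherwise |d| - 0.5
d = sample - target
elementwise = torch.where(d.abs() < 1, 0.5 * d ** 2, d.abs() - 0.5)
print(elementwise)         # tensor([0.5000, 0.0000, 0.5000, 1.5000])
print(elementwise.mean())  # tensor(0.6250)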
3. nn.MSELoss
The squared loss function: it is the mean of the squared differences between the predicted values and the true values.
criterion = nn.MSELoss()
loss = criterion(sample, target)
print(loss)
# 1.5
The final output is 1.5. The calculation works as follows:
- First sum the squared differences: (0-1)² + (1-1)² + (2-1)² + (3-1)² = 6
- Then take the average: 6 / 4 = 1.5
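Again, the same value can be checked by hand with the tensors defined above:
# Mean squared error computed manually: mean((sample - target)^2)
manual_mse = torch.mean((sample - target) ** 2)
print(manual_mse)   # tensor(1.5000)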
4. nn.BCELoss
Cross entropy for binary classification. Its formula is more involved than the previous losses, so only the concept is introduced here.
criterion = nn.BCELoss()
# Note: nn.BCELoss expects predictions in [0, 1] (e.g. the output of a sigmoid);
# with the raw sample tensor above, recent PyTorch versions raise an error
# instead of producing the value below.
loss = criterion(sample, target)
print(loss)
# -13.8155
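Conceptually, binary cross entropy is -mean(t * log(p) + (1 - t) * log(1 - p)), where p must be a probability in (0, 1). Here is a sketch with valid inputs: the raw sample tensor is first squashed through a sigmoid, so the numbers differ from the output quoted above.
# nn.BCELoss expects probabilities, so squash the predictions first
p = torch.sigmoid(sample)
bce = nn.BCELoss()(p, target)
# the same value computed directly from the formula -mean(t*log(p) + (1-t)*log(1-p))
manual_bce = -torch.mean(target * torch.log(p) + (1 - target) * torch.log(1 - p))
print(bce, manual_bce)   # both ≈ tensor(0.2955)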