[note] logistic regression
2022-07-27 14:07:00 【Sprite.Nym】
1. Overview of logistic regression
(1) The purpose of logistic regression: classification.
Logistic regression is mainly used for classification problems, especially binary classification.
(2) The process of logistic regression: regression.
The model outputs a continuous value between 0 and 1, representing the probability that the event occurs (the class probability).
(3) Threshold: classification is completed by comparing the probability against a threshold.
For example: compute the probability that a borrower defaults; if it is greater than 0.5, the borrower is classified as a bad customer.
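To make the probability-plus-threshold idea concrete, here is a minimal sketch (not from the original note) using scikit-learn's LogisticRegression; the feature values, labels, and the 0.5 cut-off are all illustrative assumptions.

```python
# Minimal sketch: predict class probabilities, then classify with a threshold.
# X (two scaled features per borrower) and y (1 = defaulted) are made-up data.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.9, 0.2], [0.3, 0.8], [0.8, 0.1], [0.2, 0.9]])
y = np.array([0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

proba = model.predict_proba(X)[:, 1]   # P(y = 1 | x), a value in (0, 1)
labels = (proba > 0.5).astype(int)     # compare against the 0.5 threshold
print(proba, labels)
```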
2. The logistic regression model
In a binary classification problem the labels take only two values (1 and 0). If we fit the data with plain linear regression, its output is not restricted to the range 0 to 1, so the prediction is hard to interpret as a probability. If we instead fit with a piecewise (step) function, the output is not continuous, so again we do not get the continuous 0-to-1 probability we want.
The solution is to compose linear regression with the sigmoid function, forming a nested function.
The sigmoid function is an S-shaped curve that maps any real number into the open interval (0, 1):

$$\sigma(z) = \frac{1}{1 + e^{-z}}$$

The nested function formed by the composition is:

$$\hat{y} = \sigma(w^{T}x + b) = \frac{1}{1 + e^{-(w^{T}x + b)}}$$
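Assuming NumPy, the composition above can be sketched in a few lines; the weight vector, bias, and input below are arbitrary values chosen only for illustration.

```python
# Sketch of the nested model y_hat = sigmoid(w . x + b).
import numpy as np

def sigmoid(z):
    """S-shaped squashing function mapping any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(x, w, b):
    """Linear regression fed through the sigmoid: the class probability."""
    return sigmoid(np.dot(w, x) + b)

w = np.array([0.8, -1.5])   # illustrative weights
b = 0.3                     # illustrative bias
x = np.array([2.0, 1.0])    # illustrative input

print(predict_proba(x, w, b))  # a value strictly between 0 and 1
```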
3. The loss function of logistic regression
If we plug the sigmoid-based prediction $\hat{y}$ directly into the loss function used for linear regression (the squared error) and try to minimize it, the resulting function is not convex, so a different loss function is used for logistic regression.
(1) The probability of $y$ in a binary classification problem
By convention $\hat{y} = P(y=1 \mid x)$, therefore:
when $y = 1$: $P(y \mid x) = \hat{y}$
when $y = 0$: $P(y \mid x) = 1 - \hat{y}$
Merging the two cases into a single expression:

$$P(y \mid x) = \hat{y}^{\,y} (1 - \hat{y})^{(1-y)}$$
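As a quick sanity check (not in the original note), substituting the two possible labels into the merged expression recovers the two cases above; `y_hat = 0.7` is an arbitrary example probability.

```python
# Check that y_hat**y * (1 - y_hat)**(1 - y) reproduces the piecewise definition.
y_hat = 0.7  # arbitrary predicted probability

for y in (1, 0):
    merged = y_hat**y * (1 - y_hat)**(1 - y)
    piecewise = y_hat if y == 1 else 1 - y_hat
    print(y, merged, piecewise)  # the two values match for both labels
```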
(2) Estimating the model parameters with maximum likelihood estimation
For $m$ independent training samples $(x_i, y_i)$, the likelihood function is:

$$L(w, b) = \prod_{i=1}^{m} P(y_i \mid x_i) = \prod_{i=1}^{m} \hat{y}_i^{\,y_i} (1 - \hat{y}_i)^{(1-y_i)}$$

The log-likelihood function is:

$$\ell(w, b) = \sum_{i=1}^{m} \left[ y_i \log \hat{y}_i + (1 - y_i) \log (1 - \hat{y}_i) \right]$$
Maximizing the log-likelihood yields the most suitable parameter values. Equivalently, for the whole training set the cost function can be defined as the negative average log-likelihood (the binary cross-entropy):

$$J(w, b) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y_i \log \hat{y}_i + (1 - y_i) \log (1 - \hat{y}_i) \right]$$

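The note stops at the cost function; as one common way to fit the parameters, here is a minimal sketch that minimizes this cross-entropy cost with plain batch gradient descent. The toy data, learning rate, and epoch count are illustrative assumptions, and the gradient expressions are the standard ones for this cost.

```python
# Minimal sketch: minimize the cross-entropy cost J(w, b) with batch gradient
# descent. The toy data, learning rate, and epoch count are illustrative only.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(X, y, w, b):
    """Binary cross-entropy: the negative average log-likelihood."""
    y_hat = sigmoid(X @ w + b)
    eps = 1e-12  # guard against log(0)
    return -np.mean(y * np.log(y_hat + eps) + (1 - y) * np.log(1 - y_hat + eps))

def fit(X, y, lr=0.1, epochs=2000):
    m, n = X.shape
    w, b = np.zeros(n), 0.0
    for _ in range(epochs):
        y_hat = sigmoid(X @ w + b)
        w -= lr * (X.T @ (y_hat - y)) / m   # standard gradient of J w.r.t. w
        b -= lr * np.mean(y_hat - y)        # standard gradient of J w.r.t. b
    return w, b

X = np.array([[0.2, 1.1], [0.9, 0.4], [1.5, 0.2], [0.1, 1.8]])
y = np.array([0, 1, 1, 0])
w, b = fit(X, y)
print("cost:", cost(X, y, w, b))
print("probabilities:", sigmoid(X @ w + b))
```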