Chapter 6 Boosting
2022-07-28 13:18:00 【Sang zhiweiluo 0208】
1 Characteristics of Random Forests
Each decision tree in a random forest is built on its own independently drawn sample of the data, so the trees are relatively independent of one another.
Note: there are two ways to turn weak classifiers into a strong classifier: sample weighting and classifier weighting.
Sample weighting: when a sample is misclassified, increase its weight so that later classifiers pay more attention to it.
Classifier weighting: weak classifiers with a low misclassification rate are given a higher weight in the final result.
Here the weight is the coefficient applied to each classifier's predicted value.
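As a rough sketch of classifier weighting (the weak classifiers, weights, and data below are made up for illustration), the strong classifier takes a weighted vote of the weak predictions:

```python
import numpy as np

# Three hypothetical weak classifiers: each gives +1/-1 predictions for 5 samples.
weak_preds = np.array([
    [ 1,  1, -1,  1, -1],   # classifier 1
    [ 1, -1, -1,  1,  1],   # classifier 2
    [-1,  1, -1,  1, -1],   # classifier 3
])

# Classifier weights: lower-error classifiers get larger weights.
alphas = np.array([0.8, 0.3, 0.5])

# Strong classifier = sign of the weighted sum of the weak predictions.
strong_pred = np.sign(alphas @ weak_preds)
print(strong_pred)   # [ 1.  1. -1.  1. -1.]
```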
2 Boosting
2.1 Boosting
Boosting is a machine learning technique that can be used for both regression and classification. At each step it fits a weak prediction model (such as a decision tree) and adds it, with a weight, to the overall model.
Gradient boosting: if the weak prediction model at each step is generated along the gradient direction of the loss function, the method is called gradient boosting.
Theoretical significance of boosting: if a weak classifier exists for a problem, it can be boosted into a strong classifier.
2.2 Gradient Boosting
Gradient boosting first specifies a target loss function (chosen according to the actual problem, independently of the boosting procedure itself), whose domain is the set of all feasible weak functions (base functions). Through iteration, the boosting algorithm selects a weak function pointing in the direction of the negative gradient, gradually approaching a local minimum of the loss. This view of gradient boosting in function space has had a profound influence on many areas of machine learning.
2.3 The Boosting Algorithm
Given input vectors $\vec{x}$ and output variables $y$ forming training samples $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, the goal is to find an approximation $\hat{F}(\vec{x})$ of the target function $F^{*}(\vec{x})$ that minimizes the loss function $L(y, F(\vec{x}))$.
Typical choices of the loss function $L(y, F(\vec{x}))$ are the squared loss $$L(y, F(\vec{x})) = \frac{1}{2}\big(y - F(\vec{x})\big)^2$$ or the absolute loss $$L(y, F(\vec{x})) = \big|y - F(\vec{x})\big|$$
Suppose the optimal function is $F^{*}(\vec{x})$, namely: $$F^{*}(\vec{x}) = \underset{F}{\arg\min}\; E_{(x,y)}\big[L(y, F(\vec{x}))\big]$$
Assume $F(\vec{x})$ is a weighted sum of base functions $f_i(\vec{x})$ from some family: $$F(\vec{x}) = \sum_{i=1}^{M} \gamma_i f_i(\vec{x}) + \mathrm{const}$$
Proof that the median is the optimal constant under the absolute loss:
Given samples $x_1, x_2, \ldots, x_n$, find the constant $c$ that minimizes $$J(c) = \sum_{i=1}^{n} \big|x_i - c\big|$$
Take the partial derivative with respect to $c$ and set it equal to 0: $$\frac{\partial J}{\partial c} = \sum_{i=1}^{n} \operatorname{sign}(c - x_i) = 0$$
This holds when the number of samples before $c$ (the first $k$ after sorting) equals the number of samples after $c$ (the remaining $n-k$), i.e., when $c$ is the median.
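A quick numerical check of this result, using made-up sample values: the sum of absolute deviations is minimized at the sample median.

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0, 5.0, 9.0])               # illustrative samples
cs = np.linspace(x.min(), x.max(), 1001)               # candidate constants c
loss = np.abs(x[None, :] - cs[:, None]).sum(axis=1)    # sum_i |x_i - c| for each c

print(cs[np.argmin(loss)], np.median(x))               # both print 2.0, the median
```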
Derivation of the boosting algorithm (greedy, stagewise): start from a constant function $$F_0(\vec{x}) = \underset{\gamma}{\arg\min}\sum_{i=1}^{n} L(y_i, \gamma)$$ and, at each step, add the base function that most reduces the loss: $$F_m(\vec{x}) = F_{m-1}(\vec{x}) + \underset{f \in H}{\arg\min}\sum_{i=1}^{n} L\big(y_i,\ F_{m-1}(\vec{x}_i) + f(\vec{x}_i)\big)$$
Gradient approximation: solving the inner minimization exactly is generally intractable, so a steepest-descent step is taken instead: $$F_m(\vec{x}) = F_{m-1}(\vec{x}) - \gamma_m \sum_{i=1}^{n} \nabla_{F} L\big(y_i,\ F_{m-1}(\vec{x}_i)\big)$$ where the step length $\gamma_m$ is chosen by line search along the negative gradient.
The full boosting algorithm therefore iterates two steps: fit a base learner to the negative gradient of the loss, then take a line-search step along it and add the result to the model.
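Below is a minimal sketch of this procedure for regression with squared loss, where the negative gradient is simply the residual $y - F_{m-1}(x)$; the choice of shallow trees as base learners, the learning rate, and the number of rounds are illustrative assumptions, not prescribed above.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_rounds=100, lr=0.1, max_depth=2):
    """Gradient boosting for regression under squared loss."""
    # F_0: the constant minimizing squared loss is the mean of y.
    f0 = y.mean()
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_rounds):
        residuals = y - pred          # negative gradient of 1/2 (y - F)^2 w.r.t. F
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        pred = pred + lr * tree.predict(X)   # small step along the fitted direction
        trees.append(tree)
    return f0, trees

def boosted_predict(f0, trees, X, lr=0.1):
    return f0 + lr * sum(tree.predict(X) for tree in trees)

# Tiny usage example on synthetic data (made up for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
f0, trees = gradient_boost(X, y)
print(np.mean((boosted_predict(f0, trees, X) - y) ** 2))   # training MSE
```

A fixed learning rate (shrinkage) is used here in place of the per-round line search described above; that is a common practical simplification.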
3 Gradient Boosting Decision Tree (GBDT)
3.1 Definition
3.2 Summary
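As a usage-level sketch (the dataset, split, and hyperparameters below are illustrative, not from the text), GBDT is available directly in scikit-learn:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# GBDT: each round fits a shallow regression tree to the negative gradient.
gbdt = GradientBoostingRegressor(n_estimators=200, learning_rate=0.1, max_depth=3)
gbdt.fit(X_tr, y_tr)
print(gbdt.score(X_te, y_te))   # R^2 on held-out data
```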
4 Objective function
4.1 Second order derivative information

4.2 Calculation of objective function
4.3 Simplification of objective function
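A sketch of what the second-order treatment of such an objective typically looks like (an additive model with a regularized tree, in the style of XGBoost, is assumed here rather than taken from the text): the loss at round $t$ is expanded to second order around the previous prediction, and the simplified objective then gives a closed-form optimal leaf weight.

```latex
% Objective at round t: loss plus regularization of the new tree f_t
\mathrm{Obj}^{(t)} = \sum_{i=1}^{n} L\!\big(y_i,\ \hat{y}_i^{(t-1)} + f_t(x_i)\big) + \Omega(f_t)

% Second-order Taylor expansion around \hat{y}_i^{(t-1)},
% with g_i the first and h_i the second derivative of L at \hat{y}_i^{(t-1)}
\mathrm{Obj}^{(t)} \approx \sum_{i=1}^{n}
  \Big[ L\big(y_i, \hat{y}_i^{(t-1)}\big) + g_i f_t(x_i) + \tfrac{1}{2} h_i f_t^2(x_i) \Big]
  + \Omega(f_t)

% Dropping constants, writing f_t by leaf weights w_j over leaf index sets I_j,
% and taking \Omega(f_t) = \gamma T + \tfrac{1}{2}\lambda \sum_j w_j^2,
% the per-leaf optimum and the simplified objective are
w_j^{*} = -\frac{\sum_{i \in I_j} g_i}{\sum_{i \in I_j} h_i + \lambda},
\qquad
\mathrm{Obj}^{*} = -\tfrac{1}{2} \sum_{j=1}^{T}
  \frac{\big(\sum_{i \in I_j} g_i\big)^2}{\sum_{i \in I_j} h_i + \lambda} + \gamma T
```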
5 AdaBoost
5.1 AdaBoost Definition
Given a training data set $T = \{(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N)\}$, where $x_i \in \mathcal{X} \subseteq \mathbb{R}^n$ and $y_i \in \{-1, +1\}$.
Initialize the weight distribution over the training data: $$D_1 = (w_{11}, \ldots, w_{1i}, \ldots, w_{1N}), \qquad w_{1i} = \frac{1}{N}, \quad i = 1, 2, \ldots, N$$
5.2 AdaBoost Algorithm
For $m = 1, 2, \ldots, M$:
(a) Learn a base classifier $G_m(x): \mathcal{X} \to \{-1, +1\}$ on the training data weighted by $D_m$.
(b) Compute the weighted error rate of $G_m(x)$: $$e_m = \sum_{i=1}^{N} w_{mi}\, I\big(G_m(x_i) \neq y_i\big)$$
(c) Compute the coefficient of $G_m(x)$: $$\alpha_m = \frac{1}{2}\ln\frac{1 - e_m}{e_m}$$
(d) Update the weight distribution: $$w_{m+1,i} = \frac{w_{mi}}{Z_m}\exp\big(-\alpha_m\, y_i\, G_m(x_i)\big), \qquad Z_m = \sum_{i=1}^{N} w_{mi}\exp\big(-\alpha_m\, y_i\, G_m(x_i)\big)$$
The final classifier is $$G(x) = \operatorname{sign}\Big(\sum_{m=1}^{M} \alpha_m G_m(x)\Big)$$
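The following is a compact sketch of these steps using one-dimensional threshold stumps as base classifiers; the stump search, the small epsilon guard, and the early-stop check are implementation conveniences, not part of the algorithm statement above.

```python
import numpy as np

def adaboost_stumps(X, y, n_rounds=10):
    """AdaBoost with threshold stumps on a 1-D feature; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                              # D1: uniform weights
    stumps, alphas = [], []
    thresholds = np.arange(X.min() - 0.5, X.max() + 1.0)  # candidate split points
    for _ in range(n_rounds):
        # (a) pick the stump (threshold v, polarity s) with the lowest weighted error
        e_m, v, s = min(
            ((w[np.where(X < v, s, -s) != y].sum(), v, s)
             for v in thresholds for s in (1, -1)),
            key=lambda t: t[0],
        )
        # (b) coefficient of the base classifier
        alpha = 0.5 * np.log((1 - e_m) / max(e_m, 1e-12))
        # (c) re-weight the samples and normalize
        pred = np.where(X < v, s, -s)
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((v, s))
        alphas.append(alpha)
        # stop once sign(sum_m alpha_m G_m(x)) is perfect on the training data
        F = sum(a * np.where(X < vv, ss, -ss) for a, (vv, ss) in zip(alphas, stumps))
        if np.all(np.sign(F) == y):
            break
    return stumps, alphas
```

Run on the data of the worked example below, this search should reproduce the thresholds 2.5 and 8.5 chosen in the first two rounds.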
5.3 Illustrate with examples
m=1:
| Serial number | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| x | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
| y | 1 | 1 | 1 | -1 | -1 | -1 | 1 | 1 | 1 | -1 |
On the training data weighted by $D_1$, the error rate is lowest when the threshold $v$ is 2.5, so the base classifier is $$G_1(x) = \begin{cases} 1, & x < 2.5 \\ -1, & x > 2.5 \end{cases}$$
The samples with x = 6, 7, 8 are misclassified, so the error rate is $e_1 = 3 \times 0.1 = 0.3$.
Plug $e_1$ into the formula for the coefficient of $G_1$: $$\alpha_1 = \frac{1}{2}\ln\frac{1 - e_1}{e_1} = \frac{1}{2}\ln\frac{0.7}{0.3} = 0.4236$$
The combined classifier so far is $f_1(x) = \alpha_1 G_1(x) = 0.4236\, G_1(x)$; on the training set, $\operatorname{sign}(f_1(x))$ has 3 misclassified points.
Update and compute the new weights: $$w_{2,i} = \frac{w_{1,i}}{Z_1}\exp\big(-\alpha_1\, y_i\, G_1(x_i)\big) \;\Rightarrow\; D_2 = (0.0715,\ 0.0715,\ 0.0715,\ 0.0715,\ 0.0715,\ 0.0715,\ 0.1666,\ 0.1666,\ 0.1666,\ 0.0715)$$
Note that the weights of the misclassified points x = 6, 7, 8 have increased. These weights are used when training the next base classifier, i.e., in round m = 2.
m=2:
| x | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| y | 1 | 1 | 1 | -1 | -1 | -1 | 1 | 1 | 1 | -1 |
| w | 0.0715 | 0.0715 | 0.0715 | 0.0715 | 0.0715 | 0.0715 | 0.1666 | 0.1666 | 0.1666 | 0.0715 |
On the training data weighted by $D_2$, the error rate is lowest when the threshold $v$ is 8.5, so the base classifier is $$G_2(x) = \begin{cases} 1, & x < 8.5 \\ -1, & x > 8.5 \end{cases}$$
The samples with x = 3, 4, 5 are misclassified, so the error rate is $e_2 = 3 \times 0.0715 \approx 0.2143$.
Plug $e_2$ into the formula for the coefficient of $G_2$: $$\alpha_2 = \frac{1}{2}\ln\frac{1 - e_2}{e_2} = 0.6496$$
The combined classifier so far is $f_2(x) = 0.4236\, G_1(x) + 0.6496\, G_2(x)$; on the training set, $\operatorname{sign}(f_2(x))$ has 3 misclassified points.
Update and compute the new weights: $$D_3 = (0.0455,\ 0.0455,\ 0.0455,\ 0.1667,\ 0.1667,\ 0.1667,\ 0.1060,\ 0.1060,\ 0.1060,\ 0.0455)$$
Note that the weights of the misclassified points x = 3, 4, 5 have increased. These weights are used for the next base classifier, i.e., in round m = 3.
Round m = 3 and later rounds proceed by analogy ……
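The numbers in the two rounds above can be checked directly; the short script below recomputes them and should reproduce $e_1 = 0.3$, $\alpha_1 \approx 0.4236$, the $D_2$ weights, $e_2 \approx 0.2143$, and $\alpha_2 \approx 0.6496$ up to rounding.

```python
import numpy as np

x = np.arange(10)
y = np.array([1, 1, 1, -1, -1, -1, 1, 1, 1, -1])
w = np.full(10, 0.1)                          # D1

# Round m=1: stump with threshold 2.5
G1 = np.where(x < 2.5, 1, -1)
e1 = w[G1 != y].sum()
a1 = 0.5 * np.log((1 - e1) / e1)
w = w * np.exp(-a1 * y * G1); w /= w.sum()    # D2
print(e1, round(a1, 4), np.round(w, 4))

# Round m=2: stump with threshold 8.5
G2 = np.where(x < 8.5, 1, -1)
e2 = w[G2 != y].sum()
a2 = 0.5 * np.log((1 - e2) / e2)
print(round(e2, 4), round(a2, 4))
```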
5.4 Key points about the weights and the error rate
