
Feature scaling normalization

2022-07-04 22:44:00 Melody2050

Purpose of feature scaling

For most machine learning and optimization algorithms, scaling the feature values to a common interval generally leads to better model performance.

For example:

(a) Suppose there are two features: the first ranges from 1 to 10, and the second from 1 to 10000. With gradient descent on a least-squares cost function, the algorithm will be clearly biased toward the second feature, simply because its value range is much larger.
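A minimal sketch of this effect, using hypothetical data with the feature ranges above: the gradient component for the large-range feature dwarfs the one for the small-range feature, so gradient descent updates are dominated by it.

```python
import numpy as np

# Hypothetical data: feature 1 in [1, 10], feature 2 in [1, 10000]
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(1, 10, 100),      # feature 1: small range
    rng.uniform(1, 10000, 100),   # feature 2: large range
])
y = X @ np.array([2.0, 3.0]) + rng.normal(0, 1, 100)

# Gradient of the least-squares cost at w = 0
w = np.zeros(2)
grad = -2 * X.T @ (y - X @ w) / len(y)

# The component for feature 2 is orders of magnitude larger,
# so the step direction is dominated by that single feature.
ratio = abs(grad[1]) / abs(grad[0])
print(ratio)
```

In practice this forces a tiny learning rate (set by the steepest direction) and slow progress along the flat direction; scaling both features first makes the cost surface much better conditioned.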

(b) The k-nearest-neighbor algorithm uses Euclidean distance, which likewise makes it favor the second feature. For decision trees, random forests, and XGBoost, feature scaling has little effect.
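A small illustration with two hypothetical samples: feature 1 changes across almost its whole range while feature 2 changes only slightly relative to its range, yet the Euclidean distance is driven almost entirely by feature 2.

```python
import numpy as np

# Two hypothetical samples:
#   feature 1 differs by 9 (nearly its whole 1-10 range),
#   feature 2 differs by 100 (a tiny fraction of its 1-10000 range).
a = np.array([1.0, 5000.0])
b = np.array([10.0, 5100.0])

dist = np.linalg.norm(a - b)

# Fraction of the squared distance contributed by feature 2
contribution_f2 = (a[1] - b[1]) ** 2 / np.sum((a - b) ** 2)
print(dist, contribution_f2)
```

The second feature accounts for over 99% of the squared distance here, so a kNN classifier would effectively ignore feature 1 unless the features are scaled first.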

Common feature scaling methods

  1. Standardization: subtract the mean and divide by the standard deviation.
  2. Normalization (min-max scaling): use the min and max to rescale values into [0, 1].
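The two methods above can be sketched as follows (a minimal NumPy version, applied column-wise; the sample matrix is made up for illustration):

```python
import numpy as np

def standardize(X):
    """Z-score standardization: subtract the column mean, divide by the std."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def min_max_normalize(X):
    """Min-max normalization: rescale each column into [0, 1]."""
    mins, maxs = X.min(axis=0), X.max(axis=0)
    return (X - mins) / (maxs - mins)

# Hypothetical feature matrix with the two ranges from the example above
X = np.array([[ 1.0,   100.0],
              [ 5.0,  5000.0],
              [10.0, 10000.0]])

Xs = standardize(X)        # each column: mean 0, std 1
Xn = min_max_normalize(X)  # each column: values in [0, 1]
```

Note that both versions divide by a per-column statistic, so a constant column (std or max-min equal to 0) would need special handling in real code.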

Copyright notice

This article was written by [Melody2050]; please include a link to the original when reposting. Thanks.
https://yzsam.com/2022/185/202207042206125845.html