TensorFlow2 study notes: 6. Overfitting and underfitting, and their mitigation solutions
2022-08-04 06:05:00 【Live up to [email protected]】
1. What is overfitting and underfitting
Two of the most common outcomes in both machine learning and deep learning modeling are overfitting and underfitting.
Overfitting
Definition: Overfitting means the model fits the training data too closely. In the evaluation metrics, the model performs very well on the training set but poorly on the test set and on new data. In plain terms, the model learns the training data so thoroughly that it also learns the noise, so it later fails to classify unseen data correctly; its generalization ability is poor.
Underfitting
Definition: Underfitting means the model performs poorly during both training and prediction. In the evaluation metrics, the model performs badly on both the training set and the test set. The model has not captured the features of the data and cannot fit it well.
Intuitively, the three fit states (underfitting, good fit, overfitting) look as follows:
Figure: the three fit states in regression algorithms
Figure: the three fit states in classification algorithms
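Both states can be reproduced in a few lines. The following is a minimal NumPy sketch (the data and polynomial degrees are made up for illustration): a degree-1 polynomial underfits noisy sine data (high error on both sets), while a degree-9 polynomial overfits it (near-zero training error, much larger test error):

```python
import numpy as np

rng = np.random.default_rng(0)

# 10 noisy training samples of y = sin(2*pi*x)
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 10)

# clean test samples from the same underlying function
x_test = np.linspace(0.05, 0.95, 10)
y_test = np.sin(2 * np.pi * x_test)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# degree 1: too simple -> underfits (bad on train AND test)
under = np.polyfit(x_train, y_train, 1)
# degree 9: interpolates the noise -> overfits (great on train, bad on test)
over = np.polyfit(x_train, y_train, 9)

print(f"underfit: train={mse(under, x_train, y_train):.3f} test={mse(under, x_test, y_test):.3f}")
print(f"overfit:  train={mse(over, x_train, y_train):.3g} test={mse(over, x_test, y_test):.3g}")
```

The large gap between training and test error is the signature of overfitting; similar error on both sets, but high, is the signature of underfitting.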
2. Solutions to overfitting
- Clean the data
- Enlarge the training set
- Use regularization
- Increase the regularization parameter
3. Solutions to underfitting
- Add more input features
- Increase network parameters (model capacity)
- Reduce the regularization parameter
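To see why "increase the regularization parameter" counters overfitting, here is a small hypothetical NumPy sketch (the closed-form ridge fit and the data are illustrative, not from the original notes): as the L2 penalty coefficient grows, the learned weights shrink, which limits how wildly the model can bend to fit noise:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 10)

def ridge_polyfit(x, y, degree, lam):
    """Closed-form L2-regularized (ridge) fit on polynomial features:
    w = (X^T X + lam * I)^{-1} X^T y"""
    X = np.vander(x, degree + 1)
    I = np.eye(degree + 1)
    return np.linalg.solve(X.T @ X + lam * I, X.T @ y)

# larger lam -> smaller weight norm -> smoother, less overfit model
for lam in (1e-6, 1e-2, 1.0):
    w = ridge_polyfit(x, y, 9, lam)
    print(f"lam={lam:g}  ||w|| = {np.linalg.norm(w):.4f}")
```

The weight norm decreases monotonically as the penalty coefficient increases; conversely, reducing the coefficient frees the model to use larger weights, which is why the underfitting list above recommends decreasing it.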
4. Regularization and how to use it

- L1 regularization: add the sum of the absolute values of all weights w to the loss. L1 tends to drive many weights exactly to 0, so it reduces model complexity through sparsity (i.e., by reducing the number of effective parameters).
- L2 regularization: add the sum of the squares of all weights w to the loss. L2 drives weights close to 0 without making them exactly 0, so it reduces complexity by shrinking weight magnitudes, which suppresses overfitting caused by noise in the dataset.
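The two penalties defined above can be computed directly. A minimal NumPy sketch (the weight values are made up for illustration):

```python
import numpy as np

w = np.array([-1.0, 2.0, 0.0, 0.5])   # example weight vector

l1_penalty = float(np.sum(np.abs(w)))  # |-1| + |2| + |0| + |0.5| = 3.5
l2_penalty = float(np.sum(w ** 2))     # 1 + 4 + 0 + 0.25 = 5.25

# The regularized loss adds the penalty, scaled by a coefficient lam:
#   total_loss = loss + lam * penalty
lam = 0.01
print(l1_penalty, l2_penalty)  # 3.5 5.25
```

In TensorFlow 2 the same penalties are attached per layer through the `tf.keras.regularizers` module, e.g. `tf.keras.layers.Dense(64, kernel_regularizer=tf.keras.regularizers.l2(0.01))`; `tf.keras.regularizers.l1` and `l1_l2` are used the same way.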

Copyright notice
This article was created by [Live up to [email protected]]. Please include a link to the original when reposting. Thanks.
https://yzsam.com/2022/216/202208040525327629.html