7. Regularization application
2022-07-08 01:01:00 【booze-J】
I. Application of regularization
Starting from the model-construction code in the 6.Dropout application post (the version that does not use Dropout), we add regularization to the network model. Take this code from 6.Dropout application:
# Create the model: 784 inputs, 10 outputs
model = Sequential([
    # First hidden layer: 784 inputs, 200 units, biases initialized to 1, tanh activation
    Dense(units=200, input_dim=784, bias_initializer='one', activation="tanh"),
    # Second hidden layer: 100 units
    Dense(units=100, bias_initializer='one', activation="tanh"),
    # Output layer: 10 units, softmax activation
    Dense(units=10, bias_initializer='one', activation="softmax")
])
and amend it as follows:
# Create the model: 784 inputs, 10 outputs
model = Sequential([
    # First hidden layer: 784 inputs, 200 units, biases initialized to 1, tanh activation,
    # with an L2 penalty on the layer's weights
    Dense(units=200, input_dim=784, bias_initializer='one', activation="tanh", kernel_regularizer=l2(0.0003)),
    # Second hidden layer: 100 units
    Dense(units=100, bias_initializer='one', activation="tanh", kernel_regularizer=l2(0.0003)),
    # Output layer: 10 units, softmax activation
    Dense(units=10, bias_initializer='one', activation="softmax", kernel_regularizer=l2(0.0003))
])
Before using l2 regularization, you need to import it first: from keras.regularizers import l2.
Running results:
The running results show that the earlier overfitting has clearly been reduced. However, this model is not very complex relative to the dataset, so adding regularization will not necessarily improve it by much.
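As a reminder of what the l2(0.0003) factor does, here is a minimal pure-Python sketch of how an L2 (weight-decay) penalty enters the loss. The helper names l2_penalty and total_loss are illustrative, not Keras API:

```python
def l2_penalty(weights, lam=0.0003):
    # Keras's l2(lam) adds lam * sum(w^2) over the layer's weights to the loss.
    return lam * sum(w * w for w in weights)

def total_loss(data_loss, weights, lam=0.0003):
    # The optimizer minimizes data loss plus the penalty, which
    # discourages large weights and hence overly complex models.
    return data_loss + l2_penalty(weights, lam)

# Example: a data loss of 0.5 with weights [1.0, -2.0]
print(total_loss(0.5, [1.0, -2.0]))  # 0.5 + 0.0003 * (1 + 4) = 0.5015
```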
Complete code
The code was run in jupyter-notebook. The code blocks in this article are split in notebook-cell order, so to run them you can paste each block directly into jupyter-notebook.
1. Import third-party libraries
import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense
from tensorflow.keras.optimizers import SGD
from keras.regularizers import l2
2. Load and preprocess the data
# Load data
(x_train,y_train),(x_test,y_test) = mnist.load_data()
# (60000, 28, 28)
print("x_shape:\n",x_train.shape)
# (60000,) - labels are not yet one-hot encoded; that is done below
print("y_shape:\n",y_train.shape)
# (60000, 28, 28) -> (60000, 784); passing -1 to reshape() lets that dimension be inferred automatically; dividing by 255.0 normalizes pixels to [0, 1]
x_train = x_train.reshape(x_train.shape[0],-1)/255.0
x_test = x_test.reshape(x_test.shape[0],-1)/255.0
# One-hot encode the labels
y_train = np_utils.to_categorical(y_train,num_classes=10)
y_test = np_utils.to_categorical(y_test,num_classes=10)
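For readers unfamiliar with one-hot encoding, the following illustrative helper (written for demonstration, not the actual Keras implementation) mimics what np_utils.to_categorical(y, num_classes=10) produces for integer labels:

```python
def to_one_hot(labels, num_classes=10):
    # Turn each integer label into a vector of zeros with a single 1.0
    # at the label's index, e.g. 3 -> [0,0,0,1,0,0,0,0,0,0].
    vecs = []
    for y in labels:
        v = [0.0] * num_classes
        v[y] = 1.0
        vecs.append(v)
    return vecs

print(to_one_hot([3, 0]))
```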
3. Build, train, and evaluate the model
# Create the model: 784 inputs, 10 outputs
model = Sequential([
    # First hidden layer: 784 inputs, 200 units, biases initialized to 1, tanh activation,
    # with an L2 penalty on the layer's weights
    Dense(units=200, input_dim=784, bias_initializer='one', activation="tanh", kernel_regularizer=l2(0.0003)),
    # Second hidden layer: 100 units
    Dense(units=100, bias_initializer='one', activation="tanh", kernel_regularizer=l2(0.0003)),
    # Output layer: 10 units, softmax activation
    Dense(units=10, bias_initializer='one', activation="softmax", kernel_regularizer=l2(0.0003))
])
# Define optimizer
sgd = SGD(lr=0.2)
# Compile: set the optimizer and loss function, and report accuracy during training
model.compile(
optimizer=sgd,
loss="categorical_crossentropy",
metrics=['accuracy']
)
# Training models
model.fit(x_train,y_train,batch_size=32,epochs=10)
# Evaluate the model
# Test-set loss and accuracy
loss,accuracy = model.evaluate(x_test,y_test)
print("\ntest loss",loss)
print("test_accuracy:",accuracy)
# Training-set loss and accuracy
loss,accuracy = model.evaluate(x_train,y_train)
print("\ntrain loss",loss)
print("train_accuracy:",accuracy)
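To see how the optimizer and the regularizer interact, here is a hand-rolled sketch of a single SGD update for one weight, using the same lr=0.2 and l2(0.0003) values as above. The function sgd_step is an illustrative helper, not part of Keras:

```python
def sgd_step(w, grad, lr=0.2, lam=0.0003):
    # The gradient of the penalty lam * w^2 is 2 * lam * w, so the update
    # adds a term that shrinks each weight toward zero ("weight decay").
    return w - lr * (grad + 2 * lam * w)

# With zero data gradient, the weight still decays slightly:
print(sgd_step(1.0, 0.0))  # 1.0 - 0.2 * 0.0006 = 0.99988
```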