Watermelon Book -- Chapter 5: Neural Networks
2022-07-02 09:11:00 【Qigui】
Personal motto: The most important part of any building is its foundation; if the foundation is unstable, the whole structure shakes. Learning technology is the same: lay a solid foundation first, and I will help you firm up the fundamentals of each topic.
Blog home page: Qigui's blog
Artificial neural network (ANN): an artificial network system that simulates the structure and function of the human brain's nervous system, composed of a large number of simple processing units that are widely interconnected.
I. Neurons
A neuron usually has multiple dendrites, which mainly receive incoming information, and only one axon, which can transmit information to many other neurons. Where an axon terminal meets a dendrite of another neuron, a connection is formed for passing signals; each connection carries a weight, and the value of that weight is what training must learn. In other words, every connection corresponds to a different weight.

The neuron model is a model with inputs, an output, and a computation function. The inputs can be likened to a neuron's dendrites, the output to its axon, and the computation to its nucleus. The neuron model is the most basic building block of a neural network.
Again, the connections are what matter most in a neuron: each connection has a weight.
Training a neural network means adjusting the weight values until the prediction performance of the whole network is as good as possible.

The model still in use today is the M-P neuron model. In this model, a neuron receives input signals from n other neurons; these signals are passed along weighted connections, the total input received is compared with the neuron's threshold, and the result is then processed by an activation function to produce the neuron's output.
Next comes the computation performed by a neuron. The inputs go through three mathematical steps:
1. Multiply each input by its weight: x1 --> x1 * w1; x2 --> x2 * w2
2. Sum the weighted inputs and add the bias b: (x1 * w1) + (x2 * w2) + b
3. Pass the sum through the activation function to get the output: y = f((x1 * w1) + (x2 * w2) + b)
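As a minimal, runnable sketch of these three steps (all numbers below are made-up illustration values; sigmoid, introduced just below, is used as the activation function f):

import numpy as np

def sigmoid(x):
    # squash any real number into the range (0, 1)
    return 1 / (1 + np.exp(-x))

# made-up example values: two inputs, their weights, and a bias
x1, x2 = 2.0, 3.0
w1, w2 = 0.5, -1.0
b = 1.0

# steps 1 and 2: multiply by the weights and sum, adding the bias
total = x1 * w1 + x2 * w2 + b
# step 3: pass the sum through the activation function
y = sigmoid(total)
print(y)  # the output of this single neuron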

Activation functions:
A nonlinear function is introduced into the neural network as the activation function, so that the output is no longer just a linear combination of the inputs and the network can approximate almost any function.
The role of the activation function is to map an unbounded input to an output in a predictable range; the most commonly used activation function is the sigmoid function.
The sigmoid function, sigmoid(x) = 1 / (1 + e^(-x)), squeezes input values that may vary over a wide range into the output range (0, 1), so it is sometimes called a "squashing function".
import matplotlib.pyplot as plt
import numpy as np

def sigmoid(x):
    # return the sigmoid of x directly
    return 1 / (1 + np.exp(-x))

def plot_sigmoid():
    # np.arange parameters: start, stop, step
    x = np.arange(-8, 8, 0.1)
    y = sigmoid(x)
    plt.plot(x, y)
    plt.show()

if __name__ == '__main__':
    plot_sigmoid()
II. Neural Networks
For a multilayer neural network, more layers is not necessarily better.
How to build a neural network:
1. Building a neural network means connecting multiple neurons together.
2. The example network here has 2 inputs, one hidden layer containing 2 neurons (h1 and h2), and an output layer containing 1 neuron (o1); a sketch of its forward pass follows this list.
3. The hidden layer is the part sandwiched between the input layer and the output layer; a neural network can have multiple hidden layers.
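Here is a minimal sketch of the forward pass of that 2-2-1 network; all weights and biases are made-up values, and sigmoid is used as the activation function.

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def forward(x, params):
    # hidden layer: h1 and h2 each look at both inputs
    h1 = sigmoid(np.dot(params["w_h1"], x) + params["b_h1"])
    h2 = sigmoid(np.dot(params["w_h2"], x) + params["b_h2"])
    # output layer: o1 looks at the outputs of h1 and h2
    return sigmoid(np.dot(params["w_o1"], np.array([h1, h2])) + params["b_o1"])

# made-up weights and biases, just to show the data flow
params = {
    "w_h1": np.array([0.5, -0.6]), "b_h1": 0.1,
    "w_h2": np.array([-0.3, 0.8]), "b_h2": -0.2,
    "w_o1": np.array([1.0, -1.0]), "b_o1": 0.0,
}
print(forward(np.array([2.0, 3.0]), params))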

III. Perceptrons (see Statistical Learning Methods) and Multi-layer Networks
A perceptron consists of two layers of neurons. The input layer receives external input signals and passes them to the output layer; the output-layer neurons are M-P neurons, also known as "threshold logic units".
The perceptron is a linear classification model for binary classification. A single-layer perceptron can only handle linearly separable problems; it cannot handle nonlinear problems!!! Only the output-layer neurons of a perceptron apply an activation function, i.e. there is only one layer of functional neurons. To solve nonlinear problems, multiple layers of functional neurons are needed. The layers of neurons between the input layer and the output layer are called hidden layers; both hidden-layer and output-layer neurons are functional neurons with activation functions.
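A minimal sketch of the single-layer perceptron learning rule as described in Statistical Learning Methods (sign activation, labels in {-1, +1}); the toy data and learning rate below are made-up. It converges only when the data are linearly separable, which is exactly the limitation noted above.

import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    # X: (n_samples, n_features), y: labels in {-1, +1}
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # a sample is misclassified when yi * (w . xi + b) <= 0
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi   # update rule: w <- w + eta * y * x
                b += lr * yi        # update rule: b <- b + eta * y
                errors += 1
        if errors == 0:   # every sample classified correctly, stop early
            break
    return w, b

# a linearly separable toy problem (logical AND, with labels -1/+1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, -1, -1, 1])
print(train_perceptron(X, y))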

In such a network, each layer of neurons is fully connected to the next layer; there are no connections between neurons in the same layer and no connections that skip layers. A network with this structure is usually called a "multilayer feedforward neural network".
Input-layer neurons only receive input and perform no further processing, while the hidden layer and the output layer contain functional neurons. A network with one hidden layer and one output layer of functional neurons is called a "two-layer network"; as long as a network contains hidden layers, it can be called a "multilayer network".
IV. Error Backpropagation Algorithm (the BP Algorithm)
BP is an iterative learning algorithm: in each round of iteration it uses a generalized perceptron learning rule to update the parameters.
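Concretely (this is the standard gradient-descent form of the BP update, stated here for completeness): for any connection weight or threshold v in the network, with learning rate η and E_k the error on the current training example, BP performs the update

v <- v - η * ∂E_k / ∂v

so every parameter is nudged a small step in the direction that reduces the error.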
Common choices of activation function: the sigmoid function, the tanh function, the ReLU function, and the Leaky ReLU function.
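For reference, minimal NumPy sketches of these activation functions (the 0.01 slope used for Leaky ReLU below is a common choice, not a fixed standard):

import numpy as np

def tanh(x):
    # squashes inputs into (-1, 1)
    return np.tanh(x)

def relu(x):
    # zero for negative inputs, identity for positive inputs
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # like ReLU, but negative inputs keep a small slope alpha instead of being zeroed
    return np.where(x > 0, x, alpha * x)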

Algorithm flow (the loss function used here is the mean squared error, MSE):
import numpy as np

def mse_loss(y_true, y_pre):
    # y_true and y_pre are NumPy arrays of the same length
    return ((y_true - y_pre) ** 2).mean()
Test code:
y_true = np.array([1, 0, 0, 1])
y_pre = np.array([0, 0, 0, 0])
print(mse_loss(y_true, y_pre))  # 0.5
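Putting the pieces together, below is a minimal sketch of BP training for the tiny 2-2-1 network from Part II, using sigmoid activations and the squared-error loss above. The toy data, learning rate and epoch count are made-up, and the gradients are written out by hand for this specific architecture rather than taken from the original post.

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def deriv_sigmoid(x):
    # derivative of sigmoid: s * (1 - s)
    s = sigmoid(x)
    return s * (1 - s)

class TinyNetwork:
    # 2 inputs -> hidden neurons h1, h2 -> output neuron o1
    def __init__(self):
        rng = np.random.default_rng(0)
        self.w = rng.normal(size=6)   # w1..w6
        self.b = rng.normal(size=3)   # b1, b2, b3

    def forward(self, x):
        w, b = self.w, self.b
        z1 = w[0] * x[0] + w[1] * x[1] + b[0]   # pre-activation of h1
        z2 = w[2] * x[0] + w[3] * x[1] + b[1]   # pre-activation of h2
        h1, h2 = sigmoid(z1), sigmoid(z2)
        z3 = w[4] * h1 + w[5] * h2 + b[2]       # pre-activation of o1
        return z1, z2, h1, h2, z3, sigmoid(z3)

    def train(self, X, y_all, lr=0.1, epochs=1000):
        for epoch in range(epochs):
            for x, y in zip(X, y_all):
                z1, z2, h1, h2, z3, o1 = self.forward(x)
                # gradient of the squared error (y - o1)^2 w.r.t. the prediction
                d_o1 = -2 * (y - o1)
                # output layer
                d_z3 = d_o1 * deriv_sigmoid(z3)
                # propagate the error back to the hidden layer
                d_z1 = d_z3 * self.w[4] * deriv_sigmoid(z1)
                d_z2 = d_z3 * self.w[5] * deriv_sigmoid(z2)
                grads_w = np.array([d_z1 * x[0], d_z1 * x[1],
                                    d_z2 * x[0], d_z2 * x[1],
                                    d_z3 * h1, d_z3 * h2])
                # gradient descent step on every weight and bias
                self.w -= lr * grads_w
                self.b -= lr * np.array([d_z1, d_z2, d_z3])
            if epoch % 200 == 0:
                preds = np.array([self.forward(x)[-1] for x in X])
                print(epoch, ((y_all - preds) ** 2).mean())   # MSE on the toy data

# made-up, linearly separable toy data
X = np.array([[2.0, 3.0], [-1.0, -2.0], [3.0, 1.0], [-2.0, -3.0]])
y_all = np.array([1.0, 0.0, 1.0, 0.0])
net = TinyNetwork()
net.train(X, y_all)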
Related posts:
Chapter 6 - Support Vector Machines (SVM): http://t.csdn.cn/q6o2F
Chapter 4 - Decision Trees: http://t.csdn.cn/3Tme3
Chapter 3 - Linear Models: http://t.csdn.cn/4S6Y6