12. Neural network model
2022-07-26 00:18:00 【WuJiaYFN】
Main topics:
- Why neural networks are introduced
- Introduction to the neural network model
- Vectorized computation in neural networks
- The relationship between neural networks and logistic regression
1. Why neural networks are introduced
- Both linear regression and logistic regression share a drawback: when there are too many features, the computational load becomes very large.
- Most machine learning problems involve many features. For nonlinear classification problems, we often need to construct polynomial terms to capture the relationships in the data. An ordinary logistic regression model cannot handle that many features effectively, and this is where neural networks come in.
2. Introduction to the neural network model
2.1 The single-neuron model

- In a neural network, the small yellow circle in the figure is called an artificial neuron with a sigmoid (logistic) activation function.
- The x_0 node is called the bias unit or bias neuron; x_0 always equals 1. It may be drawn or omitted, depending on which is more convenient in a given example.
- In neural networks, "activation function" is simply another term for the nonlinear function g(z).
- The θ parameters are called the weights of the model.
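The single-neuron computation described above can be sketched in a few lines of plain Python. This is a minimal illustration, not library code; the weight values passed in are hypothetical.

```python
import math

def sigmoid(z):
    # The logistic activation function g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + math.exp(-z))

def neuron(x, theta):
    # A single artificial neuron: prepend the bias unit x_0 = 1,
    # form the weighted sum z = theta . x, then apply g(z).
    x = [1.0] + list(x)  # bias unit x_0 = 1
    z = sum(t * xi for t, xi in zip(theta, x))
    return sigmoid(z)

# Hypothetical weights theta = [theta_0, theta_1, theta_2]:
# z = -1.0 + 0.5 * 2.0 + 0.0 * 3.0 = 0, and sigmoid(0) = 0.5
print(neuron([2.0, 3.0], [-1.0, 0.5, 0.0]))  # 0.5
```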
2.2 The neural network model
A neural network is a group of neurons connected together.

A neural network has three kinds of layers:
- The first is the input layer, which receives the input features x_1, x_2, ...
- The second is the hidden layer, whose values are not observed in the training set: they are neither x nor y. There may be more than one hidden layer, and any layer that is neither the input layer nor the output layer is called a hidden layer.
- The third is the output layer, which outputs the final computed value of the hypothesis h_θ(x).
How the neural network model computes:

- Activation: the value computed and output by a particular neuron.
- θ^(j): the weight matrix controlling the mapping from layer j to layer j+1.
- For example, in the figure, θ^(1) is the parameter matrix controlling the mapping from the three input units to the three hidden units; it is a 3 × 4 matrix.
- In general, if a neural network has s_j units in layer j and s_(j+1) units in layer j+1, then θ^(j), the matrix controlling the mapping from layer j to layer j+1, has dimensions s_(j+1) × (s_j + 1).
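The dimension rule above can be captured in a one-line helper; the extra column comes from the bias unit. This is a small sketch for checking shapes, not part of any library.

```python
def theta_shape(s_j, s_j_plus_1):
    # Theta^(j) maps layer j (s_j units) to layer j+1 (s_{j+1} units).
    # The "+ 1" column accounts for the bias unit in layer j.
    return (s_j_plus_1, s_j + 1)

# The figure's example: 3 input units -> 3 hidden units gives a 3 x 4 matrix
print(theta_shape(3, 3))  # (3, 4)
```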
The mathematical definition of the neural network's hypothesis function:
A neural network defines a function h_θ(x) mapping the input x to the output y. These hypothesis functions are parameterized by θ, so changing θ yields different hypothesis functions.
3. Vectorized computation in neural networks

- For convenience of notation and computation, the input x in the input layer can be regarded as the activation of the first layer, i.e., a^(1) = x.
- z denotes the weighted linear combination of the inputs x_1, x_2, x_3 feeding into a particular neuron.
- a_0^(2) is called the bias unit and always equals 1.
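Putting the pieces together, forward propagation computes z^(j+1) = θ^(j) a^(j) and a^(j+1) = g(z^(j+1)) layer by layer. The sketch below uses plain Python lists instead of a matrix library, and the all-zero weight matrices are hypothetical placeholders chosen so the result is easy to verify (sigmoid(0) = 0.5 everywhere).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def matvec(M, v):
    # Plain-Python matrix-vector product
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def forward(x, thetas):
    # a^(1) = x; then for each weight matrix Theta^(j):
    # prepend the bias unit a_0 = 1, compute z^(j+1) = Theta^(j) a^(j),
    # and apply the activation a^(j+1) = g(z^(j+1)).
    a = list(x)
    for theta in thetas:
        a = [1.0] + a  # bias unit
        z = matvec(theta, a)
        a = [sigmoid(z_i) for z_i in z]
    return a  # activations of the output layer, i.e. h_theta(x)

# Hypothetical network: 3 inputs -> 3 hidden units -> 1 output.
# Theta^(1) is 3 x 4 and Theta^(2) is 1 x 4, matching the dimension rule.
theta1 = [[0.0] * 4 for _ in range(3)]
theta2 = [[0.0] * 4]
print(forward([1.0, 2.0, 3.0], [theta1, theta2]))  # [0.5]
```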
4. The relationship between neural networks and logistic regression

- In a neural network, the final portion (the hidden layer feeding into the output layer) is effectively a logistic regression.
- However, the neural network does not feed the original features x_1, x_2, ..., x_n directly into this logistic regression; instead, it uses the learned values a_1, a_2, ..., a_n as its input features.
- In other words, the network does not train a logistic regression on the raw input features x_1, x_2, ..., x_n; it trains one on the learned features a_1, a_2, ..., a_n, which is what allows it to represent complex and varied nonlinear hypothesis functions.
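The point above can be made concrete: the output unit computes exactly the logistic regression hypothesis, only its inputs are hidden-layer activations rather than the raw x. The activation values and output weights below are hypothetical numbers for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic_regression(features, theta):
    # Ordinary logistic regression hypothesis with a bias term theta_0
    f = [1.0] + list(features)
    return sigmoid(sum(t * fi for t, fi in zip(theta, f)))

# Hypothetical hidden-layer activations a_1, a_2, a_3 learned by the network,
# and hypothetical output-layer weights Theta^(2).
a = [0.2, 0.7, 0.9]
theta_out = [-1.0, 0.5, 1.0, 2.0]

# The network's output unit is exactly this logistic regression on a:
# z = -1.0 + 0.5*0.2 + 1.0*0.7 + 2.0*0.9 = 1.6
h = logistic_regression(a, theta_out)
print(h)
```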
If you found this article helpful, a like would be much appreciated.
Follow me, and let's learn and improve together!