The role of each layer in a convolutional neural network
2022-06-29 15:39:00 【Maomaozhen nice】
A complete convolutional neural network model usually contains convolutional layers, pooling layers, fully connected layers, and an output layer. The specific role and meaning of each layer, however, is not always well understood, so I have gathered the relevant material and summarized it briefly here, in the hope that it will be useful to interested readers.
1. Convolutional layer
Each neuron in a convolutional layer's feature map is connected to a local group of neurons in the previous layer's feature maps through a set of filters (weight matrices); this locally connected region is called the receptive field. The neurons inside the receptive field are convolved with the weight matrix, and the result is passed through a nonlinear activation function to produce this layer's feature map, which then serves as input to the next layer. During convolution, all receptive fields on the same feature map share one set of weight matrices; this is called weight sharing. Different feature maps in the same layer use different weight matrices, and the number of feature maps can also be understood as the number of channels. Each set of weight matrices detects a specific feature in the input data, so each feature map indicates where a particular feature of the previous layer occurs. One advantage of local connectivity and weight sharing is that they greatly reduce the number of free parameters in the network, which to some extent prevents overfitting and also reduces storage requirements. This convolutional structure rests on the spatial correlation of image data and the translation invariance of target features: in other words, if a feature appears in one part of an image, it can also appear anywhere else. This explains why neurons at different locations can share one weight matrix to detect image features.
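To make local connectivity and weight sharing concrete, here is a minimal NumPy sketch (not from the original text) of a single-channel "valid" convolution: every output position reuses the same kernel, and each kernel produces one feature map. The function name and the tanh activation are illustrative choices, not a prescribed implementation.

```python
import numpy as np

def conv2d_valid(image, kernels, activation=np.tanh):
    """Slide each kernel over the image ('valid' padding) and apply a nonlinearity.

    image:   (H, W) single-channel input
    kernels: (C, kH, kW) one shared weight matrix per output feature map
    returns: (C, H-kH+1, W-kW+1) stack of feature maps
    """
    C, kH, kW = kernels.shape
    H, W = image.shape
    out = np.zeros((C, H - kH + 1, W - kW + 1))
    for c in range(C):                              # one shared weight matrix per feature map
        for i in range(H - kH + 1):
            for j in range(W - kW + 1):
                patch = image[i:i + kH, j:j + kW]   # the receptive field at this position
                out[c, i, j] = np.sum(patch * kernels[c])
    return activation(out)                          # nonlinear activation yields the feature maps

# Toy example: a 6x6 image, 2 feature maps produced by 3x3 shared kernels
rng = np.random.default_rng(0)
feature_maps = conv2d_valid(rng.standard_normal((6, 6)), rng.standard_normal((2, 3, 3)))
print(feature_maps.shape)  # (2, 4, 4)
```

Note how the parameter count depends only on the kernel sizes (2 × 3 × 3 here), not on the image size, which is exactly the saving that weight sharing provides.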
2. Pooling layer
The role of the pooling layer is to merge similar features detected by the convolutional layer. Each feature map of the convolutional layer is divided into local slices, and the pooling function computes a statistic over each slice. The pooling layer has the same number of feature maps as the convolutional layer before it. The two most common pooling methods are max pooling and average pooling, which take the maximum or the average value of each slice as the corresponding entry of the pooled feature map. Because it reduces spatial resolution, the pooling layer is also called a downsampling layer. Besides reducing the dimensionality of the feature maps, pooling provides invariance to small shifts and distortions of the features. When what matters is the presence of a feature rather than its exact location, this shift invariance is a desirable property.
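The sketch below (an illustrative addition, assuming non-overlapping 2×2 windows) shows max pooling on a single feature map: each window is replaced by its maximum, halving the spatial resolution.

```python
import numpy as np

def max_pool2d(feature_map, size=2, stride=2):
    """Non-overlapping max pooling over one feature map.

    feature_map: (H, W) array; H and W are assumed divisible by `stride`
    returns:     (H//stride, W//stride) downsampled map
    """
    H, W = feature_map.shape
    out = np.zeros((H // stride, W // stride))
    for i in range(0, H - size + 1, stride):
        for j in range(0, W - size + 1, stride):
            out[i // stride, j // stride] = feature_map[i:i + size, j:j + size].max()
    return out

fm = np.arange(16).reshape(4, 4)
print(max_pool2d(fm))
# [[ 5.  7.]
#  [13. 15.]]
```

Shifting the input by one pixel often leaves many of these window maxima unchanged, which is the small-displacement invariance described above.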
3. Fully connected layer
The top of a convolutional neural network usually contains one or more fully connected layers. Unlike a convolutional layer, the neurons of a fully connected layer are arranged in a single vector, and each neuron is connected by a weight to every neuron of the previous layer, forming a fully connected structure. Neither the number of fully connected layers nor the number of neurons in each layer is fixed; usually, the higher the layer, the fewer neurons it contains.
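A fully connected layer is just a matrix-vector product plus a bias and a nonlinearity. The following minimal sketch (names and sizes are illustrative, not from the original text) takes a flattened feature vector and maps it to a smaller layer, matching the "fewer neurons higher up" pattern.

```python
import numpy as np

def fully_connected(x, W, b, activation=np.tanh):
    """One fully connected layer: every input neuron feeds every output neuron.

    x: (n_in,) input vector, e.g. flattened pooled feature maps
    W: (n_out, n_in) weight matrix, b: (n_out,) bias vector
    """
    return activation(W @ x + b)

rng = np.random.default_rng(0)
x = rng.standard_normal(32)                                   # flattened feature maps
h = fully_connected(x, rng.standard_normal((16, 32)), np.zeros(16))
print(h.shape)  # (16,) -- fewer neurons than the layer below
```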
4. Output layer
After multiple layers of feature extraction, the final output layer can be regarded as a classifier that predicts the category of the input sample.
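For a classification network the output layer typically turns the last fully connected layer's scores into class probabilities with a softmax. This short sketch (an added illustration, with made-up scores) shows that step and the predicted class.

```python
import numpy as np

def softmax(z):
    """Convert raw output-layer scores into class probabilities."""
    z = z - z.max()          # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

scores = np.array([2.0, 0.5, -1.0])   # raw scores from the last fully connected layer
probs = softmax(scores)
print(probs, "predicted class:", probs.argmax())
```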
Note: This article is excerpted from Synthetic Aperture Radar Intelligent Interpretation, Xu Fengzhu.