
Group convolution and DW convolution, residuals and inverted residuals, bottleneck and linear bottleneck

2022-07-06 06:26:00 jq_ ninety-eight

Group Convolution

Group convolution is used in ResNeXt.

First of all, it must be clear:
The parameter count of a conventional convolution (Convolution) is:

K*K*C_in*n

where K is the kernel size, C_in is the number of input channels, and n is the number of convolution kernels (i.e., the number of output channels).
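
As a quick check of this formula, here is a minimal PyTorch sketch (the sizes K=3, C_in=64, n=128 are assumed example values; bias is disabled so that only kernel weights are counted):

import torch.nn as nn

# Conventional convolution: K=3, C_in=64, n=128
conv = nn.Conv2d(in_channels=64, out_channels=128, kernel_size=3, bias=False)
print(sum(p.numel() for p in conv.parameters()))  # 73728
print(3 * 3 * 64 * 128)                           # K*K*C_in*n = 73728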

The parameter count of a group convolution is:

K*K*C_in*n*1/g

where K is the kernel size, C_in is the number of input channels, n is the number of convolution kernels (output channels), and g is the number of groups. Each kernel only convolves the C_in/g channels of its own group, which is where the factor of 1/g comes from.
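
Under the same assumed sizes as above, splitting the convolution into g=4 groups divides the parameter count by 4:

import torch.nn as nn

# Group convolution: K=3, C_in=64, n=128, g=4
gconv = nn.Conv2d(64, 128, kernel_size=3, groups=4, bias=False)
print(sum(p.numel() for p in gconv.parameters()))  # 18432
print(3 * 3 * 64 * 128 // 4)                       # K*K*C_in*n/g = 18432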

[Figure: group convolution]

DW (Depthwise Conv) + PW (Pointwise Conv) Convolution

DW convolution is depthwise convolution; the DW + PW combination, known as depthwise separable convolution, is used in MobileNet.

The parameter count of a DW convolution is:

K*K*C_in (here n = C_in)

where K is the kernel size and C_in is the number of input channels. In a DW convolution each kernel convolves exactly one input channel, so the number of kernels equals the number of input channels.
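
In PyTorch a DW convolution is expressed by setting groups equal to the number of input channels (the sizes are again assumed example values):

import torch.nn as nn

# Depthwise convolution: one 3x3 kernel per input channel (groups = C_in = n = 64)
dw = nn.Conv2d(64, 64, kernel_size=3, groups=64, bias=False)
print(sum(p.numel() for p in dw.parameters()))  # 576
print(3 * 3 * 64)                               # K*K*C_in = 576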

The parameter count of a PW convolution is:

1*1*C_in*n

The kernel of a PW convolution is of size 1*1, C_in is the number of input channels, and n is the number of convolution kernels (output channels).
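
Completing the pair, a PW convolution is a plain 1*1 convolution; together, DW + PW replaces the conventional 3*3 convolution from the first sketch at a fraction of the cost:

import torch.nn as nn

# Pointwise convolution: 1x1 kernels mixing 64 input channels into 128 outputs
pw = nn.Conv2d(64, 128, kernel_size=1, bias=False)
print(sum(p.numel() for p in pw.parameters()))  # 8192 = 1*1*64*128

# DW + PW in total: 576 + 8192 = 8768 parameters,
# versus 73728 for the conventional 3x3 convolution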

[Figure: depthwise and pointwise convolution]

Summary

  • The parameter count of group convolution is 1/g that of a conventional convolution (Convolution), where g is the number of groups.
  • The parameter count of DW convolution is 1/n that of a conventional convolution (Convolution), where n is the number of convolution kernels.
  • When g = C_in and n = C_in in a group convolution, the DW convolution and the group convolution are identical (see the worked check below).
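
To check the last point with the formulas above: substituting g = C_in and n = C_in into K*K*C_in*n*1/g gives K*K*C_in*C_in*(1/C_in) = K*K*C_in, exactly the DW count; each group then contains a single channel convolved by a single kernel, which is precisely a depthwise convolution.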

Residuals and Inverted Residuals

[Figure: residual block vs. inverted residual block]
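
Since the figure is not reproduced here, the following minimal PyTorch sketch contrasts the two structures (the channel widths, the expansion factor of 6, and the omission of BatchNorm layers are simplifying assumptions). A residual bottleneck goes wide -> narrow -> wide with ReLU after every layer; an inverted residual goes narrow -> wide -> narrow, uses a DW convolution in the wide middle, and applies no activation after the last layer:

import torch
import torch.nn as nn

class Residual(nn.Module):
    # ResNet-style bottleneck residual: wide -> narrow -> wide (e.g. 256 -> 64 -> 256)
    def __init__(self, c=256, mid=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(c, mid, 1, bias=False), nn.ReLU(inplace=True),               # PW: reduce
            nn.Conv2d(mid, mid, 3, padding=1, bias=False), nn.ReLU(inplace=True),  # 3x3 conv
            nn.Conv2d(mid, c, 1, bias=False),                                      # PW: expand
        )
    def forward(self, x):
        return torch.relu(x + self.body(x))

class InvertedResidual(nn.Module):
    # MobileNet v2-style inverted residual: narrow -> wide -> narrow (e.g. 64 -> 384 -> 64)
    def __init__(self, c=64, expand=6):
        super().__init__()
        mid = c * expand
        self.body = nn.Sequential(
            nn.Conv2d(c, mid, 1, bias=False), nn.ReLU6(inplace=True),              # PW: expand
            nn.Conv2d(mid, mid, 3, padding=1, groups=mid, bias=False), nn.ReLU6(inplace=True),  # DW conv
            nn.Conv2d(mid, c, 1, bias=False),                                      # PW: reduce, linear
        )
    def forward(self, x):
        return x + self.body(x)  # linear bottleneck: no activation on the output

x = torch.randn(1, 256, 32, 32)
print(Residual()(x).shape)          # torch.Size([1, 256, 32, 32])
y = torch.randn(1, 64, 32, 32)
print(InvertedResidual()(y).shape)  # torch.Size([1, 64, 32, 32])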

Bottleneck and Linear Bottleneck

Bottleneck refers to a bottleneck layer. The Bottleneck structure exists mainly to reduce the number of parameters. It consists of three steps: first a PW convolution reduces the channel dimension, then a conventional convolution is applied at the reduced width, and finally a PW convolution raises the dimension back up (the shape resembles an hourglass).

The emphasis here is on the dimension-reducing -> convolution -> dimension-raising structure inside the network, rather than on the shortcut connection.
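
As a worked example (the widths 256 and 64 are assumed values): a 256 -> 64 -> 256 bottleneck with a 3*3 middle convolution costs 1*1*256*64 + 3*3*64*64 + 1*1*64*256 = 16384 + 36864 + 16384 = 69632 parameters, whereas a single conventional 3*3 convolution at the full width of 256 channels would cost 3*3*256*256 = 589824, roughly 8.5 times more.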

Linear Bottleneck: in the Inverted Residual block of MobileNet v2, the last 1*1 convolution layer uses a linear activation function instead of the ReLU activation function (this is the final, activation-free PW convolution in the sketch above); the motivation given in the MobileNet v2 paper is that ReLU loses information when applied to low-dimensional features.

Copyright notice
This article was written by [jq_ ninety-eight]; when reposting, please include a link to the original. Thanks.
https://yzsam.com/2022/02/202202132026144522.html