MXNet Implementation of GoogLeNet (Networks with Parallel Concatenations)
2022-07-04 18:44:00 【Yinque Guangqian】
Paper: Going Deeper with Convolutions
With the publication of new papers in AI, we can see how neural networks have developed: more and more researchers have looked to the structure of the human brain, moving from the early perceptron to fully connected layers (a dense structure), and then to convolutional neural networks (a sparse structure). As networks grow deeper and more complex, the advantages of sparse structure become more pronounced. Why? Because the connections between neurons in the human brain are themselves sparse, much as Hebbian theory describes: "neurons that fire together, wire together." This is quite interesting. In other words, neurons in the brain transmit signals by firing, and if different neurons frequently fire at the same time, the connection between them becomes stronger.
The design of the model in this paper follows a practical intuition: visual information should be processed at several scales and then aggregated, so that the next stage can abstract features from different scales at the same time. The conclusion also points out that approximating the expected optimal sparse structure with readily available dense building blocks is a viable way to improve neural networks for computer vision. That is what GoogLeNet, a "network with parallel concatenations," means: this idea breaks with the earlier practice of simply stacking layers in series and going ever deeper, and instead moves in a sparser direction, which I think is the most important contribution of the paper.
Let's first look at two figures to get an intuitive feel: one shows the Inception block, the other the GoogLeNet model. Because the GoogLeNet model is fairly deep, I kept the figures from getting too large; pay attention to the direction of the arrows I drew and to the color-coded blocks.


The Inception block
As the figure above shows, the core component is the Inception block, which contains four parallel paths. The code implementation is as follows:
import d2lzh as d2l
from mxnet import gluon, init, nd
from mxnet.gluon import nn

# Four parallel paths whose outputs are concatenated along the channel dimension
class Inception(nn.Block):
    def __init__(self, c1, c2, c3, c4, **kwargs):
        super(Inception, self).__init__(**kwargs)
        # Path 1: a single 1x1 convolution
        self.p1 = nn.Conv2D(c1, kernel_size=1, activation='relu')
        # Path 2: 1x1 convolution followed by a 3x3 convolution
        self.p2_1 = nn.Conv2D(c2[0], kernel_size=1, activation='relu')
        self.p2_2 = nn.Conv2D(c2[1], kernel_size=3, padding=1, activation='relu')
        # Path 3: 1x1 convolution followed by a 5x5 convolution
        self.p3_1 = nn.Conv2D(c3[0], kernel_size=1, activation='relu')
        self.p3_2 = nn.Conv2D(c3[1], kernel_size=5, padding=2, activation='relu')
        # Path 4: 3x3 max pooling followed by a 1x1 convolution
        self.p4_1 = nn.MaxPool2D(pool_size=3, strides=1, padding=1)
        self.p4_2 = nn.Conv2D(c4, kernel_size=1, activation='relu')

    def forward(self, x):
        p1 = self.p1(x)
        p2 = self.p2_2(self.p2_1(x))
        p3 = self.p3_2(self.p3_1(x))
        p4 = self.p4_2(self.p4_1(x))
        # Concatenate the four outputs along the channel dimension
        return nd.concat(p1, p2, p3, p4, dim=1)
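As a quick sanity check (a minimal sketch; the dummy input shape here is just for illustration), feeding a random tensor through one Inception block shows that its output channel count is simply the sum of the four paths, e.g. 64 + 128 + 32 + 32 = 256:

# Verify that the Inception block's output channels equal the sum of its four paths
blk = Inception(64, (96, 128), (16, 32), 32)
blk.initialize()
x = nd.random.uniform(shape=(1, 192, 28, 28))  # (batch, channels, height, width), chosen arbitrarily
print(blk(x).shape)  # (1, 256, 28, 28): 64 + 128 + 32 + 32 = 256

Note that every path preserves the spatial size (padding 1 for the 3x3 convolution, padding 2 for the 5x5 convolution, and stride 1 with padding 1 for the max pooling), so only the channel dimension changes.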
Building the full GoogLeNet model

# The full model consists of five blocks connected in sequence
# Block 1: 7x7 convolution followed by 3x3 max pooling
B1 = nn.Sequential()
B1.add(nn.Conv2D(64, kernel_size=7, strides=2, padding=3, activation='relu'),
       nn.MaxPool2D(pool_size=3, strides=2, padding=1))
# Block 2: 1x1 convolution, 3x3 convolution, then max pooling
B2 = nn.Sequential()
B2.add(nn.Conv2D(64, kernel_size=1, activation='relu'),
       nn.Conv2D(192, kernel_size=3, padding=1, activation='relu'),
       nn.MaxPool2D(pool_size=3, strides=2, padding=1))
# Block 3: two Inception blocks
B3 = nn.Sequential()
B3.add(Inception(64, (96, 128), (16, 32), 32),
       Inception(128, (128, 192), (32, 96), 64),  # output channels: 128+192+96+64=480
       nn.MaxPool2D(pool_size=3, strides=2, padding=1))
# Block 4: five Inception blocks
B4 = nn.Sequential()
B4.add(Inception(192, (96, 208), (16, 48), 64),
       Inception(160, (112, 224), (24, 64), 64),
       Inception(128, (128, 256), (24, 64), 64),
       Inception(112, (144, 288), (32, 64), 64),
       Inception(256, (160, 320), (32, 128), 128),
       nn.MaxPool2D(pool_size=3, strides=2, padding=1))
# Block 5: two Inception blocks followed by global average pooling
B5 = nn.Sequential()
B5.add(Inception(256, (160, 320), (32, 128), 128),
       Inception(384, (192, 384), (48, 128), 128),
       nn.GlobalAvgPool2D())
# Chain the five blocks and a final dense layer with 10 outputs
net = nn.Sequential()
net.add(B1, B2, B3, B4, B5, nn.Dense(10))
# Check the output shape of each block with a dummy single-channel 96x96 input
X = nd.random.uniform(shape=(1, 1, 96, 96))
net.initialize()
for layer in net:
    X = layer(X)
    print(layer.name, 'output shape:', X.shape)
'''
sequential0 output shape: (1, 64, 24, 24)
sequential1 output shape: (1, 192, 12, 12)
sequential2 output shape: (1, 480, 6, 6)
sequential3 output shape: (1, 832, 3, 3)
sequential4 output shape: (1, 1024, 1, 1)
dense0 output shape: (1, 10)
'''
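As a small aside, here is a minimal sketch for counting the total number of parameters of the network; it assumes net has already been initialized by the shape check above, and the name total_params is just for illustration:

# Sum the sizes of all parameter arrays collected from the network
total_params = sum(p.data().size for p in net.collect_params().values())
print('total parameters:', total_params)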
Training the model

Limited by the GPU, we again take the Fashion-MNIST dataset as an example. Note that the focus of this exercise is the new ideas in the network architecture, not squeezing out accuracy.
lr, num_epochs, batch_size, ctx = 0.1, 5, 128, d2l.try_gpu()
net.initialize(force_reinit=True, ctx=ctx, init=init.Xavier())
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': lr})
train_iter, test_iter = d2l.load_data_fashion_mnist(batch_size, resize=96)
d2l.train_ch5(net, train_iter, test_iter, batch_size, trainer, ctx, num_epochs)
'''
epoch 1, loss 2.1157, train acc 0.210, test acc 0.511, time 154.6 sec
epoch 2, loss 0.8424, train acc 0.666, test acc 0.782, time 143.6 sec
epoch 3, loss 0.5345, train acc 0.802, test acc 0.847, time 143.9 sec
epoch 4, loss 0.4107, train acc 0.846, test acc 0.870, time 144.0 sec
epoch 5, loss 0.3557, train acc 0.865, test acc 0.875, time 142.4 sec
'''
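As a follow-up, here is a minimal sketch of how one might run the trained network on a single test batch to inspect its predictions; it only uses the net, test_iter, and ctx defined above, and the slice of ten samples is just for illustration:

# Predict labels for one batch from the test iterator and compare with the ground truth
for X, y in test_iter:
    X, y = X.as_in_context(ctx), y.as_in_context(ctx)
    preds = net(X).argmax(axis=1)
    print('true labels:     ', y[:10].astype('int32'))
    print('predicted labels:', preds[:10].astype('int32'))
    break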