[Go through 4] 09-10_Classic network analysis
2022-08-05 05:25:00 【Mosu playing computer】
A morning lesson
I never realized how important setting an alarm is. Yesterday I set one for 7:30 (I love my sleep, 1.5h × 5 cycles). Every morning I get woken up, feel too tired to move, and when the alarm rings I just roll over and keep sleeping. Then an idea hit me: I opened NetEase Cloud Music, 《想去海边》 started playing, and I was instantly awake. Never had such a wonderful start to the day. Music really is a good thing. Then shower, breakfast, the daily LeetCode problem. Today's was a segment tree, which I had never even heard of. I read the solution for 20 minutes, then wrote it myself, cleared it, and wrote it again. (One more round tonight.)
Deep learning courseware
Plan
Finish all the remaining slides. No gaming today; it's gotten boring anyway. Go through the courseware, compare it with the notes I took a few days ago, then watch the videos (to hear how the teacher explains things. Right, I'm caching them locally now, so the flaky network can no longer be my excuse to slack off).
jijidown is caching. I glanced at the comment section of the course videos; everyone is full of praise for the teacher. Now I'm a little excited.
Classic networks
Nice, a group of familiar strangers. Honestly, my next step really should be reading the original papers of all these classic networks (with the experts' walkthroughs to follow along, it shouldn't be too hard, right?).
Going through these classic networks:
* AlexNet
* ZFNet
* VGG
* GoogleNet
* ResNet
AlexNet
The competition
The course introduces the competition, and uses it to lead into AlexNet.
Contributions
(2022-06-30 11:35:01: rewatching the earlier video, I can actually understand it now.)
Looking at the list of contributions, the concepts I learned a few days ago were all first proposed by this line of work: dropout, using ReLU, and so on.
Counting the layers
Here we go again: if the course hadn't spelled it out, I wouldn't even have known what to look up about this bit of common knowledge (how the layer count is defined). So: 8 layers. (Why doesn't a pooling layer count as a layer? In Keras you add one with a single line of code, just like a conv layer. So pooling counts as a layer when convenient and doesn't when it isn't? Outrageous.)
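A quick way to sanity-check the "8 layers" convention, as a minimal sketch assuming torchvision's stock AlexNet (recent versions take `weights=None`; older ones use `pretrained=False`): only layers with learnable weights, the conv and FC layers, get counted, and pooling does not.

```python
import torch.nn as nn
from torchvision.models import alexnet

model = alexnet(weights=None)  # random init; we only care about the structure
convs = [m for m in model.modules() if isinstance(m, nn.Conv2d)]
fcs = [m for m in model.modules() if isinstance(m, nn.Linear)]
pools = [m for m in model.modules() if isinstance(m, nn.MaxPool2d)]
print(len(convs), len(fcs), len(pools))  # 5 conv + 3 FC = 8 "layers"; pools don't count
```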
Detailed analysis of the network structure
The parameter counts and the computation / feature-map sizes are what I can't wrap my head around. It reminds me of learning to compute algorithmic complexity: I fled then, and I wanted to flee this time too.
Convolution shrinks the image. The 227 − 11 part I can understand (the kernel is 11×11), but why the /4 + 1? (The formula is N = (W − F + 2P)/S + 1, where W is the input size, F the kernel size, P the padding (zero here), and S the stride.)
The input is RGB, so 3 channels. Each of the 96 kernels has its weights w plus a bias b. (The small sketch below works out both the output size and the parameter count.)
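A minimal sketch of the two calculations for AlexNet's first conv layer (96 kernels of 11×11 over a 227×227×3 input, stride 4, no padding); the function names here are my own:

```python
def conv_output_size(w, f, p, s):
    """Output size: N = (W - F + 2P) / S + 1."""
    return (w - f + 2 * p) // s + 1

def conv_param_count(f, c_in, n_kernels):
    """Each kernel carries f*f*c_in weights plus 1 bias."""
    return (f * f * c_in + 1) * n_kernels

print(conv_output_size(227, 11, 0, 4))  # 55, so the output is 55x55x96
print(conv_param_count(11, 3, 96))      # 34944 learnable parameters
```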
Pooling layers: reduce the spatial size.
The normalization layer (LRN): suppress the neurons that don't deserve a say. But judging by the teacher's remark, this layer got enthusiastically thrown out later (yes, after 2014 nobody uses it anymore).
(Ugh. I had finally settled into studying when a mosquito appeared in front of my monitor to provoke me. One slap; it flew away..)
More convolution kernels: more capacity for learning primitives.
The benefit of stacking convolutions: the receptive field grows.
Important tricks
(No idea when I'll actually get to use these tricks.) Taking them together: can an ordinary model do all of this? Dropout seems to be a selectable option in sklearn's packages, the flavor of gradient descent should also be selectable, manual gradient clipping is probably only for people chasing extreme scores, and then data augmentation strategies ~~(I think a Bilibili uploader once showed a student-made app that could flip images, crop them, zoom in and out, add different filters and noise; something like that)~~. The last one is model ensembling. (A small augmentation sketch follows below.)
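A minimal sketch of the data-augmentation trick (flip / crop / color jitter / noise) using torchvision.transforms; the specific parameter values are my own illustrative choices, not AlexNet's exact recipe:

```python
import torch
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomResizedCrop(224),      # random crop, then rescale
    transforms.RandomHorizontalFlip(),      # mirror the image
    transforms.ColorJitter(0.4, 0.4, 0.4),  # jitter brightness/contrast/saturation
    transforms.ToTensor(),
    # mild Gaussian noise, hand-rolled since it's not a built-in transform
    transforms.Lambda(lambda x: (x + 0.01 * torch.randn_like(x)).clamp(0, 1)),
])
```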
GPU
I use AutoDL too.
Part of the computation runs on top, part on the bottom, and then the results get merged. But nobody needs to do that nowadays; you just pick one big multi-core GPU. In PyTorch there's the device argument, something like -device "0,1" (not that I've ever had two GPUs; this is hypothetical, since one GPU is already more than enough for my code), so beyond that I have no hands-on experience. What AlexNet did was split the network across two GPUs, with half of the neurons on each GPU.
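A minimal sketch of that modern "0,1" route in PyTorch. Note this is data parallelism (each GPU processes a slice of the batch), not AlexNet's original model parallelism (half the kernels living on each GPU); the tiny model here is just a placeholder:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 96, 11, stride=4), nn.ReLU())
if torch.cuda.device_count() >= 2:
    # replicate the model on GPUs 0 and 1; each batch is split between them
    model = nn.DataParallel(model, device_ids=[0, 1])
model = model.to("cuda" if torch.cuda.is_available() else "cpu")
```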
I didn't understand the teacher's jargon-heavy explanation here. And what is "a card with 3GB of VRAM"? I had no clue. A while back I worked up the courage to learn a bit about computer hardware: what a CPU and a graphics card actually are. I went through a pile of tutorials that never made it clear, then landed on a hardware uploader on Bilibili (天才赵德柱, https://www.bilibili.com/video/BV1k64y1h7QP) and finally got it. But it only stuck for the moment; I never used the knowledge, it has little to do with my daily life, it never comes up, so it slowly faded. Now all I remember is one line: the graphics card is air. hhhhhhh
This also connects to something from last semester's data mining course. The teacher said that back in their day, running a model took several days, and if it failed, tuning the parameters took several more. Compute resources now are so much better than back then.
What the convolution layers are doing
Yesterday's courseware had that feature-visualization figure: the groups of convolution kernels (learning texture? is texture the same thing as structure? whatever).
ZFNet
Compared with the other networks this one is less familiar (I feel like I had never even heard of it).
First, sense the fine, small things; don't come in coarse right from the start.
Bring the resolution down gradually rather than dropping it all at once.
By the deep layers there are already signs of primitives being combined, so give those layers more resources (more kernels) to remember them.
VGG
Contributions
Deeper is better.
A series of small convolution kernels can replace one large kernel (same receptive field, but better).
Compared with AlexNet's input handling, it simply subtracts the mean R, G, B values taken over all pixels, instead of standardizing dimension by dimension.
It suddenly struck me that every innovation gets called (and indeed is) a "contribution". All thanks to the accumulated wisdom and attempts of those who came before.
"Receptive field": the term feels a bit familiar, but I don't really know it. In the early stages a single layer barely owns any of it; it only emerges as the layers stack up.
As for the structure, the layer counts are read off column by column in the table.
Shallow layers get fewer kernels because the primitives there are simple (dots and lines) and the feature maps are still large; with many kernels the computation would blow up.
But visualization shows that the key information is learned in the later layers, so allocate more kernels there; otherwise the network can't remember those concepts.
The advantages of small convolution kernels [considering the receptive field and the number of parameters]
Receptive field again. Had to look it up:
https://blog.csdn.net/weixin_40756000/article/details/117264194
https://zhuanlan.zhihu.com/p/394917827
Definition of the receptive field (Receptive Field): the region of the input image that a pixel on a layer's output feature map maps back to. Put plainly, ==a point on the feature map, measured relative to the original image, is the region of the input image that the CNN feature can "see"==.
This blog also covers the convolution output-size formula, so now I can understand what the teacher wrote (the formula I copied above).
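A small sketch of why stacked small kernels win, under a stride-1 assumption: the receptive field grows to match one big kernel, but the parameter count does not (the channel count c is an illustrative stand-in):

```python
def receptive_field(kernel_sizes):
    """For a stack of stride-1 conv layers, each layer adds (k - 1)."""
    rf = 1
    for k in kernel_sizes:
        rf += k - 1
    return rf

c = 256  # channels in = channels out, for a fair weight comparison
print(receptive_field([3, 3]))     # 5: two 3x3 layers see like one 5x5
print(receptive_field([3, 3, 3]))  # 7: three 3x3 layers see like one 7x7
print(3 * 3 * c * c * 2)           # 1179648 weights for two 3x3 layers
print(5 * 5 * c * c)               # 1638400 weights for one 5x5 layer
```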
#### On the design of the number of convolution kernels
Where the main parameters live: the junction where the last conv layer's output joins the FC layers.
GoogLeNet
Innovations
Not just deeper, but also wider: if each layer can only process the previous layer's information, some loss of information is bound to be irreversible. So try to keep more information around.
For example, it takes the 7×7×512 that an AlexNet-style network would hand to the FC layers and goes straight down to 512.
Where the innovation comes from (looking at the limits of the purely serial structure).
Inception
Still amazed at how structures like this get invented: going from "keep as much information as possible" all the way to a concrete designed structure, the whole 0-to-1 step.
Here it comes:
The 1×1 branch preserves the original spatial information (the size doesn't change: W2 = (W1 − 1)/1 + 1 = W1).
The 3×3 (smaller scale) and 5×5 (larger scale) branches perceive with different receptive fields.
The 3×3 max pooling branch "infects" the neighbors: it stamps the strong features onto nearby positions, reinforcing that information.
Finally everything is merged (+ + + +), keeping all the levels.
Adding the 1×1 convolutions lets you set the depth (kernel count) fed into the next, deeper conv layers.
The second effect is reducing the computational cost.
Both of these amount to compression.
What does "comparison with v1" mean? v1 is exactly this module, built to realize the structure above.
(Inception is the idea; v1 is the implementation for real scenarios.)
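A minimal sketch of an Inception-v1-style block with the four branches described above (1×1; 1×1 then 3×3; 1×1 then 5×5; pool then 1×1); the channel counts are illustrative and ReLUs are omitted for brevity:

```python
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, c_in):
        super().__init__()
        self.b1 = nn.Conv2d(c_in, 64, 1)                 # 1x1: keep spatial info
        self.b2 = nn.Sequential(nn.Conv2d(c_in, 96, 1),  # 1x1 compresses depth...
                                nn.Conv2d(96, 128, 3, padding=1))  # ...before the 3x3
        self.b3 = nn.Sequential(nn.Conv2d(c_in, 16, 1),
                                nn.Conv2d(16, 32, 5, padding=2))
        self.b4 = nn.Sequential(nn.MaxPool2d(3, stride=1, padding=1),
                                nn.Conv2d(c_in, 32, 1))

    def forward(self, x):
        # concatenate along the channel axis: every scale is kept, nothing dropped
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

y = InceptionBlock(192)(torch.randn(1, 192, 28, 28))
print(y.shape)  # torch.Size([1, 256, 28, 28]): 64 + 128 + 32 + 32 channels
```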
Stopped at 11:25. Made lunch, ate, picked up a package, charged the e-bike, came back and lay down for a bit. Then a message in the teacher's group chat: they want to test the project. So I laid out the requirements frankly with the teacher, wrote up a long-winded account plus a flowchart of the current problems, then tried to solve them. Three hours vanished, I was wiped out, and I reported back to the teacher.
The problem: a monitoring algorithm needs to use a database field (addresses), but after the database changes (other addresses get added), the program must restart. I did this once a year ago as an undergrad, the first time I worked for a teacher, with a timed restart (honestly, a-year-ago me wasn't bad; even now a timed task is the easy answer).
Then I tried the server-restart part on the AutoDL compute platform. The while(1) that was supposed to spawn just one blocking thread instead kept spawning new ones like crazy, overloading the CPU to 500% and killing my Xshell/SSH connection outright. With it running in the background and no Xshell, I had no way to shut it down, so I hurriedly stopped the instance. After the reboot the instance monitor still showed 500%. Terrified I had wrecked someone else's platform, I released the instance at once, then messaged the platform's enterprise WeChat to explain. They replied "OK", and my heart settled back down.
After I described the problem, the teacher felt my approach (a timed task) wasn't great: it should monitor the database (a trigger) and then restart. (So the rest of the road is clear: watch the database + restart + keep the server in while(1){} forever.) At least my algorithm-deployment part is logically self-consistent. Nice. The rest is my colleague's app work.
16:10. So tired; work crowds out study time. Sob. I told myself today I'd finish teacher Lu's courseware and watch the videos (the videos are cached and ready now), but I'm still stuck on this one deck of slides (argh!).
It's 17:31 now and I'm in bed on my phone. Went down for the Tmall supermarket delivery (Want Want milk and egg rolls); also used the ¥88 digital-yuan coupon plus ¥14 of my own to buy two ¥102 Yunnan watermelons on the Hema app (a huge loss; on Xiaohongshu I see everyone going to the offline store for the discounted meat and desserts and buying piles of it. Information asymmetry!!). What I got was basically two palm-sized watermelons. ¥102. How did I dare. My girlfriend was talking about data analysis and correlation analysis; it seems this semester did leave a little something in my head. No small feat.
Back in the evening to continue with the courseware, then went out for some air (and to do my duty as the big brother while out). Ate from six to seven-something, scrolled my phone, and suddenly it was 20:00. Back to the slides.
The two auxiliary heads are mainly there to keep the gradient healthy at those points, so it doesn't go to zero by the time it reaches the back of the network.
(The tasks were mentioned back in the texture lecture.)
ResNet
Unbelievable, a new model every year. But fine, getting through this one earns me a hit of dopamine.
Why more layers is not necessarily better.
Contributions
Batch normalization was only proposed in 2015, around when ResNet came out.
The residual structure
Identity mapping: even if a layer does nothing at all, the useful layers aren't dragged down by it; at worst the performance just doesn't improve.
In the backward pass, the +x contributes a +1 term to the gradient, which guarantees it never collapses to zero.
Of course there can still be redundancy, which slows things down.
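A minimal sketch of a residual block, assuming the usual conv-BN-ReLU body; the point is that the output is F(x) + x, so a useless body degrades to the identity, and the backward gradient always carries a +1 term from the x path:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.body(x) + x)  # identity skip: output = F(x) + x

x = torch.randn(1, 64, 56, 56)
print(ResidualBlock(64)(x).shape)  # same shape: torch.Size([1, 64, 56, 56])
```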
Watched 20 more minutes and finished.
2022-06-30 18:29:50: video notes complete.
Played for 1.5h.
Bedtime.
Today was supposed to be a study sprint, and the result is one chapter of PPT. Mostly because the afternoon went to the teacher's project and the running around; by evening my energy couldn't keep up and the studying wouldn't stick. That's it.