2022-07-05 05:05:00 · fb_help
《Dive into Deep Learning》 Learning Notes
Data manipulation
Automatic differentiation
Setting a variable to require gradients
x = torch.ones(2, 2, requires_grad=True)
Computing the gradient
out.backward()
Here the dependent variable must be a scalar. If it is a tensor, you must supply a weight tensor of the same shape as the dependent variable; the output is reduced to a scalar by a weighted sum over all of its elements, and then backward() can be called.
The reason is easy to understand: the elements of the dependent variable have no relationship to one another, so they can simply be arranged as a one-dimensional vector and linearly weighted into a scalar l. The benefit is that the gradient is then independent of the dimensionality of the dependent variable: we get the gradient of l with respect to the independent variables no matter what shape the dependent tensor has.
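A quick illustration of this (a minimal sketch, not code from the original notes): calling backward() on a non-scalar tensor requires a weight tensor of the same shape, which is equivalent to first reducing the output to a scalar by a weighted sum.

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
y = 2 * x              # y is a 2x2 tensor, not a scalar

# backward() on a non-scalar needs a weight tensor of the same shape;
# this is equivalent to calling (y * w).sum().backward()
w = torch.ones(2, 2)
y.backward(w)

print(x.grad)          # d(sum(2*x))/dx = 2 for every element
```

With unit weights, every element of x.grad is 2, regardless of the 2x2 shape of y.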
Linear regression
Solving for the parameters with gradient descent
The formula above has a closed-form (analytical) solution. The parameters can also be optimized by gradient descent. Mini-batch stochastic gradient descent (SGD) is typically used: compute the average gradient over a small batch, then update the parameters batch after batch.
The gradient is also multiplied by a learning rate, which plays the role of a step size; it can be a little larger at first and should shrink later on.
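The update rule can be sketched in plain Python (an illustrative toy, not the book's implementation): each step averages the gradient over a mini-batch of 10 points and moves the parameters by learning rate times that gradient.

```python
import random

def sgd_step(w, b, batch, lr):
    """One mini-batch SGD step for the model y = w*x + b with squared loss."""
    n = len(batch)
    grad_w = sum(2 * (w * x + b - y) * x for x, y in batch) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in batch) / n
    return w - lr * grad_w, b - lr * grad_b   # the learning rate is the step size

# Synthetic, noise-free data from the "true" line y = 3x + 1
data = [(i / 10, 3 * (i / 10) + 1) for i in range(-50, 50)]

w, b = 0.0, 0.0
for epoch in range(300):
    random.shuffle(data)                       # the "stochastic" part
    for i in range(0, len(data), 10):          # mini-batches of 10
        w, b = sgd_step(w, b, data[i:i + 10], lr=0.02)

print(w, b)   # converges toward w = 3, b = 1
```

Because the data is noise-free, the gradient vanishes exactly at the true parameters, so the iterates settle on them.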
Fully connected layer (dense layer)
Linear regression from scratch
- Build simulated data; define the inputs (features), outputs (labels), and parameters
- Write a data loader (split the data into batches)
- Define the model (net), the loss, and the optimization method
- Iterate over epochs to solve
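Put together, the four steps above can be sketched in PyTorch as follows (an illustrative from-scratch version; the true parameters [2, -3.4] and 4.2, the batch size, and the learning rate are assumptions for the demo):

```python
import torch

# 1. Simulated data: labels = features @ true_w + true_b + noise
true_w, true_b = torch.tensor([2.0, -3.4]), 4.2
features = torch.randn(1000, 2)
labels = features @ true_w + true_b + 0.01 * torch.randn(1000)

# 2. Data loader: yield shuffled mini-batches
def data_iter(batch_size, features, labels):
    idx = torch.randperm(len(features))
    for i in range(0, len(features), batch_size):
        j = idx[i:i + batch_size]
        yield features[j], labels[j]

# 3. Model (net), loss, and optimization method (manual SGD)
w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
net = lambda X: X @ w + b
loss = lambda y_hat, y: ((y_hat - y) ** 2 / 2).mean()

# 4. Iterate over epochs
lr = 0.03
for epoch in range(5):
    for X, y in data_iter(10, features, labels):
        l = loss(net(X), y)
        l.backward()
        with torch.no_grad():                 # update without tracking gradients
            w -= lr * w.grad
            b -= lr * b.grad
            w.grad.zero_()
            b.grad.zero_()

print(w.detach(), b.detach())   # approaches true_w and true_b
```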
Linear regression, concise version
Data loading
To define your own module, you specify its parameters and a forward function, which is simply the functional relationship between input and output. In other words, defining a model means giving the computation from input to output, i.e. the forward function. A neural network replaces this functional relationship with a network structure.
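For example (a minimal sketch using the standard torch.nn API): a custom module declares its parameters in __init__ and its input-to-output computation in forward.

```python
import torch
from torch import nn

class LinearNet(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        # the module's parameters live in the layers declared here
        self.linear = nn.Linear(n_features, 1)

    def forward(self, x):
        # forward() is the functional relationship from input to output
        return self.linear(x)

net = LinearNet(2)
out = net(torch.randn(4, 2))   # calling net(x) invokes forward(x)
print(out.shape)               # torch.Size([4, 1])
```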
Using torch's built-in net structure
Loss
Optimization method
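The concise version wires these three pieces together with torch's built-ins. A minimal sketch (the synthetic data, learning rate, and epoch count are assumptions for illustration):

```python
import torch
from torch import nn

# net structure, loss, and optimization method all come from torch
net = nn.Sequential(nn.Linear(2, 1))
loss = nn.MSELoss()
trainer = torch.optim.SGD(net.parameters(), lr=0.03)

# synthetic data from true_w = [2, -3.4], true_b = 4.2
features = torch.randn(1000, 2)
labels = (features @ torch.tensor([2.0, -3.4]) + 4.2).unsqueeze(1)

for epoch in range(3):
    for i in range(0, len(features), 10):      # mini-batches of 10
        X, y = features[i:i + 10], labels[i:i + 10]
        trainer.zero_grad()
        l = loss(net(X), y)
        l.backward()
        trainer.step()

print(net[0].weight.data, net[0].bias.data)    # near [2, -3.4] and 4.2
```

Compared with the from-scratch version, the model, loss, and update rule are each replaced by one library call.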