Learning Notes on "Dive into Deep Learning"
2022-07-05 05:05:00 【fb_ help】
Data manipulation

Automatic differentiation
Setting a variable to track gradients:
x = torch.ones(2, 2, requires_grad=True)
Computing the gradient:
out.backward()
Here the dependent variable is required to be a scalar. If it is a tensor, you must provide a weight tensor of the same shape as the dependent variable; all elements of the output are reduced to a scalar by a weighted sum, after which backward() can be called.
The reason is easy to understand: the output elements are independent of one another, so they can simply be flattened into a one-dimensional vector and linearly weighted into a scalar l. The benefit is that the gradient no longer depends on the dimensionality of the dependent variable: we obtain the gradient of l with respect to the independent variables, regardless of the shape of the output tensor.
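A minimal sketch of both cases, assuming PyTorch is installed (tensor shapes and values here are illustrative):

```python
import torch

# Scalar output: backward() needs no arguments.
x = torch.ones(2, 2, requires_grad=True)
out = (x * x).sum()          # out is a scalar
out.backward()
print(x.grad)                # d(out)/dx = 2x, i.e. all 2s

# Tensor output: supply a weight tensor with the same shape as y.
x2 = torch.ones(3, requires_grad=True)
y = x2 * 2                   # y is a vector, not a scalar
w = torch.tensor([1.0, 1.0, 1.0])
y.backward(w)                # equivalent to (y * w).sum().backward()
print(x2.grad)               # all 2s
```

With unit weights, `y.backward(w)` gives the same gradient as summing `y` to a scalar first, which is exactly the weighted-sum trick described above.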
Linear regression
Solving for the parameters with gradient descent
The objective above can be solved analytically (linear regression has a closed-form solution).
The parameters can also be solved for by gradient descent. Mini-batch stochastic gradient descent (mini-batch SGD) is commonly used: the gradient is averaged over a small batch, and the parameters are updated over many batches.
The update is also multiplied by a learning rate, which plays the role of a step size; it can be somewhat larger at first and should be made smaller later.
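The update rule can be sketched as follows. This assumes the convention where the loss is summed over the batch, so the gradient is divided by the batch size to get the average; the function name `sgd` is our own:

```python
import torch

def sgd(params, lr, batch_size):
    """Mini-batch SGD: step each parameter against its averaged gradient."""
    with torch.no_grad():                  # updates must not be tracked by autograd
        for p in params:
            p -= lr * p.grad / batch_size  # learning rate * average gradient
            p.grad.zero_()                 # clear the gradient for the next batch
```

For example, a parameter at 1.0 with an accumulated gradient of 10.0 over a batch of 10 and lr = 0.1 moves to 1.0 - 0.1 * (10.0 / 10) = 0.9.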
Fully connected layer (dense layer)


Linear regression from scratch
- Build simulated data; define the inputs (features), outputs (labels), and parameters
- Write a data loader (split the data into batches)
- Build the model function (net), the loss, and the optimization method
- Iterate over epochs to solve
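The four steps above can be sketched end to end. The true weights, noise level, learning rate, and epoch count below are illustrative choices, not values from the original notes:

```python
import torch

# 1. Simulated data: y = Xw + b + noise (true_w and true_b chosen for illustration).
true_w, true_b = torch.tensor([2.0, -3.4]), 4.2
features = torch.randn(1000, 2)
labels = features @ true_w + true_b + 0.01 * torch.randn(1000)

# 2. Data loader: shuffle, then yield mini-batches.
def data_iter(batch_size, features, labels):
    idx = torch.randperm(len(features))
    for i in range(0, len(features), batch_size):
        j = idx[i:i + batch_size]
        yield features[j], labels[j]

# 3. Model, loss, and parameters.
w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
def net(X):
    return X @ w + b
def squared_loss(y_hat, y):
    return ((y_hat - y) ** 2 / 2).mean()

# 4. Iterate over epochs, updating with manual mini-batch SGD.
lr, num_epochs = 0.03, 3
for epoch in range(num_epochs):
    for X, y in data_iter(10, features, labels):
        loss = squared_loss(net(X), y)
        loss.backward()
        with torch.no_grad():
            w -= lr * w.grad; w.grad.zero_()
            b -= lr * b.grad; b.grad.zero_()
```

After a few epochs the learned (w, b) should be close to (true_w, true_b), since the data is nearly noise-free.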

Linear regression: concise implementation
Data loading

When defining your own network, you need to specify its parameters and a forward function. A network is really just a functional relationship between input and output, so you provide the computation from input to output, i.e., the forward function. A neural network replaces this functional relationship with a network structure.
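A minimal sketch of a custom module: parameters are declared in `__init__`, and the input-to-output computation goes in `forward()` (the class name `LinearNet` is our own):

```python
import torch
from torch import nn

class LinearNet(nn.Module):
    """A custom module: parameters in __init__, computation in forward()."""
    def __init__(self, n_features):
        super().__init__()
        self.linear = nn.Linear(n_features, 1)  # holds the weight and bias

    def forward(self, x):
        # The functional relationship between input and output.
        return self.linear(x)

net = LinearNet(2)
y = net(torch.randn(4, 2))   # calling the module invokes forward() automatically
```

Note that you call `net(x)`, not `net.forward(x)` directly; `nn.Module.__call__` dispatches to `forward()`.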

Using torch's built-in network structures
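Instead of writing a module by hand, built-in layers can be composed with `nn.Sequential`; for linear regression a single `Linear` layer (here 2 inputs, 1 output, chosen for illustration) is enough:

```python
import torch
from torch import nn

# A one-layer network: 2 input features -> 1 output.
net = nn.Sequential(nn.Linear(2, 1))
out = net(torch.randn(4, 2))
print(out.shape)   # torch.Size([4, 1])
```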

Loss
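For the loss, torch provides `nn.MSELoss`, which averages the squared error over all elements. A small check with hand-picked values:

```python
import torch
from torch import nn

loss = nn.MSELoss()                      # mean squared error
y_hat = torch.tensor([1.0, 2.0])
y = torch.tensor([1.5, 2.5])
print(loss(y_hat, y))                    # mean(0.5**2, 0.5**2) = 0.25
```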

Optimization method
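The optimizer comes from `torch.optim`; a single training step with SGD looks like this (the learning rate and random batch are illustrative):

```python
import torch
from torch import nn

net = nn.Sequential(nn.Linear(2, 1))
trainer = torch.optim.SGD(net.parameters(), lr=0.03)

before = net[0].weight.detach().clone()  # snapshot to observe the update
X, y = torch.randn(10, 2), torch.randn(10, 1)

l = nn.MSELoss()(net(X), y)
trainer.zero_grad()   # clear old gradients
l.backward()          # compute new gradients
trainer.step()        # apply the SGD update
```

`zero_grad` / `backward` / `step` replaces the manual parameter update of the from-scratch version.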
