Learning Notes on "Dive into Deep Learning"
2022-07-05 05:05:00 【fb_ help】
Data manipulation
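The basics here are tensor creation and elementwise operations; a minimal sketch (the particular operations shown are illustrative, not from the original notes):

import torch

x = torch.arange(12)   # a 1-D tensor holding 0..11
X = x.reshape(3, 4)    # change the shape without changing the values
Y = torch.ones(3, 4)
print(X + Y, X * Y)    # elementwise arithmetic
print(X.sum())         # reduce all elements to a scalar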

Automatic differentiation
Setting requires_grad on a tensor so that gradients will be tracked:

import torch

x = torch.ones(2, 2, requires_grad=True)
Obtaining the gradient by backpropagating from the output:

out.backward()
Here backward() requires the dependent variable (the output) to be a scalar. If the output is a tensor, you must supply a weight tensor of the same shape as the output; the output is reduced to a scalar by a weighted sum over all its elements, and only then can backward() run.
The reason is easy to understand: the output elements have no relationship to one another, so they can simply be laid out as a one-dimensional vector and linearly weighted into a scalar l. The benefit is that the gradient becomes independent of the dimensionality of the output: we get the gradient of l with respect to the inputs, and it no longer matters what shape of tensor the output was.
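As a minimal sketch of both cases (the tensors y and w below are illustrative, not from the original notes):

import torch

x = torch.ones(2, 2, requires_grad=True)
y = x * 2                    # y is a 2x2 tensor, not a scalar
w = torch.ones_like(y) / 4   # weight tensor with the same shape as y
y.backward(w)                # same as (y * w).sum().backward()
print(x.grad)                # every entry is 2 * 1/4 = 0.5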
Linear regression
Solving for the parameters with gradient descent
The linear regression problem above can also be solved analytically.
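For reference, the standard closed-form (least-squares) solution under squared loss, with design matrix $\mathbf{X}$ and targets $\mathbf{y}$, is

$\mathbf{w}^{*} = (\mathbf{X}^{\top}\mathbf{X})^{-1}\mathbf{X}^{\top}\mathbf{y}$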
The parameters can also be solved for by optimizing with gradient descent. In practice, mini-batch stochastic gradient descent (mini-batch SGD) is usually used: compute the average gradient over a small batch of examples, then update the parameters batch after batch.
The gradient is also multiplied by a learning rate, which plays the role of a step size; it can be somewhat larger at the beginning and should be made smaller later on.
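In symbols, one mini-batch SGD step with learning rate $\eta$ and mini-batch $\mathcal{B}$ updates the parameters as

$(\mathbf{w}, b) \leftarrow (\mathbf{w}, b) - \frac{\eta}{|\mathcal{B}|} \sum_{i \in \mathcal{B}} \partial_{(\mathbf{w}, b)} \ell^{(i)}(\mathbf{w}, b)$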
Fully connected layer (dense layer)
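As a minimal illustration (not from the original notes), a fully connected layer in PyTorch is nn.Linear; for two input features and one output:

from torch import nn

layer = nn.Linear(2, 1)    # 2 inputs, 1 output: a dense layer
print(layer.weight.shape)  # torch.Size([1, 2])
print(layer.bias.shape)    # torch.Size([1])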


Linear regression from scratch
- Build synthetic data; define the inputs (features), the outputs (labels), and the parameters
- Write a data loader (splits the data into batches)
- Construct the model function (net), the loss, and the optimization method
- Iterate over epochs to solve for the parameters (see the sketch after this list)
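A compact sketch of these four steps, with illustrative choices for the true parameters, batch size, and learning rate:

import random
import torch

# 1. Synthetic data: y = X w + b + noise
true_w = torch.tensor([2.0, -3.4])
true_b = 4.2
features = torch.normal(0, 1, (1000, 2))
labels = features @ true_w + true_b + torch.normal(0, 0.01, (1000,))

# 2. Data loader: yield shuffled mini-batches
def data_iter(batch_size, features, labels):
    indices = list(range(len(features)))
    random.shuffle(indices)
    for i in range(0, len(indices), batch_size):
        batch = torch.tensor(indices[i:i + batch_size])
        yield features[batch], labels[batch]

# 3. Model (net), loss, and optimization method
w = torch.normal(0, 0.01, (2,), requires_grad=True)
b = torch.zeros(1, requires_grad=True)

def net(X):
    return X @ w + b

def squared_loss(y_hat, y):
    return (y_hat - y) ** 2 / 2

def sgd(params, lr, batch_size):
    with torch.no_grad():
        for param in params:
            param -= lr * param.grad / batch_size  # loss below is summed per batch
            param.grad.zero_()

# 4. Iterate over epochs
lr, num_epochs, batch_size = 0.03, 3, 10
for epoch in range(num_epochs):
    for X, y in data_iter(batch_size, features, labels):
        squared_loss(net(X), y).sum().backward()
        sgd([w, b], lr, batch_size)
    print(f'epoch {epoch + 1}, w = {w.detach()}, b = {b.item():.3f}')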

Linear regression: concise version
Data loading
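With torch.utils.data, loading batches takes a few lines (a minimal sketch, reusing the features and labels tensors from the previous section):

from torch.utils import data

batch_size = 10
dataset = data.TensorDataset(features, labels)
data_iter = data.DataLoader(dataset, batch_size, shuffle=True)
X, y = next(iter(data_iter))  # fetch one mini-batch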

To define your own model, you need to give its parameters and a forward function. A model is really just the functional relationship between input and output, so you specify how the output is computed from the input, i.e. the forward function. A neural network replaces this functional relationship with a network structure.
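A minimal sketch of a custom module for linear regression (the class name LinearNet is illustrative):

from torch import nn

class LinearNet(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        self.linear = nn.Linear(n_features, 1)

    def forward(self, x):
        # forward defines how the output is computed from the input
        return self.linear(x)

net = LinearNet(2)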

Using torch's built-in net structures
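Equivalently, the same model can be assembled with nn.Sequential (a minimal sketch, with illustrative initialization values):

from torch import nn

net = nn.Sequential(nn.Linear(2, 1))
net[0].weight.data.normal_(0, 0.01)  # initialize the layer's weights
net[0].bias.data.fill_(0)            # and its bias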

Loss
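For linear regression the squared loss is standard; in PyTorch it is available as nn.MSELoss (a minimal sketch):

loss = nn.MSELoss()  # mean of (y_hat - y)^2 over the batch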

Optimization method
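Finally, the optimizer comes from torch.optim, and the concise training loop ties everything together (a sketch assuming the net, loss, data_iter, features, and labels defined above; the learning rate and epoch count are illustrative):

import torch

trainer = torch.optim.SGD(net.parameters(), lr=0.03)

num_epochs = 3
for epoch in range(num_epochs):
    for X, y in data_iter:
        l = loss(net(X), y.reshape(-1, 1))  # MSELoss needs matching shapes
        trainer.zero_grad()
        l.backward()
        trainer.step()
    print(f'epoch {epoch + 1}, '
          f'loss {loss(net(features), labels.reshape(-1, 1)):f}')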
