Learning notes on "Dive into Deep Learning" (《动手学深度学习》)
2022-07-05 05:05:00 【fb_ help】
Data manipulation
Automatic differentiation
Setting a variable to require gradients
import torch
x = torch.ones(2, 2, requires_grad=True)
Computing the gradient
out.backward()
Here the output (the dependent variable) is required to be a scalar. If it is a tensor, you must provide a weight tensor of the same shape; the output is then reduced to a scalar by a weighted sum of all its elements, after which backward() can be called.
The reason is easy to understand: the elements of the output tensor have no relationship to one another, so they can simply be arranged as a one-dimensional vector and linearly weighted into a scalar l. The benefit is that the gradient becomes independent of the dimensions of the output: once you have the gradient of l with respect to the inputs, it no longer matters what shape of tensor the output is.
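A minimal runnable sketch of both cases; the function `out = (3 * x * x).sum()` is an assumed example, not taken from the notes:

```python
import torch

x = torch.ones(2, 2, requires_grad=True)

# Scalar output: backward() needs no extra argument.
out = (3 * x * x).sum()
out.backward()
# x.grad now holds d(out)/dx = 6 * x, i.e. every entry is 6.

# Tensor output: pass a weight tensor of the same shape; autograd then
# differentiates the weighted sum (y * w).sum() instead.
x.grad.zero_()
y = 3 * x * x
w = torch.ones_like(y)
y.backward(gradient=w)
# With unit weights this matches the scalar case: x.grad is again all 6.
```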
Linear regression
Solving for the parameters with gradient descent
Linear regression has a closed-form analytical solution.
The parameters can also be found by gradient descent. Mini-batch stochastic gradient descent (mini-batch SGD) is usually used: compute the average gradient over a small batch, then update the parameters over many batches.
The gradient is also multiplied by a learning rate, which acts as a step size; it can be somewhat larger at first and should be made smaller later.
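The update rule can be sketched as follows; `sgd` is an illustrative helper (assuming the loss was summed over the batch, hence the division by the batch size to recover the average gradient):

```python
import torch

def sgd(params, lr, batch_size):
    """Mini-batch SGD step: move each parameter against its gradient.

    Assumes the loss was summed over the batch, so the stored gradient
    is divided by batch_size to get the average gradient.
    """
    with torch.no_grad():
        for param in params:
            param -= lr * param.grad / batch_size
            param.grad.zero_()
```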
Fully connected layer (dense layer)
Linear regression from scratch
- Build simulated data; define the inputs (features), outputs (labels), and parameters
- Write a data loader (split the data into batches)
- Construct the model function (net), the loss, and the optimization method
- Iterate over epochs to solve for the parameters
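Putting the four steps above together, a minimal from-scratch sketch; all names and values here are illustrative, not taken from the notes:

```python
import torch

# Step 1: simulated data y = X @ true_w + true_b + noise, plus parameters.
true_w, true_b = torch.tensor([2.0, -3.4]), 4.2
X = torch.normal(0, 1, (1000, 2))
y = X @ true_w + true_b + torch.normal(0, 0.01, (1000,))

w = torch.normal(0, 0.01, (2,), requires_grad=True)
b = torch.zeros(1, requires_grad=True)

def net(X):                      # model: the input -> output relation
    return X @ w + b

def squared_loss(y_hat, y):      # per-example squared loss
    return (y_hat - y) ** 2 / 2

# Steps 2-4: batch the data, then iterate epochs of mini-batch SGD.
lr, batch_size = 0.03, 10
for epoch in range(3):
    for i in range(0, len(X), batch_size):
        Xb, yb = X[i:i + batch_size], y[i:i + batch_size]
        loss = squared_loss(net(Xb), yb).sum()
        loss.backward()
        with torch.no_grad():
            for param in (w, b):
                param -= lr * param.grad / batch_size
                param.grad.zero_()
```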
Linear regression: concise implementation
Data loading
When you define your own module, you need to specify its parameters and a forward function. The forward function is in fact the functional relationship between input and output: it gives the rule for computing the output from the input. A neural network replaces this functional relationship with a network structure.
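A sketch of such a custom module; `LinearNet` and its sizes are illustrative names, not from the notes:

```python
import torch
from torch import nn

class LinearNet(nn.Module):
    """Custom module: declare the parameters (here via an inner
    nn.Linear) and define forward(), the input -> output relation."""

    def __init__(self, n_features):
        super().__init__()
        self.linear = nn.Linear(n_features, 1)

    def forward(self, x):
        # The "functional relationship" between input and output.
        return self.linear(x)

net = LinearNet(2)
out = net(torch.ones(1, 2))  # shape: (1, 1)
```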
Using torch's built-in net structure
Loss
Optimization method
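The concise version wires these pieces together with torch's built-in modules; the data and hyperparameters below are illustrative:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Simulated data (same setup as the from-scratch version, in column form).
true_w, true_b = torch.tensor([[2.0], [-3.4]]), 4.2
X = torch.normal(0, 1, (1000, 2))
y = X @ true_w + true_b + torch.normal(0, 0.01, (1000, 1))

# Data loading: split into shuffled mini-batches.
loader = DataLoader(TensorDataset(X, y), batch_size=10, shuffle=True)

net = nn.Linear(2, 1)                             # fully connected (dense) layer
loss_fn = nn.MSELoss()                            # loss
optimizer = torch.optim.SGD(net.parameters(), lr=0.03)  # optimization method

for epoch in range(3):
    for Xb, yb in loader:
        optimizer.zero_grad()
        loss = loss_fn(net(Xb), yb)
        loss.backward()
        optimizer.step()
```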