Introduction Notes to PyTorch Deep Learning (XII): Neural Networks - Nonlinear Activation
2022-06-30 07:35:00 【Snow fish】
Course notes, course link.
The learning notes are also posted on my personal website; feel free to check them out there.
1. Introduction to common nonlinear activation functions
The purpose of nonlinear activation is to introduce nonlinearity into the neural network.
As before, open the official documentation; the commonly used function is nn.ReLU:
1.1 ReLU

ReLU computes max(0, x) element-wise; the corresponding function plot is:
Parameter: with inplace=True, the input tensor itself is overwritten with the activated result; with inplace=False (the default), the input is left unchanged and a new output tensor is returned.
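As a minimal sketch of the difference (the values here are made up purely for illustration):

import torch
from torch import nn

x = torch.tensor([1.0, -2.0])

# inplace=False (default): x keeps its original values
out = nn.ReLU(inplace=False)(x)
print(x)    # tensor([ 1., -2.])
print(out)  # tensor([1., 0.])

# inplace=True: x itself is overwritten with the activated result
out = nn.ReLU(inplace=True)(x)
print(x)    # tensor([1., 0.])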
Sample code:

import torch
from torch import nn
from torch.nn import ReLU

# a 2x2 tensor that contains negative values
input = torch.tensor([[1, -0.5],
                      [-1, 3]])
# reshape to (batch_size, channels, height, width)
input = torch.reshape(input, (-1, 1, 2, 2))


class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.relu1 = ReLU()

    def forward(self, input):
        output = self.relu1(input)
        return output


net1 = Net()
output = net1(input)
print(output)
Output:
You can see that the input
[[ 1, -0.5],
 [-1,  3 ]]
is mapped to the output
[[1, 0],
 [0, 3]]
When the input passes through ReLU, the values -0.5 and -1 are both less than 0, so they are truncated to 0, while the positive entries pass through unchanged.
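The same element-wise operation is also available without defining a module, through the functional interface torch.nn.functional.relu (a quick sketch for comparison):

import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, -0.5],
                  [-1.0, 3.0]])
print(F.relu(x))  # negative entries are zeroed, same result as nn.ReLU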
1.2 Sigmoid

Sigmoid is defined as Sigmoid(x) = 1 / (1 + exp(-x)), which maps inputs into (0, 1); its function plot:
Sample code:

import torch
import torchvision
from torch import nn
from torch.nn import ReLU, Sigmoid
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter

# CIFAR10 test set, converted to tensors, loaded in batches of 64
dataset = torchvision.datasets.CIFAR10("./dataset", train=False, download=True,
                                       transform=torchvision.transforms.ToTensor())
dataloader = DataLoader(dataset, batch_size=64)


class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.sigm = Sigmoid()

    def forward(self, input):
        output = self.sigm(input)
        return output


net1 = Net()

# write each batch of original and sigmoid-activated images to TensorBoard
writer = SummaryWriter("logs")
step = 0
for data in dataloader:
    imgs, targets = data
    writer.add_images("input", imgs, global_step=step)
    output = net1(imgs)
    writer.add_images("output", output, step)
    step += 1
writer.close()
View the input and output images with TensorBoard (e.g. by running tensorboard --logdir=logs in a terminal):


As you can see, the main purpose of nonlinear activation is to add nonlinearity to the network so that it can be trained into a model with the required generalization ability.
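As a rough sketch of where such activations usually sit in a network, here is a toy model that interleaves them with linear layers (the layer sizes are made up for illustration and are not from the course):

import torch
from torch import nn

# without the ReLU/Sigmoid in between, the two Linear layers
# would collapse into a single linear map
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
    nn.Sigmoid(),
)

x = torch.rand(4, 3, 32, 32)   # a fake batch of CIFAR10-sized images
print(model(x).shape)          # torch.Size([4, 10])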