PyTorch deep learning introduction notes (XII): neural networks - nonlinear activation
2022-06-30 07:35:00 【Snow fish】
Course notes, course link.
These study notes are also posted synchronously on my personal website; feel free to read them there.
1. Common nonlinear activation functions
The purpose of nonlinear activation is to introduce nonlinearity into the neural network.
As before, open the official documentation.
The most commonly used activation is nn.ReLU:
1.1 ReLU

The corresponding function graph:
Parameter: when inplace=True, the input tensor itself is overwritten with the activation result; when inplace=False (the default), the input is left unchanged and a new tensor holding the result is returned.
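A minimal sketch of the difference (assuming a plain tensor that does not require gradients):

```python
import torch
from torch import nn

x = torch.tensor([1.0, -0.5, -1.0, 3.0])

# inplace=False (the default): x is untouched, a new tensor is returned
y = nn.ReLU(inplace=False)(x)
print(x)  # tensor([ 1.0000, -0.5000, -1.0000,  3.0000])
print(y)  # tensor([1., 0., 0., 3.])

# inplace=True: x itself is overwritten with the activation result
nn.ReLU(inplace=True)(x)
print(x)  # tensor([1., 0., 0., 3.])
```

inplace=True saves a little memory by not allocating a new tensor, but it destroys the original values, so use it with care.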
Sample code:

```python
import torch
from torch import nn
from torch.nn import ReLU

input = torch.tensor([[1, -0.5],
                      [-1, 3]])
# add batch and channel dimensions: (2, 2) -> (1, 1, 2, 2)
input = torch.reshape(input, (-1, 1, 2, 2))

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.relu1 = ReLU()

    def forward(self, input):
        output = self.relu1(input)
        return output

net1 = Net()
output = net1(input)
print(output)
```
Output: you can see that the input

[[1, -0.5],
 [-1, 3]]

becomes

[[1, 0],
 [0, 3]]

ReLU truncates negative inputs: -0.5 and -1 are both less than 0, so the corresponding outputs are 0.
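Equivalently, ReLU computes max(0, x) element-wise; a quick check using the functional form torch.relu instead of the module:

```python
import torch

x = torch.tensor([[1.0, -0.5],
                  [-1.0, 3.0]])

# torch.relu and clamping at zero both compute max(0, x) element-wise
assert torch.equal(torch.relu(x), torch.clamp(x, min=0))
print(torch.relu(x))  # the negative entries become 0
```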
1.2 Sigmoid

The function graph:
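Sigmoid squashes any input into (0, 1) via Sigmoid(x) = 1 / (1 + e^(-x)); a small numeric sanity check (input values chosen arbitrarily):

```python
import torch

x = torch.tensor([0.0, 2.0, -2.0])
manual = 1 / (1 + torch.exp(-x))  # Sigmoid(x) = 1 / (1 + e^(-x))
assert torch.allclose(torch.sigmoid(x), manual)
print(torch.sigmoid(x))  # sigmoid(0) is exactly 0.5; all outputs lie in (0, 1)
```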
Sample code:

```python
import torch
import torchvision
from torch import nn
from torch.nn import Sigmoid
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter

dataset = torchvision.datasets.CIFAR10("./dataset", train=False, download=True,
                                       transform=torchvision.transforms.ToTensor())
dataloader = DataLoader(dataset, batch_size=64)

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.sigm = Sigmoid()

    def forward(self, input):
        output = self.sigm(input)
        return output

net1 = Net()
writer = SummaryWriter("logs")
step = 0
for data in dataloader:
    imgs, targets = data
    writer.add_images("input", imgs, global_step=step)
    output = net1(imgs)
    writer.add_images("output", output, global_step=step)
    step += 1
writer.close()
```
View the output with TensorBoard (run `tensorboard --logdir=logs` and open the address it prints):


As these results show, the main purpose of nonlinear activation is to add nonlinear characteristics to the network, so that it can be trained into a model that generalizes as required.