Neural network - nonlinear activation
2022-07-01 04:45:00 【booze-J】
Summary:
Nonlinear activation introduces nonlinearity into a neural network. Two commonly used nonlinear activation functions are Sigmoid and ReLU.
ReLU
Sample code :
import torch
from torch import nn
from torch.nn import ReLU

input = torch.tensor([[1, -0.5],
                      [-1, 3]])
input = torch.reshape(input, (-1, 1, 2, 2))
print(input.shape)

# Build the neural network
class Booze(nn.Module):
    def __init__(self):
        super(Booze, self).__init__()
        # The inplace parameter controls whether the result overwrites the input:
        # if inplace=True, the original tensor is modified in place and its value
        # becomes the processed result; if inplace=False, the original tensor is
        # left untouched and the result must be received by a new variable.
        # Usually inplace=False (the default) is recommended, so the original
        # data is not lost.
        self.relu1 = ReLU()

    # Override the forward method
    def forward(self, input):
        output = self.relu1(input)
        return output

obj = Booze()
output = obj(input)
print(output)
Code run results:
torch.Size([1, 1, 2, 2])
tensor([[[[1., 0.],
          [0., 3.]]]])
The ReLU interface takes only one parameter, inplace:
torch.nn.ReLU(inplace=False)
If inplace=True, the original tensor is overwritten with the processed result, so its value changes; if inplace=False, the original tensor is left untouched and the result must be received by a new variable.
Usually it is recommended to pass inplace=False (the default), which guarantees the original data is not lost.
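To make the inplace behavior concrete, here is a minimal sketch (variable names are illustrative) contrasting the two settings on a plain tensor:

```python
import torch
from torch.nn import ReLU

x = torch.tensor([1.0, -1.0])

# inplace=False (default): x is untouched, the result goes into a new tensor
out = ReLU(inplace=False)(x)
assert torch.equal(x, torch.tensor([1.0, -1.0]))
assert torch.equal(out, torch.tensor([1.0, 0.0]))

# inplace=True: x itself is overwritten with the result
ReLU(inplace=True)(x)
assert torch.equal(x, torch.tensor([1.0, 0.0]))
```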
The figure above shows the ReLU activation function: inputs less than zero are truncated to zero, while inputs greater than zero are passed through unchanged.
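The truncation rule just described, ReLU(x) = max(0, x), can be checked directly on a few sample values:

```python
import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])
# ReLU(x) = max(0, x): values below zero are truncated to zero,
# values above zero pass through unchanged
y = torch.relu(x)
assert torch.equal(y, torch.tensor([0.0, 0.0, 0.0, 0.5, 2.0]))
```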
Sigmoid
Sample code :
import torch
import torchvision
from torch import nn
from torch.nn import ReLU, Sigmoid
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter

dataset = torchvision.datasets.CIFAR10("./CIFAR10", train=False, transform=torchvision.transforms.ToTensor(), download=True)
dataloader = DataLoader(dataset, batch_size=64)

# Build the neural network
class Booze(nn.Module):
    def __init__(self):
        super(Booze, self).__init__()
        # As above: inplace=False (the default) keeps the input tensor intact
        self.relu1 = ReLU()
        self.sigmoid1 = Sigmoid()

    # Override the forward method
    def forward(self, input):
        # output = self.relu1(input)
        output = self.sigmoid1(input)
        return output

obj = Booze()

# Visualize with TensorBoard
writer = SummaryWriter('logs')
step = 0
for data in dataloader:
    imgs, targets = data
    writer.add_images("input", imgs, step)
    # Process the images with the neural network
    output = obj(imgs)
    writer.add_images("output", output, step)
    step += 1
writer.close()
Sigmoid is used in the same way as ReLU, so the details are not repeated here.
The TensorBoard visualization produced by the code above is shown in the figure below:
The difference before and after processing with the sigmoid activation function is quite obvious.
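That visual effect follows from the definition sigmoid(x) = 1 / (1 + e^(-x)), which squashes every input into the interval (0, 1). A quick check (sample values are illustrative):

```python
import torch

x = torch.linspace(-5, 5, steps=11)
# torch.sigmoid matches the manual formula 1 / (1 + exp(-x))
manual = 1 / (1 + torch.exp(-x))
assert torch.allclose(torch.sigmoid(x), manual)

# ToTensor() pixel values lie in [0, 1], which sigmoid maps into roughly
# [0.5, 0.73] -- hence the washed-out look of the output images
assert torch.sigmoid(torch.tensor(0.0)) == 0.5
assert abs(torch.sigmoid(torch.tensor(1.0)).item() - 0.7311) < 1e-3
```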
Summary
The main purpose of the nonlinear transformation is to introduce nonlinearity into the network. With enough nonlinearity, a model can be trained to fit all kinds of curves and features; if every layer were purely linear, the model's generalization ability would be poor.
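The point about purely linear layers can be demonstrated with a small sketch (layer sizes are arbitrary): two stacked Linear layers with no activation in between are mathematically equivalent to a single linear layer, so the extra depth adds no expressive power.

```python
import torch
from torch import nn

torch.manual_seed(0)
lin1, lin2 = nn.Linear(4, 3), nn.Linear(3, 2)
x = torch.randn(5, 4)

# Composing two linear maps without an activation in between gives
# lin2(lin1(x)) = (W2 W1) x + (W2 b1 + b2) -- just another linear map
stacked = lin2(lin1(x))
W = lin2.weight @ lin1.weight
b = lin2.weight @ lin1.bias + lin2.bias
collapsed = x @ W.t() + b
assert torch.allclose(stacked, collapsed, atol=1e-5)
```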