Neural network - nonlinear activation
2022-07-01 04:45:00 【booze-J】
Summary:
Nonlinear activation mainly introduces nonlinear characteristics into the neural network. Two commonly used nonlinear activations are Sigmoid and ReLU.
ReLU
Sample code:
import torch
from torch import nn
from torch.nn import ReLU

input = torch.tensor([[1, -0.5],
                      [-1, 3]])
input = torch.reshape(input, (-1, 1, 2, 2))
print(input.shape)

# Build the neural network
class Booze(nn.Module):
    def __init__(self):
        super(Booze, self).__init__()
        # inplace=True overwrites the input tensor with the result, so the original
        # values are lost; inplace=False (the default) returns a new tensor and leaves
        # the input unchanged. Passing inplace=False is usually recommended so the
        # original data is preserved.
        self.relu1 = ReLU()

    # Override the forward method
    def forward(self, input):
        output = self.relu1(input)
        return output

obj = Booze()
output = obj(input)
print(output)
Code run result:

torch.Size([1, 1, 2, 2])
tensor([[[[1., 0.],
          [0., 3.]]]])
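Note that obj(input) works because nn.Module implements __call__, which runs the module's hooks and then dispatches to the forward method overridden above; this is standard PyTorch behavior, so forward is not called directly.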
For the ReLU interface, only one parameter needs to be passed in: inplace.
torch.nn.ReLU(inplace=False)
The inplace parameter controls whether the input variable is replaced by the result. With inplace=True, the original variable is modified in place and its value becomes the processed result; with inplace=False, the original variable is left unchanged and the result must be assigned to a new variable.
It is usually recommended to pass inplace=False, which ensures the original data is not lost.
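To make the effect of this flag concrete, here is a minimal sketch (my own illustration, not from the original post) comparing the two settings:

import torch
from torch.nn import ReLU

x = torch.tensor([1.0, -2.0, 3.0])
out = ReLU(inplace=False)(x)   # default: x is untouched, the result goes to a new tensor
print(x)                       # tensor([ 1., -2.,  3.])  original preserved
print(out)                     # tensor([1., 0., 3.])

y = torch.tensor([1.0, -2.0, 3.0])
ReLU(inplace=True)(y)          # y itself is overwritten with the result
print(y)                       # tensor([1., 0., 3.])  original lost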
The figure above plots the ReLU activation function: inputs less than zero are truncated to zero, and inputs greater than zero are output unchanged, i.e. ReLU(x) = max(0, x).
Sigmoid
Sample code:
import torch
import torchvision
from torch import nn
from torch.nn import ReLU, Sigmoid
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter

dataset = torchvision.datasets.CIFAR10("./CIFAR10", train=False, transform=torchvision.transforms.ToTensor(), download=True)
dataloader = DataLoader(dataset, batch_size=64)

# Build the neural network
class Booze(nn.Module):
    def __init__(self):
        super(Booze, self).__init__()
        # As above: inplace=False (the default) keeps the original data intact
        self.relu1 = ReLU()
        self.sigmoid1 = Sigmoid()

    # Override the forward method
    def forward(self, input):
        # output = self.relu1(input)
        output = self.sigmoid1(input)
        return output

obj = Booze()

# Visualize the results with TensorBoard
writer = SummaryWriter('logs')
step = 0
for data in dataloader:
    imgs, targets = data
    writer.add_images("input", imgs, step)
    # Process the images with the network
    output = obj(imgs)
    writer.add_images("output", output, step)
    step += 1

writer.close()
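After the script finishes, the logged images can be viewed by launching TensorBoard with the standard command tensorboard --logdir=logs and opening the URL it prints in a browser.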
Sigmoid is used in the same way as ReLU, so the details are not repeated here.
Visualized in TensorBoard, the code above shows the input images and the sigmoid-processed output side by side; the effect of the sigmoid activation before and after processing is quite obvious.
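The reason the change is so visible: Sigmoid squashes every input into (0, 1) via sigmoid(x) = 1 / (1 + e^(-x)), and since ToTensor already scales pixels to [0, 1], the outputs all land in roughly (0.5, 0.73), washing the images out. A quick sketch (my own check, not from the original post) verifying the formula against PyTorch's built-in layer:

import torch

x = torch.tensor([-3.0, 0.0, 3.0])
manual = 1 / (1 + torch.exp(-x))        # sigmoid computed directly from its formula
builtin = torch.nn.Sigmoid()(x)         # PyTorch's built-in implementation
print(manual)                           # tensor([0.0474, 0.5000, 0.9526])
print(torch.allclose(manual, builtin))  # True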
Summary
The main purpose of a nonlinear transformation is to introduce nonlinear features into the network. With enough nonlinearity, a model can be trained to fit all kinds of curves and characteristics; if every layer were a purely linear mapping, the model's generalization ability would not be good enough.
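As a supporting sketch (my own illustration, not from the original post): two stacked Linear layers with no activation in between collapse into a single linear map, so depth alone adds no expressive power, which is exactly why activations such as ReLU and Sigmoid are inserted between layers.

import torch
from torch import nn

torch.manual_seed(0)
f = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))   # no activation in between

# The composition is itself one linear map: W = W2 @ W1, b = W2 @ b1 + b2
W = f[1].weight @ f[0].weight
b = f[1].weight @ f[0].bias + f[1].bias

x = torch.randn(5, 4)
print(torch.allclose(f(x), x @ W.T + b, atol=1e-6))   # True: still a single linear map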