Neural network - nonlinear activation
2022-07-01 04:45:00 【booze-J】
Summary:
Nonlinear activation mainly introduces nonlinear characteristics into the neural network. Two commonly used nonlinear activation functions are Sigmoid and ReLU.
ReLU
Sample code:
import torch
from torch import nn
from torch.nn import ReLU

input = torch.tensor([[1, -0.5],
                      [-1, 3]])
# Reshape to (batch, channel, height, width), the layout nn layers expect
input = torch.reshape(input, (-1, 1, 2, 2))
print(input.shape)

# Build the neural network
class Booze(nn.Module):
    def __init__(self):
        super(Booze, self).__init__()
        # inplace decides whether the result overwrites the original variable:
        # with inplace=True the original variable is modified in place and its
        # values become the processed values; with inplace=False the original
        # variable is left unchanged and the result must be received by a new variable.
        # It is usually recommended to pass inplace=False so the original data is not lost.
        self.relu1 = ReLU()

    # Override the forward method
    def forward(self, input):
        output = self.relu1(input)
        return output

obj = Booze()
output = obj(input)
print(output)
Code run results:
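Since the input tensor is fixed, the printed result should look like the following: the shape after reshaping, then the ReLU output with negative entries zeroed and positive entries kept.

torch.Size([1, 1, 2, 2])
tensor([[[[1., 0.],
          [0., 3.]]]])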
The ReLU interface takes only one parameter, inplace:
torch.nn.ReLU(inplace=False)
The inplace parameter decides whether the result overwrites the original variable's values. If inplace=True, the original variable is replaced: its values change and become the processed result. If inplace=False, the original variable is left untouched and the result must be received by a new variable.
Usually it is recommended to pass inplace=False, which guarantees that the original data is not lost.
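A minimal sketch of the difference (the tensor values here are arbitrary):

import torch
from torch import nn

x = torch.tensor([1.0, -2.0, 3.0])
out = nn.ReLU(inplace=False)(x)
print(x)    # tensor([ 1., -2.,  3.])  -- the original tensor is preserved
print(out)  # tensor([1., 0., 3.])     -- the result lives in a new tensor

y = torch.tensor([1.0, -2.0, 3.0])
nn.ReLU(inplace=True)(y)
print(y)    # tensor([1., 0., 3.])     -- the original tensor was overwritten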
The figure above is the graph of the ReLU activation function, ReLU(x) = max(0, x): inputs less than zero are truncated to zero, and inputs greater than zero are output unchanged.
Sigmoid
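The Sigmoid function is defined as Sigmoid(x) = 1 / (1 + e^(-x)); it squashes any real-valued input into the range (0, 1).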
Sample code:
import torch
import torchvision
from torch import nn
from torch.nn import ReLU, Sigmoid
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter

dataset = torchvision.datasets.CIFAR10("./CIFAR10", train=False, transform=torchvision.transforms.ToTensor(), download=True)
dataloader = DataLoader(dataset, batch_size=64)

# Build the neural network
class Booze(nn.Module):
    def __init__(self):
        super(Booze, self).__init__()
        # As above, inplace=False (the default) keeps the original data intact
        self.relu1 = ReLU()
        self.sigmoid1 = Sigmoid()

    # Override the forward method
    def forward(self, input):
        # output = self.relu1(input)
        output = self.sigmoid1(input)
        return output

obj = Booze()

# Use tensorboard for visualization
writer = SummaryWriter('logs')
step = 0
for data in dataloader:
    imgs, targets = data
    writer.add_images("input", imgs, step)
    # Process the images with the neural network
    output = obj(imgs)
    writer.add_images("output", output, step)
    step += 1

writer.close()
Sigmoid is used in the same way as ReLU, so its usage is not repeated here.
The TensorBoard visualization produced by the code above is shown in the figure below.
The effect of the sigmoid nonlinearity on the images, before versus after processing, is quite noticeable.
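To view the logged images yourself, launch TensorBoard against the 'logs' directory written above and open the printed URL in a browser:

tensorboard --logdir=logs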
Summary
The main purpose of the nonlinear transformation is to introduce nonlinear features into the network. With enough nonlinearity, a model can be trained to fit all kinds of curves and characteristics; if every layer were a straight line, the model's generalization ability would not be good enough.
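A minimal sketch of why this matters (the layer sizes are arbitrary): stacking two nn.Linear layers with no activation in between collapses into a single linear map, so without a nonlinearity the extra depth adds no expressive power.

import torch
from torch import nn

f1 = nn.Linear(4, 8, bias=False)
f2 = nn.Linear(8, 2, bias=False)

x = torch.randn(3, 4)
stacked = f2(f1(x))                        # two linear layers, no activation
collapsed = x @ (f2.weight @ f1.weight).T  # one equivalent linear map
print(torch.allclose(stacked, collapsed, atol=1e-6))  # True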