PointNet network learning
2022-06-29 09:02:00 【Master Ma】
1. Introduction to torch.nn in PyTorch
torch.nn is PyTorch's built-in neural-network library. It contains the common building blocks of neural networks: modules with learnable parameters, such as nn.Conv2d() and nn.Linear(), and operations without learnable parameters (ReLU, pooling, dropout, etc.), which are also available as plain functions in torch.nn.functional. Layers with parameters are usually created in the constructor; parameter-free operations can be called either way.
The usual imports are:
import torch.nn as nn
import torch.nn.functional as F
Here the layers are created in the constructor:
class ConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 10, 5)   # input channels 1, output channels 10, kernel size 5
        self.conv2 = nn.Conv2d(10, 20, 3)  # input channels 10, output channels 20, kernel size 3
        # For the fully connected layers below, Linear's first argument is the
        # number of input features and the second the number of output features
        self.fc1 = nn.Linear(20*10*10, 500)  # 2000 input features, 500 output features
        self.fc2 = nn.Linear(500, 10)        # 500 input features, 10 outputs, i.e. 10 classes

    def forward(self, x):
        in_size = x.size(0)
        out = self.conv1(x)
        out = F.relu(out)
        out = F.max_pool2d(out, 2, 2)
        out = self.conv2(out)
        out = F.relu(out)
        out = out.view(in_size, -1)
        out = self.fc1(out)
        out = F.relu(out)
        out = self.fc2(out)
        out = F.log_softmax(out, dim=1)
        return out
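To see why fc1 expects 20*10*10 = 2000 input features, trace the tensor shapes through the layers. This is a sketch assuming a 28x28 single-channel input (e.g. MNIST-sized images), which is the input size these layer dimensions imply:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 1, 28, 28)          # a batch of one 28x28 single-channel image
out = nn.Conv2d(1, 10, 5)(x)           # 28 - 5 + 1 = 24  -> (1, 10, 24, 24)
out = F.max_pool2d(F.relu(out), 2, 2)  # 24 / 2 = 12      -> (1, 10, 12, 12)
out = nn.Conv2d(10, 20, 3)(out)        # 12 - 3 + 1 = 10  -> (1, 20, 10, 10)
out = out.view(1, -1)                  # flatten          -> (1, 2000) = (1, 20*10*10)
print(out.shape)
```

With a different input resolution the flattened size changes, and the first Linear layer would have to change with it.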
2. torch.utils.data
DataLoader combines a dataset with a sampler and can use multiple worker subprocesses to load the data. It is used during training to split the training data into batches: each iteration yields one batch of data, until the whole dataset has been consumed. In effect it handles the data preparation for the training loop.
"""
Batch training: split the data into small batches for training.
DataLoader wraps the dataset and yields one batch of data at a time.
"""
import torch
import torch.utils.data as Data
BATCH_SIZE = 5
x = torch.linspace(1, 10, 10)  # linspace: returns a 1-D tensor of evenly spaced points between start and end
y = torch.linspace(10, 1, 10)
# Wrap the data in a dataset
torch_dataset = Data.TensorDataset(x, y)
loader = Data.DataLoader(
    dataset=torch_dataset,     # extract BATCH_SIZE samples from the dataset at a time
    batch_size=BATCH_SIZE,
    shuffle=True,
    num_workers=2,
)
def show_batch():
    for epoch in range(3):  # epoch: number of passes over the data
        print('Epoch:', epoch)
        for batch_id, (batch_x, batch_y) in enumerate(loader):
            print(" batch_id:{}, batch_x:{}, batch_y:{}".format(batch_id, batch_x, batch_y))
            # print(f' batch_id:{batch_id}, batch_x:{batch_x}, batch_y:{batch_y}')

if __name__ == '__main__':
    show_batch()
Output :
Epoch: 0
batch_id:0, batch_x:tensor([ 7., 4., 3., 9., 10.]), batch_y:tensor([4., 7., 8., 2., 1.])
batch_id:1, batch_x:tensor([6., 2., 1., 5., 8.]), batch_y:tensor([ 5., 9., 10., 6., 3.])
Epoch: 1
batch_id:0, batch_x:tensor([ 2., 7., 10., 8., 3.]), batch_y:tensor([9., 4., 1., 3., 8.])
batch_id:1, batch_x:tensor([6., 9., 1., 4., 5.]), batch_y:tensor([ 5., 2., 10., 7., 6.])
Epoch: 2
batch_id:0, batch_x:tensor([10., 3., 9., 6., 8.]), batch_y:tensor([1., 8., 2., 5., 3.])
batch_id:1, batch_x:tensor([1., 4., 2., 7., 5.]), batch_y:tensor([10., 7., 9., 4., 6.])
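In the output above, 10 samples divide evenly into batches of 5. When the dataset size is not a multiple of the batch size, the last batch is simply smaller; a small sketch (batch_size=4 here is an arbitrary choice for illustration):

```python
import torch
import torch.utils.data as Data

x = torch.linspace(1, 10, 10)
y = torch.linspace(10, 1, 10)
torch_dataset = Data.TensorDataset(x, y)

# 10 samples with batch_size=4: the last batch only has 10 % 4 = 2 samples.
# Pass drop_last=True to discard the incomplete batch instead.
loader = Data.DataLoader(dataset=torch_dataset, batch_size=4, shuffle=False)
sizes = [len(batch_x) for batch_x, batch_y in loader]
print(sizes)  # [4, 4, 2]
```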
3. Variable in PyTorch and torch.autograd.Variable
As the name suggests, a Variable is something whose value can vary. Unlike a plain constant value, it is mutable, which is exactly what back-propagation and parameter updates require.
Concretely, a Variable in PyTorch is a container that stores a changing value, like a basket of eggs whose count keeps changing. The eggs inside are PyTorch tensors: all computation in PyTorch is done on tensors, and a Variable wraps a tensor so that it can take part in autograd. If you compute with Variables, the result is also a Variable.
import torch
from torch.autograd import Variable  # the Variable module in torch

tensor = torch.FloatTensor([[1,2],[3,4]])
# Put the eggs into the basket; requires_grad controls whether this Variable
# takes part in error back-propagation, i.e. whether gradients are computed for it
variable = Variable(tensor, requires_grad=True)
print(tensor)
"""
1 2
3 4
[torch.FloatTensor of size 2x2]
"""
print(variable)
"""
Variable containing:
1 2
3 4
[torch.FloatTensor of size 2x2]
"""
Note: a plain tensor (created this way) does not back-propagate; a Variable does.
Computing gradients with Variable
As computations run on Variables, a computation graph is gradually built. This graph connects all computation nodes, so that when the error is propagated backwards, the gradients of all Variables are computed in one pass; a plain tensor does not have this ability.
v_out = torch.mean(variable*variable)  # a scalar whose gradient w.r.t. variable is variable/2
v_out.backward()      # back-propagate the error from v_out
print(variable.grad)  # gradient of the initial Variable
'''
0.5000 1.0000
1.5000 2.0000
'''
Getting the data out of a Variable
Printing the Variable directly only shows it in Variable form, which is often not what you need, so convert it to tensor form first.
print(variable) # Variable form
"""
Variable containing:
1 2
3 4
[torch.FloatTensor of size 2x2]
"""
print(variable.data)  # convert the Variable to tensor form
"""
1 2
3 4
[torch.FloatTensor of size 2x2]
"""
print(variable.data.numpy())  # numpy array form
"""
[[ 1. 2.]
[ 3. 4.]]
"""
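Note that since PyTorch 0.4 the Variable API shown above is deprecated: Variable has been merged into Tensor, and setting requires_grad=True on the tensor itself gives the same behavior. A minimal modern equivalent of the example above:

```python
import torch

# requires_grad=True replaces wrapping the tensor in a Variable
tensor = torch.tensor([[1., 2.], [3., 4.]], requires_grad=True)
v_out = torch.mean(tensor * tensor)  # v_out = mean(x^2), so d(v_out)/dx = x/2
v_out.backward()
print(tensor.grad)                   # tensor([[0.5000, 1.0000], [1.5000, 2.0000]])
print(tensor.detach().numpy())       # .detach() is the modern replacement for .data
```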