
Linear algebra for deep learning

2022-07-07 00:41:00 Peng Xiang

This post mainly covers basic tensor computations, such as summation and transposition.
Scalar operations

import torch
x = torch.tensor(3.0)
y = torch.tensor(4.0)
print(x * y, x + y, x - y, x ** y, x / y)  # single-element tensors (scalars) support all the usual arithmetic operations
print(x.shape)  # a scalar has an empty shape, torch.Size([])

Matrix transposition

import torch
x = torch.arange(20).reshape(5, 4)
print(x)
print(x.t())  # matrix transpose
B = torch.tensor([[1, 2, 3], [2, 0, 4], [3, 4, 5]])
print(B)
print(B == B.t())  # a symmetric matrix equals its own transpose

A note on assignment in Python: assigning a tensor to another variable only copies the reference (the address), so modifying one modifies the other. Use y = x.clone() to create an independent copy of the data.
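A minimal sketch of the difference (the variable names here are just illustrative):

import torch
x = torch.arange(4, dtype=torch.float32)
y = x            # y is just another reference to the same data
y[0] = 100.0
print(x)         # x changed too: tensor([100., 1., 2., 3.])
x = torch.arange(4, dtype=torch.float32)
y = x.clone()    # y gets its own copy of the data
y[0] = 100.0
print(x)         # x is unaffected: tensor([0., 1., 2., 3.])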

# Dimensionality reduction
import torch
A = torch.arange(12, dtype=torch.float32).reshape(3, 4)
A_sum_axis0 = A.sum(0)  # dim=0 sums down the columns, dim=1 sums across the rows, [0, 1] sums everything; each reduction drops a dimension
print(A, A_sum_axis0, A_sum_axis0.shape)
A = torch.arange(24, dtype=torch.float32).reshape(2, 3, 4)
sum_A = A.sum(1)  # for a 3-D tensor, summing along a dim drops it: 0 drops the first axis, 1 the second, 2 the third
print(A)
print(sum_A)

Passing keepdim=True keeps the reduced axis as a dimension of size one instead of dropping it, which is useful for broadcasting.
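A minimal sketch (reusing the 3x4 matrix A from above) of how keepdim enables broadcasting:

import torch
A = torch.arange(12, dtype=torch.float32).reshape(3, 4)
sum_A = A.sum(1, keepdim=True)  # shape (3, 1) instead of (3,)
print(sum_A.shape)
print(A / sum_A)  # broadcasting divides each row by its own sum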

Cumulative sum

import torch
A = torch.arange(12, dtype=torch.float32).reshape(3, 4)
A_cumsum = A.cumsum(0)  # cumulative sum along dim 0: running totals down each column; unlike sum, no dimension is dropped
print(A)
print(A_cumsum)

Dot product and matrix multiplication

import torch
y = torch.ones(4, dtype=torch.float32)
print(y)
print(torch.dot(y, y))  # vector dot product
x = torch.arange(12, dtype=torch.float32).reshape(3, 4)
print(torch.mv(x, y))   # matrix-vector product
B = torch.ones(4, 3)
print(torch.mm(x, B))   # matrix-matrix product

Copyright notice
This article was written by [Peng Xiang]. When reposting, please include a link to the original. Thank you.
https://yzsam.com/2022/188/202207061654112250.html