[Point Cloud Processing Paper Crazy Reading, Frontier Edition 13] - GAPNet: Graph Attention based Point Neural Network for Exploiting Local Feature
2022-07-03 09:14:00 【LingbinBu】
GAPNet: Graph Attention based Point Neural Network for Exploiting Local Feature of Point Cloud
Abstract
- Method: This paper presents GAPNet, a new neural network for point clouds that learns local geometric representations by embedding a graph attention mechanism into stacked Multi-Layer-Perceptron (MLP) layers
- Introduces the GAPLayer, which learns attention weights for each point by highlighting different attention weights on the neighborhood features
- Uses a multi-head mechanism so that the GAPLayer can aggregate different features from independent heads
- Proposes an attention pooling layer over the neighborhood to obtain a local signature, which is used to improve the robustness of the network
- Code: TensorFlow version

Method
Let $X=\left\{x_{i} \in \mathbb{R}^{F}, i=1,2, \ldots, N\right\}$ denote the input point cloud. In this paper, $F=3$, i.e., each point is represented by its $(x, y, z)$ coordinates.

GAPLayer
Local structure representation
Since point clouds in real applications are typically very large, a $k$-nearest-neighbor directed graph $G=(V, E)$ is constructed, where $V=\{1,2, \ldots, N\}$ is the set of nodes, $E \subseteq V \times N_{i}$ is the set of edges, and $N_{i}$ is the neighborhood set of point $x_{i}$. The edge features are defined as $y_{ij}=\left(x_{i}-x_{ij}\right)$, where $i \in V$, $j \in N_{i}$, and $x_{ij}$ denotes a neighboring point $x_{j}$ of $x_{i}$.
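The following is a minimal NumPy sketch (not the authors' TensorFlow code) of this graph construction step: it finds the $k$ nearest neighbors of each point and forms the edge features $y_{ij}=x_{i}-x_{ij}$. The function name and array shapes are my own assumptions.

```python
import numpy as np

def knn_edge_features(points: np.ndarray, k: int):
    """points: (N, F) coordinates. Returns neighbor indices (N, k)
    and edge features (N, k, F) with y_ij = x_i - x_ij."""
    # Pairwise squared Euclidean distances, shape (N, N)
    diff = points[:, None, :] - points[None, :, :]
    dist2 = np.sum(diff ** 2, axis=-1)
    # Exclude the point itself, then keep the k closest neighbors
    np.fill_diagonal(dist2, np.inf)
    neighbor_idx = np.argsort(dist2, axis=1)[:, :k]          # (N, k)
    # Edge feature: center point minus each of its neighbors
    edge_feats = points[:, None, :] - points[neighbor_idx]   # (N, k, F)
    return neighbor_idx, edge_feats

# Usage: 1024 points with (x, y, z) coordinates and k = 20 neighbors
xyz = np.random.randn(1024, 3).astype(np.float32)
idx, y = knn_edge_features(xyz, k=20)
print(idx.shape, y.shape)   # (1024, 20) (1024, 20, 3)
```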
Single-head GAPLayer
The structure of the single-head GAPLayer is shown in Figure 2(b).
To assign attention to each neighbor, a self-attention mechanism and a neighboring-attention mechanism are proposed to obtain the attention coefficients of each point with respect to its neighbors, as shown in Figure 1. Specifically, the self-attention mechanism learns self-coefficients by considering a point's own geometric information, while the neighboring-attention mechanism learns local-coefficients by focusing on the neighborhood.
As an initialization step, the vertex and edge features of the point cloud are mapped into a higher-dimensional feature space of dimension $F^{\prime}$:
$$\begin{aligned} x_{i}^{\prime} &= h\left(x_{i}, \theta\right) \\ y_{ij}^{\prime} &= h\left(y_{ij}, \theta\right) \end{aligned}$$
where $h(\cdot)$ is a parametric nonlinear function, chosen in the experiments to be a single-layer neural network, and $\theta$ is the set of learnable filter parameters.
The final attention coefficients are obtained by fusing the self-coefficients $h\left(x_{i}^{\prime}, \theta\right)$ and the local-coefficients $h\left(y_{ij}^{\prime}, \theta\right)$, where $h\left(x_{i}^{\prime}, \theta\right)$ and $h\left(y_{ij}^{\prime}, \theta\right)$ are single-layer neural networks with a 1-dimensional output and LeakyReLU() is the activation function:
$$c_{ij}=\operatorname{LeakyReLU}\left(h\left(x_{i}^{\prime}, \theta\right)+h\left(y_{ij}^{\prime}, \theta\right)\right)$$
These coefficients are normalized with softmax:
$$\alpha_{ij}=\frac{\exp \left(c_{ij}\right)}{\sum_{k \in N_{i}} \exp \left(c_{ik}\right)}$$
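A minimal NumPy sketch of how $c_{ij}$ and the normalized $\alpha_{ij}$ could be computed, assuming the two single-layer networks $h(\cdot, \theta)$ with 1-dimensional output are plain linear maps with placeholder weights (an illustration, not the released implementation):

```python
import numpy as np

def leaky_relu(z, slope=0.2):
    return np.where(z > 0, z, slope * z)

def attention_coefficients(x_prime, y_prime, w_self, w_neigh):
    """x_prime: (N, F') encoded vertices; y_prime: (N, k, F') encoded edges.
    w_self, w_neigh: (F',) weights of the two 1-output single-layer networks."""
    self_coef = x_prime @ w_self                       # (N,)   self-coefficients
    local_coef = y_prime @ w_neigh                     # (N, k) local-coefficients
    c = leaky_relu(self_coef[:, None] + local_coef)    # (N, k) fused c_ij
    c = c - c.max(axis=1, keepdims=True)               # softmax over each neighborhood
    alpha = np.exp(c) / np.exp(c).sum(axis=1, keepdims=True)
    return alpha                                       # (N, k), each row sums to 1

# Usage with F' = 16 encoded channels and k = 20 neighbors
N, k, Fp = 1024, 20, 16
alpha = attention_coefficients(np.random.randn(N, Fp),
                               np.random.randn(N, k, Fp),
                               np.random.randn(Fp),
                               np.random.randn(Fp))
print(alpha.shape, alpha.sum(axis=1)[:3])              # (1024, 20), rows sum to ~1.0
```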
The goal of the single-head GAPLayer is to compute a contextual attention feature for each point. To this end, the normalized coefficients are used to update the vertex feature $\hat{x}_{i} \in \mathbb{R}^{F^{\prime}}$:
$$\hat{x}_{i}=f\left(\sum_{j \in N_{i}} \alpha_{ij} y_{ij}^{\prime}\right)$$
where $f(\cdot)$ is a nonlinear activation function; ReLU is used in the experiments.
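A minimal NumPy sketch of this aggregation step, reusing the assumed shapes from the sketch above (not taken from the paper's code):

```python
import numpy as np

def attention_feature(alpha, y_prime):
    """alpha: (N, k) normalized coefficients; y_prime: (N, k, F') encoded
    edge features. Returns the (N, F') attention features x_hat."""
    weighted = alpha[:, :, None] * y_prime   # broadcast alpha over the F' channels
    x_hat = weighted.sum(axis=1)             # sum over the k neighbors
    return np.maximum(x_hat, 0.0)            # ReLU, as used in the experiments

# Usage
N, k, Fp = 1024, 20, 16
x_hat = attention_feature(np.random.rand(N, k), np.random.randn(N, k, Fp))
print(x_hat.shape)                           # (1024, 16)
```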
Multi-head mechanism
To obtain sufficient structural information and stabilize the network, $M$ independent single-head GAPLayers are concatenated, producing multi-attention features with $M \times F^{\prime}$ channels:
$$\hat{x}_{i}^{\prime}=\big\|_{m}^{M} \hat{x}_{i}^{(m)}$$
As shown in Figure 2, the output of the multi-head GAPLayer consists of multi-attention features and multi-graph features, where $\hat{x}_{i}^{(m)}$ is the attention feature of the $m$-th head, $M$ is the number of heads, and $\|$ denotes concatenation along the feature channels.
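A minimal sketch of the multi-head concatenation, with random placeholders standing in for the outputs of the $M$ single-head GAPLayers:

```python
import numpy as np

# M independent heads, each producing (N, F') attention features;
# concatenating along the channel axis yields (N, M * F') multi-attention features.
N, Fp, M = 1024, 16, 4
head_outputs = [np.random.randn(N, Fp) for _ in range(M)]   # placeholders for the heads
multi_attention = np.concatenate(head_outputs, axis=-1)     # (N, M * F')
print(multi_attention.shape)                                # (1024, 64)
```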
Attention pooling layer
To improve the stability and performance of the network, an attention pooling layer is defined on the neighboring channels of the multi-graph features:
$$Y_{i}=\big\|_{m}^{M} \max _{j \in N_{i}} y_{ij}^{\prime(m)}$$
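A minimal NumPy sketch of the attention pooling layer as described above: a channel-wise max over the $k$ neighbors of each head's graph features $y_{ij}^{\prime(m)}$, concatenated across the $M$ heads (shapes are my assumptions):

```python
import numpy as np

def attention_pooling(multi_graph_feats):
    """multi_graph_feats: list of M arrays of shape (N, k, F'), the encoded
    edge (graph) features of each head. Returns the (N, M * F') local signature."""
    pooled = [g.max(axis=1) for g in multi_graph_feats]   # max over the k neighbors
    return np.concatenate(pooled, axis=-1)                # concatenate across heads

# Usage with M = 4 heads, k = 20 neighbors, F' = 16 channels
N, k, Fp, M = 1024, 20, 16, 4
Y = attention_pooling([np.random.randn(N, k, Fp) for _ in range(M)])
print(Y.shape)                                            # (1024, 64)
```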
GAPNet architecture

This architecture differs from PointNet in three respects:
- An attention-aware spatial transform network is used to make the point cloud invariant to certain transformations
- Instead of processing points individually, local features are extracted
- An attention pooling layer is used to obtain a local signature, which is connected with the intermediate layers to obtain the global descriptor
Experiments
Classification

Ablation study


Semantic part segmentation

