[Reading notes] Graph contrastive learning (GNN + CL)
2022-07-05 09:16:00 【Virgo programmer's friend】
Source: https://mp.weixin.qq.com/s/X7gxlcY-PaQ97MiEJmKfbg
Given a large amount of unlabeled graph data, a graph contrastive learning algorithm aims to train a graph encoder, which nowadays usually means a graph neural network (Graph Neural Network, GNN). The representation vectors produced by this GNN encoder should preserve the characteristics of the graph data well.
Graph Contrastive Learning with Augmentations (GraphCL). NeurIPS 2020.
Algorithm steps:
1. Randomly sample a batch of graphs.
2. Apply two random data augmentations to each graph (e.g. adding/removing edges, dropping nodes) to obtain two new views.
3. Encode each view with the GNN being trained to obtain node representation vectors (node representations) and graph representation vectors (graph representations).
4. Compute the InfoNCE loss on these representation vectors, so that augmented views of the same graph are pulled close together while views of different graphs are pushed apart. 【feature augmentation】 (A minimal sketch of this step follows the list.)
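To make steps 2–4 concrete, here is a minimal PyTorch sketch of a GraphCL-style training step with a batched InfoNCE (NT-Xent) loss. The names `gnn_encoder`, `projection_head`, and `augment` are hypothetical placeholders, not the paper's official implementation.

```python
# Minimal sketch of the GraphCL-style training step described above.
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """InfoNCE / NT-Xent loss between two batches of graph embeddings.

    z1[i] and z2[i] are the two augmented views of the same graph (positive pair);
    all other pairs in the batch act as negatives.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                    # [B, B] cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives on the diagonal
    # Symmetrize so that each view acts once as the anchor.
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))

def training_step(batch_graphs, gnn_encoder, projection_head, augment):
    # Steps 2-4: two random augmentations, encode, project, contrast.
    view1 = [augment(g) for g in batch_graphs]   # e.g. edge perturbation
    view2 = [augment(g) for g in batch_graphs]   # e.g. node dropping
    g1 = projection_head(gnn_encoder(view1))     # [B, d] graph-level embeddings
    g2 = projection_head(gnn_encoder(view2))
    return info_nce_loss(g1, g2)
```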
【Heuristic graph data augmentation】 After passing through a GNN, graph data yields representation vectors at two levels: node representations and graph representations. Contrastive Multi-View Representation Learning on Graphs (ICML 2020) designs experiments to compare contrasting at different levels and finds that contrasting node representations against graph representations gives better results. (A sketch of this cross-level contrast is given below.)
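Below is a rough sketch of such node-vs-graph (cross-level) contrast, scored with a simple dot product and a binary cross-entropy objective. This is an assumption-laden simplification, not the ICML 2020 paper's exact discriminator.

```python
# Cross-level contrast sketch: node embeddings from one view are scored against
# the graph embedding of the other view; other graphs supply the negatives.
import torch
import torch.nn.functional as F

def node_graph_contrast(node_z: torch.Tensor,       # [N, d] node embeddings, view A
                        graph_z: torch.Tensor,      # [d]    graph embedding, view B (positive)
                        neg_graph_z: torch.Tensor,  # [M, d] graph embeddings of other graphs (negatives)
                        ) -> torch.Tensor:
    pos = node_z @ graph_z                    # [N]    node vs. its own graph
    neg = node_z @ neg_graph_z.t()            # [N, M] node vs. other graphs
    logits = torch.cat([pos, neg.reshape(-1)])
    labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg).reshape(-1)])
    return F.binary_cross_entropy_with_logits(logits, labels)
```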
【Learnable graph data augmentation】 JOAO: through adversarial training, it iteratively learns a probability matrix for selecting each data augmentation method 【semi-automatic】, and swaps in the corresponding projection head in GraphCL. Experiments show that the probability matrix obtained by adversarial training follows a trend similar to GraphCL's earlier findings on augmentation selection, and it achieves competitive results without much manual intervention. (A simplified sketch of sampling from such a learned distribution follows.)
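The sketch below only illustrates the general idea of keeping a learnable distribution over augmentation types and nudging it adversarially toward harder augmentations; the class name and the update rule are assumptions, not JOAO's actual optimization procedure.

```python
# Simplified sketch: a learnable distribution over augmentation types.
import torch

AUGMENTATIONS = ["node_drop", "edge_perturb", "subgraph", "attr_mask"]

class AugmentationSampler:
    def __init__(self):
        # Start from a uniform distribution over augmentation types.
        self.probs = torch.full((len(AUGMENTATIONS),), 1.0 / len(AUGMENTATIONS))

    def sample(self) -> str:
        # Draw one augmentation type per training step.
        idx = torch.multinomial(self.probs, num_samples=1).item()
        return AUGMENTATIONS[idx]

    def update(self, per_aug_loss: torch.Tensor, step_size: float = 0.1):
        # Adversarial update (sketch): move probability mass toward the
        # augmentations that currently yield the highest contrastive loss,
        # then project back onto the probability simplex.
        self.probs = self.probs + step_size * per_aug_loss
        self.probs = torch.clamp(self.probs, min=1e-6)
        self.probs = self.probs / self.probs.sum()
```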
【Fully automatic】 Automatically learn the distribution of perturbations applied to the graph during augmentation. Adversarial Graph Augmentation to Improve Graph Contrastive Learning starts from the question of how augmentation should preserve the graph's information: it is not necessarily true that more mutual information between the two augmented views is better, because that mutual information may contain a lot of noise. The authors introduce the Information Bottleneck principle and argue that good views should preserve the essential characteristics of the graph while keeping the mutual information between each other as small as possible. In other words, training should learn augmentations that retain the necessary information of the graph while reducing noise. Based on this principle, the authors design a new min-max training scheme and train a neural network to decide whether each edge should be dropped during augmentation. 【a pruning strategy?】 (A sketch of such a learnable edge-dropping augmenter is given below.)
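In that spirit, here is a minimal sketch of a learnable edge-dropping augmenter that could be trained in a min-max fashion. The module name, wiring, and the binary-concrete relaxation below are assumptions for illustration, not the authors' implementation.

```python
# Sketch of a learnable edge-dropping augmenter (AD-GCL-style idea).
import torch
import torch.nn as nn

class EdgeDropAugmenter(nn.Module):
    """Predicts, for each edge, a (relaxed) probability of keeping it."""
    def __init__(self, node_dim: int, hidden: int = 64, temperature: float = 0.5):
        super().__init__()
        self.temperature = temperature
        self.mlp = nn.Sequential(
            nn.Linear(2 * node_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, node_emb: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        src, dst = edge_index                                   # each [E]
        edge_feat = torch.cat([node_emb[src], node_emb[dst]], dim=-1)
        keep_logits = self.mlp(edge_feat).squeeze(-1)           # [E]
        # Binary-concrete relaxation keeps the edge-drop decision differentiable.
        u = torch.rand_like(keep_logits)
        noise = torch.log(u + 1e-10) - torch.log(1 - u + 1e-10)
        keep_prob = torch.sigmoid((keep_logits + noise) / self.temperature)
        return keep_prob                                        # soft edge mask in [0, 1]

# Min-max training (sketch): the augmenter is updated to *maximize* the
# contrastive loss (removing as much redundant information as possible),
# while the GNN encoder is updated to *minimize* it on the surviving views,
# alternating between the two objectives each iteration.
```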
————————————————
Copyright notice: This is an original article by CSDN blogger 「Amber_7422」, licensed under the CC 4.0 BY-SA agreement. For reposting, please attach the original source link and this statement.
Link to the original text :https://blog.csdn.net/Amber_7422/article/details/123773606