[Reading Notes] Graph Contrastive Learning (GNN + CL)
Source: https://mp.weixin.qq.com/s/X7gxlcY-PaQ97MiEJmKfbg
Given a large amount of unlabeled graph data, graph contrastive learning algorithms aim to train a graph encoder, nowadays usually a graph neural network (Graph Neural Network, GNN), such that the graph representation vectors produced by this GNN preserve the characteristics of the graph data well.

Graph Contrastive Learning with Augmentations (GraphCL). NeurIPS 2020.
Algorithm steps (a minimal code sketch follows the list):
1. Randomly sample a batch of graphs.
2. Apply two independent random augmentations to each graph (e.g., adding/deleting edges, dropping nodes) to obtain two new views.
3. Encode the views with the GNN being trained to obtain node representation vectors and graph representation vectors.
4. Compute the InfoNCE loss on these representation vectors, pulling views augmented from the same graph closer together and pushing views augmented from different graphs apart. 【features are augmented】
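The steps above can be summarized in a short sketch. This is my own minimal illustration assuming a PyTorch setup; `encoder`, `projection_head`, and `augment` are hypothetical placeholders standing in for the trained GNN, GraphCL's projection head, and the random augmentation operator, not the paper's reference code.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """InfoNCE / NT-Xent: the two views of the same graph form the positive pair,
    views of the other graphs in the batch serve as negatives."""
    z1 = F.normalize(z1, dim=1)                  # (N, d)
    z2 = F.normalize(z2, dim=1)                  # (N, d)
    sim = z1 @ z2.t() / temperature              # (N, N) similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)  # positives sit on the diagonal
    return 0.5 * (F.cross_entropy(sim, labels) + F.cross_entropy(sim.t(), labels))

def graphcl_step(encoder, projection_head, augment, graphs, optimizer):
    # Step 2: two independent random augmentations per graph -> two views.
    view1 = [augment(g) for g in graphs]
    view2 = [augment(g) for g in graphs]
    # Step 3: encode both views into graph-level representation vectors.
    g1 = projection_head(encoder(view1))         # (N, d)
    g2 = projection_head(encoder(view2))         # (N, d)
    # Step 4: same-graph views attract, different-graph views repel.
    loss = nt_xent_loss(g1, g2)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Compared with the paper's full loss this simplified version ignores intra-view negatives, but the pull/push behaviour described in step 4 is the same.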
【Heuristic graph data augmentation】 Since graph data passed through a GNN yields representation vectors at two levels, node representations and graph representations, Contrastive Multi-View Representation Learning on Graphs (ICML 2020) designs experiments to compare contrast at different levels, and finds that contrasting node representations against graph representations achieves better results.
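For intuition, here is a small sketch of what such cross-level (node-to-graph) contrast can look like. The bilinear discriminator below is my own DGI/MVGRL-style illustration, not the paper's code, and `W`, `pos_nodes`, `neg_nodes` are hypothetical names.

```python
import torch

def node_graph_score(node_emb: torch.Tensor,   # (num_nodes, d), e.g. from view 1
                     graph_emb: torch.Tensor,  # (d,), pooled from view 2
                     W: torch.Tensor) -> torch.Tensor:  # (d, d) learnable bilinear weight
    """Score each node representation against the other view's graph representation."""
    return torch.sigmoid(node_emb @ W @ graph_emb)       # (num_nodes,)

def cross_level_loss(pos_nodes, neg_nodes, graph_emb, W, eps: float = 1e-8):
    # Positives: nodes belonging to (an augmented view of) the same graph.
    # Negatives: nodes from a corrupted or different graph.
    pos = node_graph_score(pos_nodes, graph_emb, W)
    neg = node_graph_score(neg_nodes, graph_emb, W)
    return -(torch.log(pos + eps).mean() + torch.log(1.0 - neg + eps).mean())
```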
【Learning-based graph data augmentation, semi-automatic】 JOAO: through adversarial training, it iteratively learns the probability matrix for selecting each data augmentation method, and correspondingly replaces the projection head in GraphCL. Experimental results show that the probability matrix obtained from adversarial training follows a trend similar to GraphCL's earlier manual findings on augmentation selection, and achieves competitive results without much manual intervention.
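As a structural sketch only (my own simplification, not JOAO's exact update rules), the "probability matrix over augmentations" can be pictured as a categorical distribution over pairs of augmentation types that is sampled at every step and pushed, adversarially, toward the pairs that are currently hardest for the encoder; `AUG_TYPES` and the update rule below are illustrative assumptions.

```python
import torch

AUG_TYPES = ["node_drop", "edge_perturb", "subgraph", "attr_mask"]
K = len(AUG_TYPES)

# Joint distribution over pairs of augmentation types, initialised uniform.
p = torch.full((K, K), 1.0 / (K * K))

def sample_aug_pair(p):
    """Sample which pair of augmentations to apply to the current batch."""
    idx = torch.multinomial(p.flatten(), 1).item()
    return AUG_TYPES[idx // K], AUG_TYPES[idx % K]

def update_aug_distribution(p, loss_per_pair, step_size=0.1):
    """Adversarial direction: shift probability mass toward augmentation pairs
    with high contrastive loss, then project back onto the probability simplex."""
    p = p + step_size * loss_per_pair        # loss_per_pair: (K, K) running loss estimates
    p = torch.clamp(p, min=1e-6)
    return p / p.sum()
```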
【Fully automatic】 Automatically learn the distribution of perturbations applied to the graph during data augmentation. Adversarial Graph Augmentation to Improve Graph Contrastive Learning starts from the question of how much information of the graph should be preserved during augmentation. It argues that it is not the case that the more mutual information the two augmented views share the better, because that mutual information may contain a lot of noise. The authors introduce the Information Bottleneck principle: good views should share as little mutual information with each other as possible while jointly preserving the essential characteristics of the graph itself. That is, during training the model learns augmentations that retain the necessary information of the graph while reducing noise. Based on this principle, the authors design a min-max game as a new training scheme and train a neural network to decide whether to delete each edge during augmentation. 【a pruning strategy?】
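The edge-deletion network mentioned above can be sketched roughly as follows. This is my assumption about the general mechanism (a learnable per-edge keep probability trained in the min-max game against the encoder), not the paper's reference implementation, and all names below are illustrative.

```python
import torch
import torch.nn as nn

class LearnableEdgeDropper(nn.Module):
    """Scores each edge and outputs a keep-probability used to build the augmented view."""
    def __init__(self, edge_feat_dim: int, hidden: int = 64):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(edge_feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, edge_feats: torch.Tensor) -> torch.Tensor:
        # edge_feats: (num_edges, edge_feat_dim), e.g. concatenated endpoint embeddings.
        return torch.sigmoid(self.scorer(edge_feats)).squeeze(-1)   # (num_edges,)

def augmenter_loss(keep_prob: torch.Tensor, contrastive_loss: torch.Tensor, lam: float = 0.1):
    # Min-max game: the augmenter tries to minimise the mutual-information estimate,
    # i.e. maximise the encoder's contrastive loss, while a regulariser on the
    # expected drop ratio prevents it from simply deleting every edge.
    drop_ratio = 1.0 - keep_prob.mean()
    return -contrastive_loss + lam * drop_ratio
```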
————————————————
Copyright notice: this is an original article by CSDN blogger 「Amber_7422」, licensed under the CC 4.0 BY-SA agreement. Please include the original source link and this notice when reprinting.
Original link: https://blog.csdn.net/Amber_7422/article/details/123773606