[Reading notes] Graph contrastive learning (GNN+CL)
2022-07-05 09:16:00 【Virgo programmer's friend】
Source: https://mp.weixin.qq.com/s/X7gxlcY-PaQ97MiEJmKfbg
Given a large amount of unlabeled graph data, a graph contrastive learning algorithm aims to train a graph encoder, which nowadays usually means a graph neural network (GNN), such that the representation vectors produced by this GNN preserve the essential characteristics of the graph data.
Graph Contrastive Learning with Augmentations (GraphCL). NeurIPS 2020.
Algorithm steps:
1. Randomly sample a batch of graphs.
2. Apply two independent random augmentations to each graph (e.g., adding/removing edges, dropping nodes) to obtain two new views.
3. Encode each view with the GNN being trained to obtain node representations and graph representations.
4. Compute the InfoNCE loss on these representation vectors, so that views augmented from the same graph are pulled close together while views from different graphs are pushed apart. [feature augmentation is also used]
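The augmentation in step 2 and the loss in step 4 can be sketched in a few lines of NumPy. This is a minimal illustration, not GraphCL's implementation: `drop_edges` and `info_nce` are illustrative names, and in the real pipeline the inputs to the loss are GNN-encoded graph vectors.

```python
import numpy as np

def drop_edges(adj, p=0.2, rng=None):
    """Randomly remove a fraction p of edges (one simple augmentation, step 2)."""
    rng = np.random.default_rng(0) if rng is None else rng
    keep = np.triu(rng.random(adj.shape) >= p, 1)
    keep = keep | keep.T                 # keep the adjacency matrix symmetric
    return adj * keep

def info_nce(z1, z2, tau=0.5):
    """InfoNCE loss over graph representations of two views (step 4).

    z1, z2: (N, d) arrays; row i of each is a view of the same graph,
    so the diagonal of the similarity matrix holds the positives and
    every other entry in a row is a negative.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau
    sim = sim - sim.max(axis=1, keepdims=True)           # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

As a sanity check, feeding the same matrix as both views (perfectly agreeing positives) yields a lower loss than pairing it with an unrelated random matrix.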
【Heuristic graph data augmentation】 Passing graph data through a GNN yields representation vectors at two levels: node representations and graph representations. Contrastive Multi-View Representation Learning on Graphs (MVGRL). ICML 2020 designs experiments to compare contrasting at the different levels, and finds that contrasting node representations against graph representations achieves the best results.
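A minimal sketch of this node-vs-graph (local-global) contrast, assuming a mean-pool readout and a plain dot-product discriminator in place of MVGRL's learned bilinear one; the function names and the corruption scheme are illustrative:

```python
import numpy as np

def local_global_loss(node_h, corrupt_h, summary):
    """Binary cross-entropy for node-vs-graph contrast.

    node_h:    (n, d) node representations of the real graph (positives)
    corrupt_h: (n, d) node representations of a corrupted graph (negatives)
    summary:   (d,)   pooled graph representation, e.g. node_h.mean(axis=0)

    A dot-product discriminator stands in for MVGRL's learned bilinear one:
    each node is scored against the graph summary, and the loss pushes real
    nodes toward score 1 and corrupted nodes toward score 0.
    """
    pos = 1.0 / (1.0 + np.exp(-(node_h @ summary)))
    neg = 1.0 / (1.0 + np.exp(-(corrupt_h @ summary)))
    eps = 1e-12
    return -(np.log(pos + eps).mean() + np.log(1.0 - neg + eps).mean())
```

With nodes that align with the summary as positives and anti-aligned nodes as negatives, the loss is small; swapping the roles makes it large.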
【Learned graph data augmentation】 JOAO: through adversarial training, iteratively learns the probability matrix for selecting each augmentation method 【semi-automatic】, with a corresponding replacement of GraphCL's projection head. Experiments show that the probability matrix obtained by adversarial training follows a trend similar to GraphCL's earlier manual augmentation-selection results, and achieves competitive performance without much manual intervention.
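The adversarial selection idea can be caricatured as follows. This is a simplified multiplicative-weights stand-in, not JOAO's actual projected-gradient update (which also regularizes toward a uniform prior); the augmentation names and the step size are illustrative:

```python
import numpy as np

AUGMENTATIONS = ["node_drop", "edge_perturb", "attr_mask", "subgraph"]

def update_aug_probs(probs, contrastive_losses, step=0.5):
    """Adversarial step: shift selection probability toward the augmentations
    that currently produce the HIGHER contrastive loss (i.e. harder views),
    then renormalize back onto the probability simplex."""
    logits = np.log(probs) + step * np.asarray(contrastive_losses)
    logits = logits - logits.max()            # numerical stability
    new = np.exp(logits)
    return new / new.sum()
```

Starting from a uniform distribution, an augmentation whose views yield a higher loss gains probability mass on the next round, which is the sense in which the selection is "adversarial" to the encoder.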
【Fully automatic】 Automatically learn the distribution of the perturbation applied to the graph during augmentation. Adversarial Graph Augmentation to Improve Graph Contrastive Learning (AD-GCL) starts from the question of how augmentation should preserve the graph's information: it is not necessarily better for the two augmented views to share more mutual information, because that mutual information may contain a lot of noise. The authors introduce the Information Bottleneck principle: good views should jointly preserve the essential characteristics of the graph itself while having minimal mutual information with each other. That is, training should learn augmentations that retain the necessary information of the graph while reducing noise. Based on this principle, the authors design a min-max game as a new training scheme, and train a neural network to decide whether each edge should be deleted during augmentation. 【a pruning strategy?】
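A toy sketch of the learned edge-deletion idea, assuming explicit per-edge logits and plain Bernoulli sampling. AD-GCL actually parameterizes the keep probabilities with a GNN augmenter and uses a Gumbel-softmax relaxation so the min-max objective stays differentiable; the names here are illustrative:

```python
import numpy as np

def sample_view(edges, edge_logits, rng):
    """Sample an augmented view by keeping each edge with a learned probability.

    edges:       list of (u, v) pairs
    edge_logits: one learnable score per edge; in AD-GCL these come from an
                 augmenter network trained to MINIMIZE the mutual information
                 between views while the encoder maximizes agreement (min-max).
    """
    p_keep = 1.0 / (1.0 + np.exp(-np.asarray(edge_logits, dtype=float)))
    mask = rng.random(len(edges)) < p_keep
    return [e for e, m in zip(edges, mask) if m]
```

Edges whose logits are driven strongly negative are (almost) always pruned from the sampled view, which is why the learned augmenter behaves like a noise-removing pruning strategy.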
————————————————
Copyright notice: this is an original article by CSDN blogger 「Amber_7422」, licensed under the CC 4.0 BY-SA agreement. Please include the original source link and this notice when reprinting.
Link to the original text: https://blog.csdn.net/Amber_7422/article/details/123773606