
KDD 2022 | GraphMAE: Self-Supervised Masked Graph Autoencoders

2022-06-12 21:30:00 Zhiyuan community

Paper link: https://arxiv.org/pdf/2205.10803.pdf

Self-supervised learning (SSL) has been a research hotspot in recent years. In particular, generative SSL has seen great success in natural language processing and other fields, as exemplified by the wide adoption of BERT and GPT. Despite this, contrastive learning, which relies heavily on structural data augmentation and complicated training strategies, has remained the dominant approach in graph SSL, while progress in generative SSL on graphs, especially graph autoencoders (GAEs), has so far fallen short of the promise it has shown in other domains. In this paper, we identify and examine the issues that negatively impact the development of GAEs, including their reconstruction objective, training robustness, and error metric. We present GraphMAE, a masked graph autoencoder that mitigates these issues for generative self-supervised graph learning. Instead of reconstructing the graph structure, we propose to focus on feature reconstruction with both a masking strategy and a scaled cosine error, which benefit the robust training of GraphMAE. We conduct extensive experiments on 21 public datasets across three graph learning tasks. The results show that GraphMAE, a simple graph autoencoder with careful designs, can consistently outperform both contrastive and generative state-of-the-art baselines. This study provides an understanding of graph autoencoders and demonstrates the potential of generative self-supervised learning on graphs.
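The two design choices summarized above, masking node features before encoding and scoring the reconstruction with a scaled cosine error, can be sketched in a few lines. This is a minimal illustrative NumPy sketch, not the authors' implementation: the function names are assumptions, and the zero vector stands in for the learnable [MASK] token embedding used in practice.

```python
import numpy as np

def mask_node_features(features, mask_rate=0.5, rng=None):
    """Randomly select a fraction of nodes and replace their feature
    vectors with a [MASK] placeholder (here: the zero vector).
    Returns the masked feature matrix and the masked node indices."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = features.shape[0]
    num_mask = int(mask_rate * n)
    mask_idx = rng.choice(n, size=num_mask, replace=False)
    masked = features.copy()
    masked[mask_idx] = 0.0  # stand-in for a learnable [MASK] embedding
    return masked, mask_idx

def scaled_cosine_error(x, z, gamma=2.0):
    """Scaled cosine error between original features x and reconstructed
    features z (rows = masked nodes): mean of (1 - cos(x_i, z_i))**gamma.
    gamma >= 1 down-weights easy (already well-reconstructed) nodes."""
    x_norm = x / np.linalg.norm(x, axis=1, keepdims=True)
    z_norm = z / np.linalg.norm(z, axis=1, keepdims=True)
    cos = np.sum(x_norm * z_norm, axis=1)
    return np.mean((1.0 - cos) ** gamma)
```

In a training loop, the encoder would see the masked features plus the graph structure, and the loss would be computed only on the masked indices, e.g. `scaled_cosine_error(X[mask_idx], X_hat[mask_idx])`.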


Copyright notice
This article was created by [Zhiyuan community]. Please include a link to the original when reposting. Thanks.
https://yzsam.com/2022/163/202206122128187351.html