
KDD 2022 | Compressing Deep Graph Neural Networks via Adversarial Knowledge Distillation

2022-06-10 22:10:00 Zhiyuan community

Paper link: https://arxiv.org/pdf/2205.11678.pdf

Deep graph neural networks (GNNs) model graph-structured data well. However, the heavy architectures of deep graph models make them difficult to deploy and run on mobile or embedded systems. To compress over-stacked GNNs, knowledge distillation via a teacher-student architecture is an effective technique, in which a key step is to measure the discrepancy between the teacher and student networks with a predefined distance function. However, using the same fixed distance for graphs of various structures may be unsuitable, and the optimal distance formulation is hard to determine. To address these problems, we propose a novel adversarial knowledge distillation framework, named GraphAKD, which adversarially trains a discriminator and a generator to adaptively detect and reduce the discrepancy. Specifically, noting that well-captured inter-node and inter-class correlations underlie the success of deep GNNs, we propose to criticize the inherited knowledge from node-level and class-level views with a trainable discriminator. The discriminator distinguishes between the teacher's knowledge and what the student inherits, while the student GNN acts as a generator that aims to fool the discriminator. To our knowledge, GraphAKD is the first work to introduce adversarial training into knowledge distillation in the graph domain. Experiments on node-level and graph-level tasks show that GraphAKD improves student performance by a large margin, demonstrating that GraphAKD can accurately transfer knowledge from a complex teacher GNN to a compact student GNN.
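The adversarial training loop described above can be sketched in a few lines of PyTorch. This is a minimal toy illustration, not the paper's actual method: the tiny graph, the one-layer "GCN", the discriminator architecture, and all sizes below are assumptions made for demonstration. It shows only the core idea of node-level logits from a frozen teacher serving as "real" samples, the compact student acting as the generator, and a discriminator trained to tell the two apart.

```python
# Hypothetical sketch of adversarial knowledge distillation for GNNs,
# loosely inspired by GraphAKD. The graph, feature sizes, and all
# architectures here are illustrative assumptions, not the paper's.
import torch
import torch.nn as nn

torch.manual_seed(0)

N, C = 6, 3                        # 6 nodes, 3 classes (toy sizes)
A = torch.eye(N)                   # adjacency with self-loops
A[0, 1] = A[1, 0] = 1.0
A[2, 3] = A[3, 2] = 1.0
A_hat = A / A.sum(1, keepdim=True)  # row-normalized propagation matrix
X = torch.randn(N, 8)              # random node features

class TinyGCN(nn.Module):
    """One feature transform + propagation step per layer."""
    def __init__(self, d_in, d_hid, d_out):
        super().__init__()
        self.lin1 = nn.Linear(d_in, d_hid)
        self.lin2 = nn.Linear(d_hid, d_out)
    def forward(self, A_hat, X):
        h = torch.relu(A_hat @ self.lin1(X))
        return A_hat @ self.lin2(h)        # node-level logits

teacher = TinyGCN(8, 64, C)        # "complex" teacher, kept frozen
student = TinyGCN(8, 8, C)         # compact student = generator
disc = nn.Sequential(nn.Linear(C, 16), nn.ReLU(), nn.Linear(16, 1))

opt_s = torch.optim.Adam(student.parameters(), lr=1e-2)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-2)
bce = nn.BCEWithLogitsLoss()

with torch.no_grad():
    t_logits = teacher(A_hat, X)   # teacher knowledge, fixed

for step in range(50):
    s_logits = student(A_hat, X)
    # Discriminator step: teacher knowledge is "real" (1),
    # the student's inherited knowledge is "fake" (0).
    d_loss = bce(disc(t_logits), torch.ones(N, 1)) \
           + bce(disc(s_logits.detach()), torch.zeros(N, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator step: the student tries to fool the discriminator,
    # i.e. make its logits indistinguishable from the teacher's.
    g_loss = bce(disc(s_logits), torch.ones(N, 1))
    opt_s.zero_grad(); g_loss.backward(); opt_s.step()

print(tuple(s_logits.shape), float(d_loss), float(g_loss))
```

In the paper the knowledge is criticized from both node-level and class-level views; this sketch keeps only a single node-level view to stay short.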

 


Copyright notice
This article was created by [Zhiyuan community]. When reposting, please include a link to the original. Thank you.
https://yzsam.com/2022/161/202206102050472536.html