Pre-Train and Learn: Preserving Global Information for Graph Neural Networks
Author affiliations: Library, Jiangsu Police Institute, Nanjing 210031, China; Department of Computer Science and Technology, Nanjing University, Nanjing 210093, China
Publication: Journal of Computer Science & Technology
Year/Volume/Issue: 2021, Vol. 36, No. 6
Pages: 1420-1430
Subject classification: 12 [Management] 1201 [Management - Management Science and Engineering (degrees in Management or Engineering)] 081104 [Engineering - Pattern Recognition and Intelligent Systems] 08 [Engineering] 0835 [Engineering - Software Engineering] 0811 [Engineering - Control Science and Engineering] 0812 [Engineering - Computer Science and Technology (degrees in Engineering or Science)]
Funding: partially supported by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China under Grant No. 18KJB510010, the Social Science Foundation of Jiangsu Province of China under Grant No. 19TQD002, and the National Natural Science Foundation of China under Grant No. 61976114
Keywords: graph neural network; network embedding; representation learning; global information; pre-training
Abstract: Graph neural networks (GNNs) have shown great power in learning on graphs. However, it is still a challenge for GNNs to model information faraway from the source node. The ability to preserve global information can enhance graph representation and hence improve classification precision. In the paper, we propose a new learning framework named G-GNN (Global information for GNN) to address the challenge. First, the global structure and global attribute features of each node are obtained via unsupervised pre-training, and those global features preserve the global information associated with the node. Then, using the pre-trained global features and the raw attributes of the graph, a set of parallel kernel GNNs is used to learn different aspects from these heterogeneous features. Any general GNN can be used as a kernel and easily obtain the ability of preserving global information, without having to alter its own architecture. Extensive experiments have shown that state-of-the-art models, e.g., GCN, GAT, GraphSAGE and APPNP, can achieve improvement with G-GNN on three standard evaluation datasets. Besides, we establish new benchmark precision records on Cora (84.31%) and Pubmed (80.95%) when learning on attributed graphs.
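The abstract describes running a set of parallel kernel GNNs over heterogeneous inputs (raw attributes plus pre-trained global structure and global attribute features) and combining their outputs. The following is a minimal NumPy sketch of that combination step only, not the paper's implementation: the kernel here is a simple one-layer mean-aggregation GNN, the combination is a plain sum, and all function names, feature dimensions, and the toy graph are illustrative assumptions.

```python
import numpy as np

def gnn_kernel(adj_norm, feats, weight):
    """One propagation step of a simple GNN kernel: aggregate neighbor
    features via the normalized adjacency, then apply a linear map
    with a tanh nonlinearity. (Illustrative stand-in for any GNN.)"""
    return np.tanh(adj_norm @ feats @ weight)

def g_gnn_forward(adj, feature_sets, weights):
    """Hypothetical sketch of the G-GNN parallel-kernel idea: each
    feature set (raw attributes, pre-trained global structure features,
    pre-trained global attribute features) goes through its own kernel,
    and the outputs are combined (here: summed)."""
    # Row-normalize the adjacency with self-loops, as is common in GNNs.
    a = adj + np.eye(adj.shape[0])
    a = a / a.sum(axis=1, keepdims=True)
    outputs = [gnn_kernel(a, f, w) for f, w in zip(feature_sets, weights)]
    return sum(outputs)

# Toy 4-node path graph; all feature matrices are random placeholders
# standing in for real pre-trained features.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
raw      = rng.normal(size=(4, 5))  # raw node attributes
g_struct = rng.normal(size=(4, 8))  # pre-trained global structure features
g_attr   = rng.normal(size=(4, 5))  # pre-trained global attribute features
weights = [rng.normal(size=(f.shape[1], 3)) for f in (raw, g_struct, g_attr)]

out = g_gnn_forward(adj, [raw, g_struct, g_attr], weights)
print(out.shape)  # (4, 3): one 3-dimensional representation per node
```

Because each kernel only consumes its own feature matrix, any GNN layer could be dropped in for `gnn_kernel` without changing the surrounding framework, which matches the abstract's claim that general GNNs gain global information without altering their own architectures.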