Attention-Aware Heterogeneous Graph Neural Network
Author affiliations: College of Sciences, Northeastern University, Shenyang 110004, China; State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, Shenyang 110819, China
Published in: Big Data Mining and Analytics
Year/Volume/Issue: 2021, Vol. 4, No. 4
Pages: 233-241
Subject classification: 12 [Management]; 1201 [Management Science and Engineering (Management or Engineering degree)]; 081104 [Engineering - Pattern Recognition and Intelligent Systems]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0811 [Engineering - Control Science and Engineering]; 0812 [Engineering - Computer Science and Technology (Engineering or Science degree)]
Funding: supported by the Key Scientific Guiding Project for the Central Universities Research Funds (No. N2008005), the Major Science and Technology Project of Liaoning Province of China (No. 2020JH1/10100008), and the National Key Research and Development Program of China (No. 2018YFB1701104)
Keywords: Graph Neural Network (GNN); Heterogeneous Information Network (HIN); embedding
Abstract: As a powerful tool for learning the embedding representation of graph-structured data, Graph Neural Networks (GNNs), which are built on homogeneous networks, have been widely used in various data mining tasks. However, it is a huge challenge to apply a GNN to embed a Heterogeneous Information Network (HIN). The main reason for this challenge is that HINs contain many different types of nodes and different types of relationships between nodes. An HIN carries rich semantic and structural information, which requires a specially designed graph neural network. Moreover, existing HIN-based graph neural network models rarely consider the interactive information hidden between the meta-paths of the HIN, resulting in poor node embeddings. In this paper, we propose an Attention-aware Heterogeneous graph Neural Network (AHNN) model to effectively extract useful information from the HIN and use it to learn the embedding representation of nodes. Specifically, we first use node-level attention to aggregate and update the embedding representations of nodes, and then concatenate the embedding representations of the nodes on different meta-paths. Finally, a semantic-level neural network is proposed to extract the feature interaction relationships across different meta-paths and learn the final embedding of nodes. Experimental results on three widely used datasets showed that the AHNN model significantly outperforms state-of-the-art models.
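The abstract outlines a three-step pipeline: node-level attention per meta-path, concatenation of the resulting embeddings, and a semantic-level network that models interactions across meta-paths. The following is a minimal sketch of that pipeline, assuming a PyTorch implementation; the class names (NodeLevelAttention, AHNNSketch), the GAT-style single-head attention, the layer sizes, and the MLP form of the semantic-level network are illustrative assumptions, not the authors' released code.

# Minimal sketch of the AHNN pipeline described in the abstract (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class NodeLevelAttention(nn.Module):
    """Single-head graph attention over one meta-path-based adjacency matrix."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) meta-path adjacency
        # (assumed to include self-loops so every row has at least one neighbor).
        h = self.proj(x)                                   # (N, out_dim)
        n = h.size(0)
        hi = h.unsqueeze(1).expand(n, n, -1)               # embedding of node i
        hj = h.unsqueeze(0).expand(n, n, -1)               # embedding of node j
        e = F.leaky_relu(self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))         # keep meta-path neighbors only
        alpha = torch.softmax(e, dim=-1)                   # attention weights
        return F.elu(alpha @ h)                            # aggregated node embeddings


class AHNNSketch(nn.Module):
    """Node-level attention per meta-path, concatenation, then a
    semantic-level network modeling interactions across meta-paths."""

    def __init__(self, in_dim, hid_dim, out_dim, num_metapaths):
        super().__init__()
        self.node_attn = nn.ModuleList(
            [NodeLevelAttention(in_dim, hid_dim) for _ in range(num_metapaths)]
        )
        # Semantic-level network: here a simple MLP over the concatenated
        # per-meta-path embeddings (an assumption for illustration).
        self.semantic = nn.Sequential(
            nn.Linear(num_metapaths * hid_dim, hid_dim),
            nn.Tanh(),
            nn.Linear(hid_dim, out_dim),
        )

    def forward(self, x, adjs):
        # One embedding per meta-path, then concatenate along the feature axis.
        per_path = [layer(x, adj) for layer, adj in zip(self.node_attn, adjs)]
        z = torch.cat(per_path, dim=-1)
        return self.semantic(z)                            # final node embeddings

In this sketch the semantic-level step sees all meta-path embeddings jointly, which is one simple way to capture the cross-meta-path interaction information the abstract emphasizes; the paper itself should be consulted for the exact architecture and training objective.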