Re-quantization based binary graph neural networks

Authors: Kai-Lang YAO, Wu-Jun LI

Affiliation: National Key Laboratory for Novel Software Technology, Department of Computer Science and Technology, Nanjing University

Journal: Science China (Information Sciences) (中国科学:信息科学(英文版))

Year/Volume/Issue: 2024, Vol. 67, No. 7

Pages: 160-171

Subject Classification: 12 [Management] 1201 [Management Science and Engineering] 081104 [Pattern Recognition and Intelligent Systems] 08 [Engineering] 0835 [Software Engineering] 0811 [Control Science and Engineering] 0812 [Computer Science and Technology]

Funding: Supported by the National Key R&D Program of China (Grant No. 2020YFA0713901), the National Natural Science Foundation of China (Grant Nos. 61921006, 62192783), and the Fundamental Research Funds for the Central Universities (Grant No. 020214380108)

Keywords: graph neural networks; binary neural networks; mixture of experts; computation-efficient algorithms

Abstract: Binary neural networks have become a promising research topic due to their fast inference speed and low energy consumption. However, most existing studies focus on binary convolutional neural networks, while binary graph neural networks have received much less attention. A common drawback of existing binary graph neural networks is that they still contain many inefficient full-precision operations in the multiplication of three matrices and are therefore not sufficiently efficient. In this paper, we propose a novel method, called re-quantization-based binary graph neural networks (RQBGN), for binarizing graph neural networks. Specifically, re-quantization, a procedure necessary for further reducing superfluous full-precision operations, quantizes the result of the multiplication between any two matrices during the process of multiplying three matrices. To address the challenges introduced by re-quantization, RQBGN first studies the impact of different computation orders to find an effective one and then introduces a mixture of experts to increase the model capacity. Experiments on five benchmark datasets show that the computation order in which re-quantization is performed significantly affects the performance of binary graph neural network models, and that RQBGN outperforms other baselines, achieving state-of-the-art performance.
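The central idea described in the abstract, quantizing the intermediate product inside the three-matrix chain A·X·W so that every matrix multiplication operates on quantized operands, can be illustrated with a minimal PyTorch-style sketch. The function names (binarize, rqbgn_layer), the sign-plus-mean-scaling quantizer, and the fixed (A·X)·W computation order below are illustrative assumptions, not the paper's actual design; the abstract notes that the choice of computation order itself is one of the paper's findings.

```python
import torch

def binarize(x):
    # Sign binarization with a per-tensor scaling factor (a common
    # choice in binary networks; the paper's exact quantizer may differ).
    scale = x.abs().mean()
    return torch.sign(x) * scale

def rqbgn_layer(A, X, W):
    """Hypothetical layer computing A @ X @ W with re-quantization.

    A: (n, n) adjacency matrix, X: (n, d) node features,
    W: (d, d_out) weight matrix. All three matrices are binarized,
    and the intermediate product is re-quantized before the second
    multiplication, so no full-precision matrix is ever multiplied.
    """
    Ab, Xb, Wb = binarize(A), binarize(X), binarize(W)
    H = Ab @ Xb        # first product of the three-matrix chain
    Hq = binarize(H)   # re-quantization of the intermediate result
    return Hq @ Wb     # second product also uses only quantized inputs
```

Without the re-quantization step, H would remain full precision and the second multiplication H @ Wb would still incur full-precision operations, which is exactly the inefficiency the abstract attributes to prior binary graph neural networks.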
