Quantum self-attention neural networks for text classification

Authors: Guangxi LI, Xuanqiang ZHAO, Xin WANG

Affiliations: Institute for Quantum Computing, Baidu Research; Centre for Quantum Software and Information, University of Technology Sydney; QICI Quantum Information and Computation Initiative, Department of Computer Science, The University of Hong Kong; Thrust of Artificial Intelligence, Information Hub, Hong Kong University of Science and Technology (Guangzhou)

Published in: Science China (Information Sciences) (中国科学:信息科学(英文版))

Year/Volume/Issue: 2024, Vol. 67, No. 4

Pages: 301-313

Core Indexes:

Subject Classification: 12 [Management]; 1201 [Management: Management Science and Engineering (degrees awardable in Management or Engineering)]; 07 [Science]; 081104 [Engineering: Pattern Recognition and Intelligent Systems]; 08 [Engineering]; 070201 [Science: Theoretical Physics]; 0835 [Engineering: Software Engineering]; 081201 [Engineering: Computer Architecture]; 0811 [Engineering: Control Science and Engineering]; 0812 [Engineering: Computer Science and Technology (degrees awardable in Engineering or Science)]; 0702 [Science: Physics]

Funding: partially supported by the Guangdong Provincial Quantum Science Strategic Initiative (Grant No. GDZX2303007); the Quantum Science Center of Guangdong-Hong Kong-Macao Greater Bay Area; the Baidu-UTS AI Meets Quantum project; the China Scholarship Council (Grant No. 201806070139); the Australian Research Council (Grant No. DP180100691); a Start-up Fund (Grant No. G0101000151) from The Hong Kong University of Science and Technology (Guangzhou); the Innovation Program for Quantum Science and Technology (Grant No. 2021ZD0302901); and the Education Bureau of Guangzhou Municipality

Keywords: quantum neural networks; self-attention; natural language processing; text classification; parameterized quantum circuits

Abstract: An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a new, simple network architecture, called the quantum self-attention neural network (QSANN), which compensates for these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and then utilize a Gaussian projected quantum self-attention as a sensible quantum version of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, our QSANN outperforms the best existing QNLP model based on syntactic analysis, as well as a simple classical self-attention neural network, in numerical experiments on text classification tasks over public data sets. We further show that our method is robust to low-level quantum noise and resilient to variations in quantum neural network architecture.
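The Gaussian projected self-attention named in the abstract replaces the usual dot-product-plus-softmax attention score with a Gaussian function of query/key outputs. Below is a minimal classical sketch of that coefficient computation, assuming one scalar query and key output per token; in the paper these would be expectation values measured from parameterized quantum circuits, which this sketch stands in for with plain vectors. The function name and toy data are illustrative, not from the paper.

```python
# Minimal sketch of Gaussian projected self-attention (classical stand-in).
# Assumption: q[s] and k[t] are scalar outputs per token; in QSANN they
# would come from measuring parameterized quantum circuits.
import numpy as np

def gaussian_self_attention(q, k, v):
    """Mix value vectors using Gaussian attention coefficients.

    q, k : (n,) arrays of per-token query/key outputs.
    v    : (n, d) array of per-token value vectors.
    Scores use exp(-(q_s - k_t)^2) instead of softmax over dot products.
    """
    diff = q[:, None] - k[None, :]                      # (n, n) pairwise differences
    scores = np.exp(-diff ** 2)                         # Gaussian kernel scores
    alpha = scores / scores.sum(axis=1, keepdims=True)  # row-normalize coefficients
    return alpha @ v                                    # weighted sum of values

# Toy usage: 4 tokens with 3-dimensional value vectors.
rng = np.random.default_rng(0)
q, k = rng.normal(size=4), rng.normal(size=4)
v = rng.normal(size=(4, 3))
print(gaussian_self_attention(q, k, v))
```

Row normalization keeps each set of coefficients a convex combination, so every token's output is a weighted average of the value vectors, mirroring classical self-attention while only requiring scalar measurement outcomes from the quantum side.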
