Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification

Authors: Yaojie Zhang; Bing Xu; Tiejun Zhao

Affiliation: Laboratory of Machine Intelligence and Translation, Department of Computer Science, Harbin Institute of Technology, Harbin 150001, China

Published in: IEEE/CAA Journal of Automatica Sinica

Year/Volume/Issue: 2020, Vol. 7, No. 4

Pages: 1038-1044

Subject classification: 081203 [Engineering - Computer Application Technology]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology (Engineering or Science degree)]

Funding: Supported by the National Key Research and Development Program of China (2018YFC0830700)

Keywords: aspect sentiment classification; deep learning; memory network; sentiment analysis (SA)

Abstract: This paper presents a method for aspect-based sentiment classification, named convolutional multi-head self-attention memory network (CMA-MemNet). It is an improved model based on memory networks that makes it possible to extract richer and more complex semantic information from sequences and aspects. To fix the memory network's inability to capture context-related information at the word level, we propose using convolution to capture n-gram grammatical information. We use multi-head self-attention to compensate for the memory network's neglect of the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) models, we retain the parallelism of the network. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with several popular baseline methods, our model performs excellently.
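The two building blocks the abstract names, a convolution over the word sequence to capture n-gram features and multi-head self-attention over the resulting representations, can be sketched in NumPy. This is a minimal illustrative sketch of those standard operations, not the authors' CMA-MemNet implementation; the function names, shapes, and random weights are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def ngram_conv(X, W):
    """1-D convolution over the sequence: each output position mixes a
    window of k consecutive word vectors (n-gram features), ReLU applied.
    X: (seq_len, d); W: (k, d, d_out); 'same' zero padding."""
    k, d, d_out = W.shape
    pad = k // 2
    Xp = np.pad(X, ((pad, pad), (0, 0)))
    out = np.zeros((X.shape[0], d_out))
    for t in range(X.shape[0]):
        window = Xp[t:t + k]                        # (k, d)
        out[t] = np.einsum('kd,kdo->o', window, W)  # sum over window and dim
    return np.maximum(out, 0.0)

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """Scaled dot-product self-attention with n_heads heads.
    X: (seq_len, d); Wq/Wk/Wv: (d, d); d must be divisible by n_heads.
    All positions attend in parallel -- no recurrence, unlike RNN/LSTM/GRU."""
    L, d = X.shape
    dh = d // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    def split(M):  # (L, d) -> (n_heads, L, dh)
        return M.reshape(L, n_heads, dh).transpose(1, 0, 2)

    Qh, Kh, Vh = split(Q), split(K), split(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(dh)  # (h, L, L)
    out = softmax(scores) @ Vh                         # (h, L, dh)
    return out.transpose(1, 0, 2).reshape(L, d)        # concatenate heads

# Toy run: 6 words, dimension 8, window size 3, 2 heads.
rng = np.random.default_rng(0)
L, d, k, heads = 6, 8, 3, 2
X = rng.standard_normal((L, d))
conv_out = ngram_conv(X, rng.standard_normal((k, d, d)) * 0.1)
attn_out = multi_head_self_attention(
    conv_out,
    rng.standard_normal((d, d)) * 0.1,
    rng.standard_normal((d, d)) * 0.1,
    rng.standard_normal((d, d)) * 0.1,
    heads,
)
print(attn_out.shape)  # (6, 8)
```

The sketch shows why the paper can retain parallelism: both the convolution and the attention are computed for every position at once, with no sequential state passed along the sequence.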
