
DeBERTa-GRU: Sentiment Analysis for Large Language Model

Authors: Adel Assiri, Abdu Gumaei, Faisal Mehmood, Touqeer Abbas, Sami Ullah

Affiliations: Department of Informatics for Business, College of Business, King Khalid University, Abha 61421, Saudi Arabia; Department of Computer Science, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Al-Kharj 11942, Saudi Arabia; School of Electrical and Information Engineering, Zhengzhou University, Zhengzhou 450001, China; Department of Computer Science and Technology, Beijing University of Chemical Technology, Beijing 100029, China; Department of Computer Science, Government College University Faisalabad, Faisalabad, Punjab 38000, Pakistan

Publication: Computers, Materials & Continua

Year/Volume/Issue: 2024, Vol. 79, No. 6

Pages: 4219-4236


Subject classification: 1202 [Management - Business Administration]; 1201 [Management - Management Science and Engineering]; 081203 [Engineering - Computer Applications Technology]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology]; 081202 [Engineering - Computer Software and Theory]

Funding: King Khalid University (KKU)

Keywords: DeBERTa, GRU, Naive Bayes, LSTM, sentiment analysis, large language model

Abstract: Modern technological advancements have made social media an essential component of daily life. Social media allow individuals to share thoughts, emotions, and opinions. Sentiment analysis evaluates whether the sentiment of a text is positive, negative, neutral, or any other personal emotion in order to understand the sentiment context of the text. Sentiment analysis is essential in business and society because it impacts strategic decision-making. Sentiment analysis involves challenges due to lexical variation, an unlabeled dataset, and text distance. The execution time of sequence models increases due to their sequential processing, whereas the calculation times of Transformer models are reduced because of their parallel processing. This study uses a hybrid deep learning strategy that combines the strengths of Transformer and sequence models while avoiding their limitations. In particular, the proposed model integrates Decoding-enhanced Bidirectional Encoder Representations from Transformers (BERT) with disentangled attention (DeBERTa) and the Gated Recurrent Unit (GRU) for sentiment analysis. With the DeBERTa technique, words are mapped into a compact, semantic word embedding space, and the GRU model can correctly capture long-distance contextual semantics. The proposed hybrid model achieves an F1-score of 97% on the Twitter Large Language Model (LLM) dataset, which is much higher than the performance of recent techniques.
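Based on the architecture described in the abstract, the following is a minimal sketch of how a DeBERTa encoder followed by a GRU head could be wired together for sentiment classification, assuming PyTorch and the HuggingFace transformers library. The checkpoint name, hidden size, and number of classes are illustrative assumptions, not the authors' published configuration.

```python
# Sketch of a DeBERTa + GRU hybrid sentiment classifier.
# Checkpoint and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class DeBERTaGRU(nn.Module):
    def __init__(self, checkpoint="microsoft/deberta-base", hidden=256, classes=3):
        super().__init__()
        # DeBERTa maps tokens into a contextual embedding space.
        self.encoder = AutoModel.from_pretrained(checkpoint)
        # A bidirectional GRU reads the token embeddings to capture
        # long-distance contextual semantics across the sequence.
        self.gru = nn.GRU(
            input_size=self.encoder.config.hidden_size,
            hidden_size=hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * hidden, classes)

    def forward(self, input_ids, attention_mask):
        token_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                       # (batch, seq_len, hidden_size)
        _, final_hidden = self.gru(token_states)  # (2, batch, hidden)
        # Concatenate the final forward and backward GRU states.
        pooled = torch.cat([final_hidden[0], final_hidden[1]], dim=-1)
        return self.classifier(pooled)            # (batch, classes) logits

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-base")
model = DeBERTaGRU()
batch = tokenizer(["I love this!", "Terrible service."],
                  padding=True, return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([2, 3])
```

Pooling the final bidirectional GRU states is one common design choice here; the paper's exact pooling and classification head may differ.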
