
Analyzing Arabic Twitter-Based Patient Experience Sentiments Using Multi-Dialect Arabic Bidirectional Encoder Representations from Transformers

Authors: Sarab AlMuhaideb, Yasmeen AlNegheimish, Taif AlOmar, Reem AlSabti, Maha AlKathery, Ghala AlOlyyan

Affiliation: Department of Computer Science, College of Computer and Information Sciences, King Saud University, P.O. Box 266, Riyadh 11362, Saudi Arabia

Published in: Computers, Materials & Continua

Year/Volume/Issue: 2023, Vol. 76, No. 7

Pages: 195-220


Subject classification: 0831 [Engineering - Biomedical Engineering]; 1002 [Medicine - Clinical Medicine]; 1001 [Medicine - Basic Medicine]; 0805 [Engineering - Materials Science and Engineering]; 100214 [Medicine - Oncology]; 0812 [Engineering - Computer Science and Technology]; 10 [Medicine]

Keywords: sentiment analysis; patient experience; healthcare; Twitter; MARBERT; bidirectional long short-term memory; support vector machine; transformer-based learning; deep learning

Abstract: Healthcare organizations rely on patients' feedback and experiences to evaluate their performance and services, thereby allowing such organizations to improve inadequate services and address any shortcomings. According to the literature, social networks, and particularly Twitter, are effective platforms for gathering public opinions. Hence, recent studies have used natural language processing to measure sentiments in text segments collected from Twitter to capture public opinions about various sectors, including healthcare. The present study aimed to analyze Arabic Twitter-based patient experience sentiments and to introduce an Arabic patient experience dataset. The authors collected 12,400 tweets from Arabic patients discussing patient experiences related to healthcare organizations in Saudi Arabia from 1 January 2008 to 29 January …. The tweets were labeled according to sentiment (positive or negative) and sector (public or private), and thereby the Hospital Patient Experiences in Saudi Arabia (HoPE-SA) dataset was produced. A simple statistical analysis was conducted to examine differences in patient views of healthcare sectors. The authors trained five models to distinguish sentiments in tweets automatically with the following schemes: a transformer-based model fine-tuned with a deep learning architecture and a transformer-based model fine-tuned with a simple architecture, using two different transformer-based embeddings based on Bidirectional Encoder Representations from Transformers (BERT), namely Multi-dialect Arabic BERT (MARBERT) and multilingual BERT (mBERT), as well as a pretrained word2vec model with a support vector machine classifier. This is the first study to investigate the use of a bidirectional long short-term memory layer followed by a feedforward neural network for the fine-tuning of BERT. The deep-learning fine-tuned MARBERT-based model, the authors' best-performing model, achieved accuracy, micro-F1, and macro-F1 scores of 98.71%, 98.73%, and 98.63%, respectively.
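The abstract reports both micro-F1 and macro-F1 for the binary sentiment task. As a minimal illustration of how these two averages differ (micro-F1 pools true/false positive counts across classes, while macro-F1 gives each class equal weight), here is a small stdlib-only Python sketch; the label names and toy data are invented for the example and are not from the paper:

```python
from collections import Counter

def f1_scores(y_true, y_pred, labels):
    """Compute per-class, micro-averaged, and macro-averaged F1.

    Micro-F1 pools true positives, false positives, and false
    negatives over all classes before computing F1; macro-F1
    averages the per-class F1 values with equal weight.
    """
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted p, but truth was t
            fn[t] += 1  # missed an instance of t
    per_class = {}
    for c in labels:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        per_class[c] = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    # Micro: pool the counts over all classes first.
    TP, FP, FN = sum(tp.values()), sum(fp.values()), sum(fn.values())
    micro_p = TP / (TP + FP) if TP + FP else 0.0
    micro_r = TP / (TP + FN) if TP + FN else 0.0
    micro = (2 * micro_p * micro_r / (micro_p + micro_r)
             if micro_p + micro_r else 0.0)
    macro = sum(per_class.values()) / len(labels)
    return per_class, micro, macro

# Toy run with hypothetical binary sentiment labels.
y_true = ["pos", "pos", "pos", "neg", "neg", "neg", "neg", "neg"]
y_pred = ["pos", "pos", "neg", "neg", "neg", "neg", "neg", "pos"]
per_class, micro, macro = f1_scores(y_true, y_pred, ["pos", "neg"])
# micro = 0.75, macro ≈ 0.7333: with imbalanced classes the two
# averages diverge, which is why the paper reports both.
```

When the minority class is predicted poorly, macro-F1 drops faster than micro-F1, so reporting both (as the abstract does) guards against a model that only performs well on the majority class.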
