Analyzing Arabic Twitter-Based Patient Experience Sentiments Using Multi-Dialect Arabic Bidirectional Encoder Representations from Transformers
Author Affiliation: Department of Computer Science, College of Computer and Information Sciences, King Saud University, P.O. Box 266, Riyadh 11362, Saudi Arabia
Publication: Computers, Materials & Continua
Year/Volume/Issue: 2023, Vol. 76, No. 7
Pages: 195-220
Subject Classification: 1002 [Medicine - Clinical Medicine]; 100214 [Medicine - Oncology]; 10 [Medicine]
Keywords: sentiment analysis; patient experience; healthcare; Twitter; MARBERT; bidirectional long short-term memory; support vector machine; transformer-based learning; deep learning
Abstract: Healthcare organizations rely on patients' feedback and experiences to evaluate their performance and services, thereby allowing such organizations to improve inadequate services and address any shortcomings. According to the literature, social networks, and particularly Twitter, are effective platforms for gathering public opinions. Moreover, recent studies have used natural language processing to measure sentiments in text segments collected from Twitter to capture public opinions about various sectors, including healthcare. The present study aimed to analyze Arabic Twitter-based patient experience sentiments and to introduce an Arabic patient experience corpus. The authors collected 12,400 tweets from Arabic patients discussing patient experiences related to healthcare organizations in Saudi Arabia from 1 January 2008 to 29 January 2022. The tweets were labeled according to sentiment (positive or negative) and sector (public or private), thereby producing the Hospital Patient Experiences in Saudi Arabia (HoPE-SA) dataset. A simple statistical analysis was conducted to examine differences in patient views of healthcare sectors. The authors trained five models to distinguish sentiments in tweets automatically, with the following schemes: a transformer-based model fine-tuned with a deep-learning architecture and a transformer-based model fine-tuned with a simple architecture, each using two different embeddings based on Bidirectional Encoder Representations from Transformers (BERT), namely Multi-dialect Arabic BERT (MARBERT) and multilingual BERT (mBERT), as well as a pretrained word2vec model with a support vector machine classifier. This is the first study to investigate the use of a bidirectional long short-term memory layer followed by a feedforward neural network for the fine-tuning of MARBERT. The deep-learning fine-tuned MARBERT-based model (the authors' best-performing model) achieved accuracy, micro-F1, and macro-F1 scores of 98.71%, 98.73%, and 98.63%, respectively.
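As a concrete illustration of the best-performing scheme, the following is a minimal PyTorch sketch, not the authors' released code: a MARBERT encoder whose token embeddings feed a bidirectional LSTM, with a feedforward head classifying the concatenated final hidden states. The checkpoint id UBC-NLP/MARBERT is the publicly released MARBERT model; the hidden sizes, dropout rate, and example tweet are illustrative assumptions rather than values reported in the paper.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MarbertBiLstmClassifier(nn.Module):
    def __init__(self, model_name="UBC-NLP/MARBERT", lstm_hidden=128, num_classes=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.bilstm = nn.LSTM(
            input_size=self.encoder.config.hidden_size,  # 768 for MARBERT
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        # Feedforward head over the concatenated forward/backward final states.
        self.ffn = nn.Sequential(
            nn.Linear(2 * lstm_hidden, 64),
            nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(64, num_classes),
        )

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings: (batch, seq_len, hidden_size).
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        _, (h_n, _) = self.bilstm(hidden_states)
        # h_n is (2, batch, lstm_hidden); concatenate both directions.
        sentence_repr = torch.cat((h_n[0], h_n[1]), dim=-1)
        return self.ffn(sentence_repr)  # logits: (batch, num_classes)

tokenizer = AutoTokenizer.from_pretrained("UBC-NLP/MARBERT")
model = MarbertBiLstmClassifier()
batch = tokenizer(["الخدمة ممتازة والطاقم متعاون"],  # "excellent service, cooperative staff"
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])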
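Similarly, a minimal sketch of the word2vec-with-SVM baseline, assuming a pretrained Arabic word2vec file: the path arabic_word2vec.bin, the whitespace tokenization, and the toy tweets are placeholders, since the paper's exact embeddings and preprocessing are not reproduced here. Each tweet is represented by the mean of its word vectors and classified with scikit-learn's SVC.

import numpy as np
from gensim.models import KeyedVectors
from sklearn.svm import SVC

# Placeholder path: substitute the pretrained Arabic word2vec vectors of choice.
wv = KeyedVectors.load_word2vec_format("arabic_word2vec.bin", binary=True)

def tweet_vector(tokens):
    # Average the vectors of in-vocabulary tokens; zero vector if none match.
    vecs = [wv[t] for t in tokens if t in wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(wv.vector_size)

# Toy examples standing in for the labeled HoPE-SA tweets (1 = positive, 0 = negative).
train_tweets = ["الخدمة ممتازة والطاقم متعاون", "الانتظار طويل والخدمة سيئة"]
train_labels = [1, 0]

X_train = np.stack([tweet_vector(t.split()) for t in train_tweets])
clf = SVC(kernel="rbf")
clf.fit(X_train, train_labels)
print(clf.predict(X_train))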