The rapid growth of digital commerce has led to the accumulation of massive volumes of consumer reviews on online platforms. Shopee, one of the largest e-commerce platforms in Southeast Asia, receives millions of product reviews every day, containing valuable information about customer satisfaction and preferences. Manual analysis of these reviews is inefficient, so computational approaches such as sentiment analysis are required. This study examines the use of DistilBERT, a lightweight transformer-based deep learning model, for sentiment classification of Shopee product reviews. The dataset consists of approximately one million English-language reviews, which were preprocessed and used to fine-tune the distilbert-base-uncased model. Evaluation was conducted using accuracy, precision, recall, and F1-score, and the results were compared against benchmark models, namely BERT and SVM. DistilBERT achieved an accuracy of 94.8%, slightly below BERT (95.3%) but significantly higher than SVM (90.2%), while reducing computation time by more than 55%. These findings demonstrate that DistilBERT provides an optimal balance between accuracy and efficiency, making it suitable for large-scale sentiment analysis on e-commerce platforms.

Keywords: Sentiment Analysis, DistilBERT, Shopee Reviews, Natural Language Processing, Deep Learning, Transformer Models.
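The pipeline summarized above, fine-tuning distilbert-base-uncased on preprocessed reviews and reporting accuracy, precision, recall, and F1-score, can be sketched roughly as follows using the Hugging Face transformers and datasets libraries. The file name shopee_reviews.csv, the column names review and label, and all hyperparameters are illustrative assumptions, not the study's actual configuration.

```python
# Minimal sketch of the fine-tuning and evaluation pipeline described above.
# Assumptions (not from the study): a CSV file "shopee_reviews.csv" with a
# "review" text column and a binary "label" column, and default hyperparameters.
from datasets import load_dataset
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "distilbert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Load the (hypothetical) review dataset and hold out 10% for evaluation.
raw = load_dataset("csv", data_files="shopee_reviews.csv")["train"]
raw = raw.train_test_split(test_size=0.1, seed=42)

def tokenize(batch):
    # Truncate/pad reviews to a fixed length before fine-tuning.
    return tokenizer(batch["review"], truncation=True,
                     padding="max_length", max_length=128)

encoded = raw.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Accuracy, precision, recall, and F1-score -- the metrics reported in the study.
    logits, labels = eval_pred
    preds = logits.argmax(axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted")
    return {"accuracy": accuracy_score(labels, preds),
            "precision": precision, "recall": recall, "f1": f1}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="distilbert-shopee",
                           num_train_epochs=2,
                           per_device_train_batch_size=32),
    train_dataset=encoded["train"],
    eval_dataset=encoded["test"],
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())
```

A comparable baseline (e.g., an SVM over TF-IDF features, or full BERT via bert-base-uncased in place of MODEL_NAME) would be evaluated with the same held-out split and metrics to reproduce the kind of comparison reported in the abstract.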