Affective computing has garnered researchers' attention in recent years, driven by the need for AI systems to better understand and respond to human emotions. However, analyzing human affective states, such as mood or stress, is quite complex. While various stress studies use facial expressions and wearables, most existing datasets rely on data from a single modality. This paper presents EmpathicSchool, a novel dataset that captures facial expressions together with associated physiological signals, such as heart rate, electrodermal activity, and skin temperature, under different stress levels. The data was collected from 20 participants over multiple sessions, totaling 26 hours. The dataset includes nine different signal types, spanning both computer vision and physiological features, that can be used to detect stress. In addition, various experiments were conducted to validate the signal quality.