Facial Action Units (AUs) represent a set of facial muscle activities, and various combinations of AUs can express a wide range of emotions. AU recognition is used in many applications, including marketing, healthcare, and education. Although many studies have developed methods to improve recognition accuracy, AU recognition remains a major challenge. In the Affective Behavior Analysis in-the-wild (ABAW) 2020 competition, we proposed an automatic AU recognition method that uses a pairwise deep architecture to derive a pseudo-intensity for each AU and then converts the pseudo-intensities into predicted intensities. This year, we added a new technique to last year's framework to further reduce AU recognition errors caused by temporary face occlusion, such as a hand covering the face or a large head pose. We obtained a score of 0.65 on the validation set of this year's competition.
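As a rough illustration of the pairwise pseudo-intensity idea summarized above, the following is a minimal sketch (not the authors' implementation): a shared encoder scores two frames, a margin ranking loss orders their pseudo-intensities per AU, and a small head maps pseudo-intensities to predicted AU intensities. All module names, dimensions, and the 0-5 intensity range are illustrative assumptions.

import torch
import torch.nn as nn

NUM_AUS = 12  # assumed number of target AUs

class PseudoIntensityEncoder(nn.Module):
    """Maps a face feature vector to one pseudo-intensity per AU."""
    def __init__(self, feat_dim=512, num_aus=NUM_AUS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 256), nn.ReLU(),
            nn.Linear(256, num_aus),
        )

    def forward(self, x):
        return self.net(x)

class IntensityHead(nn.Module):
    """Converts pseudo-intensities into predicted AU intensities (assumed 0-5 scale)."""
    def __init__(self, num_aus=NUM_AUS):
        super().__init__()
        self.scale = nn.Linear(num_aus, num_aus)

    def forward(self, pseudo):
        return 5.0 * torch.sigmoid(self.scale(pseudo))

encoder = PseudoIntensityEncoder()
head = IntensityHead()
rank_loss = nn.MarginRankingLoss(margin=0.1)

# Pairwise training step: frame_a is assumed to show stronger AU activation than frame_b.
feat_a, feat_b = torch.randn(8, 512), torch.randn(8, 512)  # placeholder face features
pseudo_a, pseudo_b = encoder(feat_a), encoder(feat_b)
target = torch.ones_like(pseudo_a)        # "a should rank above b" for every AU
loss = rank_loss(pseudo_a, pseudo_b, target)
pred_intensity = head(pseudo_a)           # pseudo-intensities converted to predicted intensities

In this sketch the ranking loss only constrains the ordering of pseudo-intensities between paired frames, while the separate head is what turns those relative scores into absolute predictions; how the pairs are selected and how occlusion is handled are left out here.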