This paper describes an approach to facial action unit (AU) detection. In this work, we present our submission to the Affective Behavior Analysis in-the-wild (ABAW) 2021 competition. The proposed method uses the pre-trained JAA model as the feature extractor, and extracts global features, face alignment features, and AU local features on the basis of multi-scale features. We take the AU local features as the input to a graph convolution to further model the correlations between AUs, and finally use the fused features to classify AUs. Detection performance was evaluated as 0.5 * accuracy + 0.5 * F1. Our model achieves 0.674 on the challenging Aff-Wild2 database.
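The challenge metric above (0.5 * accuracy + 0.5 * F1) can be sketched for a single AU's binary labels as follows. This is a minimal illustration, assuming a per-AU binary formulation; the function name is hypothetical and this is not the official evaluation script.

```python
def au_challenge_score(y_true, y_pred):
    """Return 0.5 * accuracy + 0.5 * binary F1 for one action unit.

    Illustrative sketch of the 0.5*accuracy + 0.5*F1 metric; not the
    official ABAW evaluation code.
    """
    assert len(y_true) == len(y_pred)
    # Counts for the positive (AU active) class.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return 0.5 * accuracy + 0.5 * f1
```

In the multi-label AU setting, such a per-AU score would typically be averaged over all annotated action units.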