Facial affect is one of the most important affective signals, and facial affect analysis (FAA) is therefore essential for developing human-computer interaction systems. Early methods focus on extracting appearance and geometry features associated with human affect while ignoring the latent semantic information among individual facial changes, leading to limited performance and generalization ability. Recent work attempts to establish graph-based representations that model these semantic relationships and to develop frameworks that leverage them for various FAA tasks. This paper provides a comprehensive review of graph-based FAA, covering the evolution of its algorithms and their applications. We first introduce background knowledge on FAA, with particular attention to the role of graphs. We then discuss the approaches widely used for graph-based affective representation in the literature and identify a trend in graph construction. For relational reasoning in graph-based FAA, we categorize existing studies according to whether they use non-deep or deep learning methods, with an emphasis on the latest graph neural networks. Performance comparisons of state-of-the-art graph-based FAA methods are also summarized. Finally, we discuss the remaining challenges and potential directions. To the best of our knowledge, this is the first survey of graph-based FAA methods, and our findings can serve as a reference for future research in this field.