Composite-database micro-expression recognition is attracting increasing attention because it is more practical for real-world applications. Although a composite database provides greater sample diversity for learning good representation models, the important subtle dynamics tend to disappear under domain shift, so that model performance degrades substantially, especially for deep models. In this paper, we analyze the influence of learning complexity, covering both input complexity and model complexity, and find that lower-resolution input data and a shallower-architecture model help ease the degradation of deep models in the composite-database task. Based on this, we propose a recurrent convolutional network (RCN) that exploits a shallower architecture and lower-resolution input data, shrinking model and input complexity simultaneously. Furthermore, we develop three parameter-free modules (i.e., wide expansion, shortcut connection, and attention unit) that integrate with the RCN without adding any learnable parameters. These three modules enhance representational ability from various perspectives while preserving a not-very-deep architecture suited to lower-resolution data. Moreover, the three modules can be combined through an automatic strategy (a neural architecture search strategy), and the searched architecture is more robust. Extensive experiments on the MEGC2019 dataset (a composite of the existing SMIC, CASME II, and SAMM datasets) verify the influence of learning complexity and show that RCNs with the three modules and the searched combination outperform state-of-the-art approaches.
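To make the "parameter-free" idea concrete, the following is a minimal NumPy sketch of two of the module types named in the abstract: a parameter-free attention unit and a shortcut connection. The exact designs live in the paper body, not the abstract, so the specific attention formulation here (softmax over channel-averaged spatial activations) and the function names are illustrative assumptions only; they demonstrate how such modules can reweight and combine features without introducing any learnable parameters.

```python
import numpy as np

def parameter_free_attention(x):
    """Hypothetical parameter-free spatial attention (illustrative, not the
    paper's exact design): reweight a feature tensor x of shape (C, H, W)
    by a softmax over its channel-wise mean activations. No learnable weights."""
    m = x.mean(axis=0)                  # (H, W) spatial saliency map
    w = np.exp(m - m.max())
    w = w / w.sum()                     # softmax over spatial positions
    return x * w[None, :, :]            # broadcast attention weights over channels

def block_with_shortcut(x, conv):
    """Shortcut connection around an existing transform: the residual add
    contributes no parameters beyond those already in `conv`."""
    return x + parameter_free_attention(conv(x))
```

Because both operations only rearrange or reweight existing activations, stacking them on an RCN leaves the parameter count unchanged, which is the property the abstract emphasizes for keeping the model shallow on lower-resolution inputs.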