Deep neural networks (DNNs) have shown superior performance on various multimodal learning problems. However, adapting DNNs to individual multimodal tasks often requires substantial effort in manually engineering unimodal features and designing multimodal feature fusion strategies. This paper proposes the Bilevel Multimodal Neural Architecture Search (BM-NAS) framework, which makes the architecture of multimodal fusion models fully searchable via a bilevel searching scheme. At the upper level, BM-NAS selects inter/intra-modal feature pairs from pretrained unimodal backbones. At the lower level, BM-NAS learns the fusion strategy for each feature pair, expressed as a combination of predefined primitive operations. The primitive operations are elaborately designed so that they can be flexibly combined to accommodate various effective feature fusion modules, such as multi-head attention (Transformer) and Attention on Attention (AoA). Experimental results on three multimodal tasks demonstrate the effectiveness and efficiency of the proposed BM-NAS framework. BM-NAS achieves competitive performance with much less search time and fewer model parameters than existing generalized multimodal NAS methods.