In this paper, we propose a new neural architecture search (NAS) problem for Symmetric Positive Definite (SPD) manifold networks. Unlike the conventional NAS problem, our problem requires searching for a unique computational cell called the SPD cell. This SPD cell serves as a basic building block of SPD neural architectures. An efficient solution to our problem is important for minimizing the extraneous manual effort in SPD neural architecture design. To accomplish this goal, we first introduce a geometrically rich and diverse SPD neural architecture search space for efficient SPD cell design. Further, we model our new NAS problem using the supernet strategy, which casts the architecture search problem as a one-shot training process of a single supernet. Based on the supernet modeling, we exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search. Statistical evaluation of our method on drone, action, and emotion recognition tasks mostly yields better results than state-of-the-art SPD networks and NAS algorithms. Empirical results show that our algorithm excels at discovering better SPD network designs and provides models that are more than 3 times lighter than those found by state-of-the-art NAS algorithms.