An important goal of AutoML is to automate away the design of neural networks for new tasks in under-explored domains. Motivated by this goal, we study the problem of enabling users to discover the right neural operations given data from their specific domain. We introduce a search space of operations called XD-operations that mimics the inductive bias of standard multi-channel convolutions while being much more expressive: we prove that it includes many named operations across multiple application areas. Starting from any standard backbone such as ResNet, we show how to transform it into a search space over XD-operations and how to traverse the space using a simple weight-sharing scheme. On a diverse set of tasks -- solving PDEs, distance prediction for protein folding, and music modeling -- our approach consistently yields models with lower error than baseline networks, and often even lower error than expert-designed domain-specific approaches.
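To make the weight-sharing idea concrete, here is a minimal sketch of how a discrete set of candidate operations can be traversed with shared weights via a softmax-weighted mixture (a generic DARTS-style continuous relaxation). The candidate ops (`identity`, `shift`, `avg3`), the mixture function `mixed_op`, and the architecture parameters `alpha` are illustrative assumptions, not the paper's actual XD-operation parameterization, which is a richer continuous space.

```python
import numpy as np

def softmax(a):
    # Numerically stable softmax over architecture parameters.
    e = np.exp(a - a.max())
    return e / e.sum()

# Hypothetical candidate operations on a 1-D signal; stand-ins for
# points in a search space of operations.
def identity(x):
    return x

def shift(x):
    # Circular shift by one position.
    return np.roll(x, 1)

def avg3(x):
    # 3-tap moving average with circular padding.
    return (np.roll(x, -1) + x + np.roll(x, 1)) / 3.0

OPS = [identity, shift, avg3]

def mixed_op(x, alpha):
    """Weight-shared 'mixed' operation: all candidates are applied to
    the same input and combined with softmax mixture weights, so one
    set of shared weights covers the whole candidate set."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, OPS))

x = np.array([1.0, 2.0, 3.0, 4.0])
alpha = np.array([10.0, 0.0, 0.0])  # heavily favors the identity op
y = mixed_op(x, alpha)
```

In an actual search, `alpha` would be trained by gradient descent alongside the network weights; here a large first entry simply makes the mixture behave almost exactly like the identity operation.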