The rise of deep neural networks has been an important driver in optimizing recommender systems. However, the success of recommender systems hinges on delicate architecture design, which calls for Neural Architecture Search (NAS) to further improve their modeling. We propose NASRec, a paradigm that trains a single supernet and efficiently produces abundant models/sub-architectures via weight sharing. To overcome the challenges of data multi-modality and architecture heterogeneity in the recommendation domain, NASRec establishes a large supernet (i.e., search space) to search for full architectures; the supernet incorporates versatile operator choices and dense connectivity, minimizing human priors in favor of flexibility. The scale and heterogeneity of NASRec impose challenges in search, including training inefficiency, operator imbalance, and degraded rank correlation. We tackle these challenges by proposing single-operator any-connection sampling, operator-balancing interaction modules, and post-training fine-tuning. Our results on three Click-Through Rate (CTR) prediction benchmarks show that NASRec outperforms both manually designed models and existing NAS methods, achieving state-of-the-art performance.
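To make the weight-sharing idea concrete, here is a minimal, hypothetical sketch of supernet training with single-operator any-connection sampling: each step samples exactly one operator per choice block and an arbitrary connection pattern to earlier blocks, so only the sampled sub-architecture's weights receive gradients. All names (`ChoiceBlock`, `Supernet`, `sample_subnet`) and the simplified operator set are illustrative assumptions, not the paper's actual API.

```python
# Hypothetical sketch of weight-sharing supernet training with
# single-operator any-connection sampling; names are illustrative.
import random
import torch
import torch.nn as nn

class ChoiceBlock(nn.Module):
    """One supernet block holding candidate operators; each forward
    pass activates exactly one sampled operator (weight sharing)."""
    def __init__(self, dim):
        super().__init__()
        # Versatile operator choices (simplified stand-ins).
        self.ops = nn.ModuleList([
            nn.Linear(dim, dim),                            # dense
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),  # MLP
            nn.Identity(),                                  # skip
        ])

    def forward(self, x, op_idx):
        return self.ops[op_idx](x)

class Supernet(nn.Module):
    """Blocks with dense connectivity: block i may consume the sum of
    any non-empty subset of earlier outputs (including the input)."""
    def __init__(self, dim, num_blocks):
        super().__init__()
        self.blocks = nn.ModuleList(ChoiceBlock(dim) for _ in range(num_blocks))
        self.head = nn.Linear(dim, 1)  # CTR logit

    def sample_subnet(self):
        # Single operator per block, any connection pattern to predecessors.
        ops = [random.randrange(len(b.ops)) for b in self.blocks]
        conns = [random.sample(range(i + 1), k=random.randint(1, i + 1))
                 for i in range(len(self.blocks))]
        return ops, conns

    def forward(self, x, ops, conns):
        feats = [x]
        for block, op_idx, inputs in zip(self.blocks, ops, conns):
            h = sum(feats[j] for j in inputs)
            feats.append(block(h, op_idx))
        return self.head(feats[-1])

# One training step: only the sampled path receives gradients, so the
# shared weights are trained across many sub-architectures over time.
net = Supernet(dim=16, num_blocks=4)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x, y = torch.randn(32, 16), torch.randint(0, 2, (32, 1)).float()
ops, conns = net.sample_subnet()
loss = nn.functional.binary_cross_entropy_with_logits(net(x, ops, conns), y)
opt.zero_grad(); loss.backward(); opt.step()
```

After such training, candidate sub-architectures can be ranked by evaluating them with the shared weights, which is where the rank-correlation concern (and the proposed post-training fine-tuning) comes in.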