We present the usefulness and value of multi-step machine learning (ML), in which a task is organized into connected sub-tasks with known intermediate inference goals, as opposed to a single large model learned end-to-end without intermediate sub-tasks. Pre-optimized ML models are connected, and better performance is obtained by re-optimizing the connected system. For each sub-task, an ML model is selected from several small candidate models using an idea based on Neural Architecture Search (NAS). In this paper, Differentiable Architecture Search (DARTS) and Single Path One-Shot NAS (SPOS-NAS) are tested, with the construction of the loss functions improved so that all ML models keep learning smoothly. Using DARTS and SPOS-NAS for the optimization, selection, and connection of multi-step machine learning systems, we find that (1) such a system can quickly and successfully select highly performant model combinations, and (2) the selected models are consistent with baseline algorithms, such as grid search, and their outputs are well controlled.
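To make the DARTS-style selection concrete, the following is a minimal PyTorch sketch of the general idea, not the paper's actual code: each sub-task holds a pool of small candidate models whose outputs are softmax-weighted by learnable architecture parameters, and an intermediate-target loss term keeps every step learning toward its known intermediate goal (a hedged reading of the abstract's "improved loss construction"). All names here (`MixedSubTask`, `step1`, `step2`) and the toy dimensions are hypothetical.

```python
import torch
import torch.nn as nn

class MixedSubTask(nn.Module):
    """DARTS-style relaxation: a differentiable mixture over candidate models
    for one sub-task (hypothetical illustration, not the paper's code)."""

    def __init__(self, candidates):
        super().__init__()
        self.candidates = nn.ModuleList(candidates)
        # One architecture parameter (alpha) per candidate model.
        self.alpha = nn.Parameter(torch.zeros(len(candidates)))

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        # Weighted sum of all candidate outputs (the DARTS relaxation).
        return sum(w * m(x) for w, m in zip(weights, self.candidates))

# Two connected sub-tasks, each with its own pool of candidate models.
step1 = MixedSubTask([
    nn.Linear(8, 4),
    nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4)),
])
step2 = MixedSubTask([
    nn.Linear(4, 1),
    nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1)),
])

# Toy batch: input, known intermediate target, and final target.
x, y_mid, y = torch.randn(32, 8), torch.randn(32, 4), torch.randn(32, 1)
h = step1(x)
out = step2(h)
# Joint loss with an intermediate-target term, so each sub-task is also
# trained against its known intermediate inference goal.
loss = nn.functional.mse_loss(out, y) + nn.functional.mse_loss(h, y_mid)
loss.backward()

# After the search converges, keep the argmax-alpha candidate per sub-task.
best1 = step1.candidates[int(step1.alpha.argmax())]
best2 = step2.candidates[int(step2.alpha.argmax())]
```

In a full DARTS setup, the architecture parameters (`alpha`) and the model weights would be updated on separate data splits in alternating steps, and SPOS-NAS would instead sample a single candidate path per batch; the sketch above collapses both into one joint update purely for brevity.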