While neural architecture search methods have been successful in recent years and have led to new state-of-the-art performance on various problems, they have also been criticized for being unstable, for being highly sensitive to their hyperparameters, and for often not performing better than random search. To shed some light on these issues, we discuss some practical considerations that help improve stability, efficiency, and overall performance.
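The random-search comparison mentioned above is frequently used as a sanity-check baseline. As a point of reference, a minimal sketch of such a baseline is shown below; the operation names, search-space size, and the dummy evaluation function are illustrative assumptions and not part of the original text — in practice the evaluation step would train each sampled network and report validation accuracy.

```python
import random

# Hypothetical cell-based search space: each edge picks one operation.
# These names are placeholders for whatever operation set the search uses.
OPERATIONS = ["conv3x3", "conv5x5", "sep_conv3x3", "max_pool3x3", "skip_connect"]
NUM_EDGES = 8


def sample_architecture():
    """Draw one architecture uniformly at random from the search space."""
    return [random.choice(OPERATIONS) for _ in range(NUM_EDGES)]


def evaluate(architecture):
    """Stand-in for training the sampled network and measuring validation
    accuracy; here it just returns a random score so the sketch runs."""
    return random.random()


def random_search(budget=100):
    """Evaluate `budget` random architectures and keep the best one."""
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture()
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score


if __name__ == "__main__":
    arch, score = random_search(budget=20)
    print("best architecture:", arch, "score:", score)
```

Any NAS method is expected to clearly outperform this baseline under the same evaluation budget; when it does not, the practical considerations discussed in the remainder of this section are a natural place to look.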