Previous work improves the accuracy and learning efficiency of traditional active learning (AL) by incrementally searching the neural network architecture as data complexity changes (Active-iNAS). However, after each AL cycle, Active-iNAS trains several models independently and selects the one with the best generalization performance to query the subsequent samples. These independent training runs incur a prohibitive computational budget, which is highly inefficient and limits both search flexibility and final performance. To address this issue, we propose a novel active strategy based on structured variational inference (SVI), also referred to as structured neural depth search (SNDS), which enables neural network depth search by gradient descent during the AL process. We further demonstrate theoretically that existing VI-based methods relying on the mean-field assumption can lead to poor performance. We evaluate our strategy with three querying techniques on three datasets and show that it outperforms current methods.
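To make the idea of depth search by gradient descent concrete, here is a minimal PyTorch sketch, not the authors' implementation: depth is treated as a categorical latent variable with a variational distribution q(d) parameterized by trainable logits, and the expected loss under q(d) is minimized by ordinary gradient descent jointly with the network weights (the KL term of a full VI objective is omitted for brevity). The class and parameter names (`DepthSearchNet`, `depth_logits`, `max_depth`) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthSearchNet(nn.Module):
    """Shared-weight network exposing a prediction at every candidate depth.

    Hypothetical sketch: depth d is a latent variable with variational
    distribution q(d) = softmax(depth_logits), learned by gradient descent.
    """

    def __init__(self, in_dim, hidden_dim, out_dim, max_depth=4):
        super().__init__()
        self.stem = nn.Linear(in_dim, hidden_dim)
        self.blocks = nn.ModuleList(
            nn.Linear(hidden_dim, hidden_dim) for _ in range(max_depth)
        )
        # One classification head per candidate depth.
        self.heads = nn.ModuleList(
            nn.Linear(hidden_dim, out_dim) for _ in range(max_depth)
        )
        # Trainable logits of the variational depth distribution q(d).
        self.depth_logits = nn.Parameter(torch.zeros(max_depth))

    def forward(self, x):
        h = F.relu(self.stem(x))
        logits_per_depth = []
        for block, head in zip(self.blocks, self.heads):
            h = F.relu(block(h))
            logits_per_depth.append(head(h))
        return logits_per_depth, F.softmax(self.depth_logits, dim=0)

# Toy training step: minimize the expected loss E_{q(d)}[CE(f_d(x), y)],
# so depth distribution and weights are updated by a single optimizer.
model = DepthSearchNet(in_dim=20, hidden_dim=64, out_dim=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 20), torch.randint(0, 3, (32,))
logits_per_depth, q = model(x)
loss = sum(q[d] * F.cross_entropy(lg, y) for d, lg in enumerate(logits_per_depth))
opt.zero_grad()
loss.backward()
opt.step()
print("updated q(d):", F.softmax(model.depth_logits, dim=0).detach())
```

Note that in this toy version q(d) factorizes independently of the weights, which is exactly the kind of mean-field simplification the abstract argues can hurt performance; a structured posterior would instead couple the depth variable with the per-depth weights.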