Best subset selection is considered the `gold standard' for many sparse learning problems, and a variety of optimization techniques have been proposed to attack this nonconvex, NP-hard problem. In this paper, we investigate the dual forms of a family of $\ell_0$-regularized problems and develop an efficient primal-dual method based on the structures of the primal and dual problems. By leveraging dual range estimation together with an incremental strategy, our algorithm reduces redundant computation and improves the quality of best subset selection solutions. Theoretical analysis and experiments on synthetic and real-world datasets validate the efficiency and statistical properties of the proposed solutions.
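To make the $\ell_0$-regularized family concrete, the sketch below solves a small instance $\min_\beta \tfrac{1}{2}\lVert y - X\beta\rVert_2^2 + \lambda\lVert\beta\rVert_0$ with plain iterative hard thresholding. This is a standard baseline, not the primal-dual method proposed in the paper; the function names and parameter choices are illustrative assumptions.

```python
import numpy as np

def hard_threshold(beta, lam, eta):
    # Proximal operator of eta * lam * ||.||_0:
    # zero out entries with beta_i^2 <= 2 * eta * lam.
    out = beta.copy()
    out[beta ** 2 <= 2.0 * eta * lam] = 0.0
    return out

def iht(X, y, lam, eta=None, n_iter=500):
    """Iterative hard thresholding for 0.5*||y - X b||^2 + lam*||b||_0.

    A baseline sketch (not the paper's algorithm); `eta` defaults to
    1/L with L the Lipschitz constant of the smooth part's gradient.
    """
    n, p = X.shape
    if eta is None:
        eta = 1.0 / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)          # gradient of the smooth part
        beta = hard_threshold(beta - eta * grad, lam, eta)
    return beta
```

On a well-conditioned synthetic design with a 3-sparse ground truth, this baseline typically recovers the true support, which is the kind of behavior the proposed primal-dual solver aims to achieve more efficiently.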