In this paper, we consider a class of sparse regression problems whose objective function is the sum of a convex loss function and a cardinality penalty. By constructing a smoothing function for the cardinality function, we propose a projected neural network and design a correction method for solving this problem. The solution of the proposed neural network exists globally and is unique, bounded, and globally Lipschitz continuous. Moreover, we prove that all accumulation points of the proposed neural network share a common support set and a uniform lower bound on their nonzero entries. By combining the proposed neural network with the correction method, any corrected accumulation point is a local minimizer of the considered sparse regression problem. We further analyze the equivalence between the local minimizers of the considered sparse regression problem and those of a related sparse regression problem. Finally, numerical experiments are provided to illustrate the efficiency of the proposed neural network in solving some practical sparse regression problems.
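For concreteness, the class of problems described above can be written in the following generic form. This is a minimal sketch under the standard formulation only: the feasible set $\Omega$ and the penalty parameter $\lambda$ are assumptions introduced here for illustration, and the precise notation is fixed in the body of the paper.

% Illustrative sparse regression model (notation assumed, not taken from the paper):
%   f(x)      : a convex loss function
%   \|x\|_0   : the cardinality penalty, i.e. the number of nonzero entries of x
%   \lambda>0 : penalty parameter (assumed)
%   \Omega    : a closed convex feasible set onto which the network projects (assumed)
\begin{equation*}
  \min_{x \in \Omega} \; f(x) + \lambda \|x\|_0,
  \qquad \|x\|_0 := \#\{\, i : x_i \neq 0 \,\}.
\end{equation*}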