We propose a new fast algorithm to estimate any sparse generalized linear model with convex or non-convex separable penalties. By relying on coordinate descent, working sets and Anderson acceleration, our algorithm solves problems with millions of samples and features in seconds. It handles models left unaddressed by previous solvers, and extensive experiments show that it outperforms state-of-the-art algorithms. We provide a flexible, scikit-learn compatible package that easily handles customized datafits and penalties.
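To illustrate the coordinate descent building block mentioned above, here is a minimal sketch of cyclic proximal coordinate descent on the classical Lasso (quadratic datafit, L1 penalty). It is only an illustration of the basic ingredient, not the proposed solver: it omits working sets and Anderson acceleration, and the function names are ours, not the package's API.

```python
import numpy as np


def soft_threshold(x, tau):
    """Proximal operator of tau * |.| (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)


def lasso_cd(X, y, alpha, n_iter=100):
    """Plain cyclic coordinate descent for
    min_w 1/(2 n) ||y - X w||^2 + alpha ||w||_1.
    Illustrative only: no working sets, no Anderson acceleration.
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    r = y.copy()                        # residual y - X w
    col_norms = (X ** 2).sum(axis=0)    # ||X_j||^2, coordinate-wise curvature
    for _ in range(n_iter):
        for j in range(n_features):
            if col_norms[j] == 0.0:
                continue
            old_wj = w[j]
            # gradient step on coordinate j, then soft-thresholding
            w[j] = soft_threshold(
                old_wj + X[:, j] @ r / col_norms[j],
                alpha * n_samples / col_norms[j],
            )
            if w[j] != old_wj:
                r -= (w[j] - old_wj) * X[:, j]  # keep residual in sync
    return w


# Usage on synthetic data with 5 relevant features
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 300))
y = X[:, :5] @ rng.standard_normal(5) + 0.1 * rng.standard_normal(100)
w_hat = lasso_cd(X, y, alpha=0.1)
print(f"{(w_hat != 0).sum()} nonzero coefficients")
```

In the proposed approach, updates of this kind are restricted to a small working set of features and the resulting iterate sequence is extrapolated with Anderson acceleration, which is what makes very large problems tractable.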