This paper investigates the efficient solution of penalized quadratic regressions in high-dimensional settings. We propose a novel and efficient algorithm for ridge-penalized quadratic regression that leverages the matrix structure induced by the interaction terms. Building on this formulation, we develop an alternating direction method of multipliers (ADMM) framework for penalized quadratic regression with general penalties, including both single and hybrid penalty functions. Our approach reduces the computation to basic matrix operations, making it appealing in terms of both memory usage and computational complexity.
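To illustrate the general setting (not the paper's specific algorithm), the sketch below shows a standard ADMM loop for an l1-penalized quadratic regression, where the design matrix is expanded with pairwise interaction and squared terms. The helper names `quadratic_features` and `admm_lasso`, and all parameter choices, are illustrative assumptions.

```python
# A minimal sketch, assuming a generic ADMM solver for
# (1/2)||A x - b||^2 + lam * ||x||_1 on a quadratically expanded design.
import numpy as np

def quadratic_features(X):
    """Augment X with all pairwise interaction (and squared) terms."""
    n, p = X.shape
    inter = [X[:, i] * X[:, j] for i in range(p) for j in range(i, p)]
    return np.column_stack([X] + inter)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    """Generic ADMM for the l1-penalized least-squares problem."""
    n, d = A.shape
    x = z = u = np.zeros(d)
    # Factor (A^T A + rho I) once and reuse it across iterations.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(d))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: ridge-like linear system solve
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: soft-thresholding (proximal step for the l1 penalty)
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update
        u = u + x - z
    return z

# Example usage on synthetic data
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
beta = np.zeros(20)
beta[:3] = [1.0, -2.0, 0.5]
A = quadratic_features(X)
b = A @ beta + 0.1 * rng.standard_normal(100)
print(admm_lasso(A, b)[:5])
```

Swapping the z-update for the proximal operator of another penalty (or a combination of penalties) is how ADMM accommodates single and hybrid penalty functions; the expensive x-update stays a fixed linear solve.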