This work proposes a novel adaptive linearized alternating direction method of multipliers (LADMM) for convex optimization, which improves the convergence rate of LADMM-based algorithms by adjusting the step size iteratively. The key innovation is to use information from the current iterate to adaptively select suitable parameters, thereby enlarging the admissible step size of the subproblems and accelerating the algorithm while preserving convergence. The advantage of this approach is that it speeds up the algorithm as much as possible without compromising convergence. This matters in practice because the classical linearized ADMM faces a trade-off in choosing the proximal (regularization) coefficient: a larger coefficient guarantees convergence but tends to force small step sizes, while a smaller coefficient permits larger steps but may cause the algorithm to diverge. Adaptive parameter selection balances these two effects and thus improves the efficiency of the algorithm.
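As a concrete illustration of this trade-off, the following is a minimal LADMM sketch in which the proximal coefficient r starts below the classical safe bound beta*||A||_2^2 and is enlarged only when a curvature (majorization) test fails. The problem instance (an l1-regularized least-squares model), the initial value of r, the doubling rule, and the test itself are illustrative assumptions for exposition, not the parameter-selection rule proposed in this work.

```python
import numpy as np


def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def adaptive_ladmm(A, b, lam=0.1, beta=1.0, iters=500):
    """Minimize 0.5*||x - b||^2 + lam*||z||_1  s.t.  A x = z  via LADMM.

    The x-subproblem linearizes the augmented quadratic coupling term and
    adds a proximal term (r/2)*||x - x_k||^2.  The classical safe choice is
    r >= beta*||A||_2^2; here r starts smaller and is grown only when a
    curvature test fails (an illustrative adaptive rule, not the paper's).
    """
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2      # squared spectral norm of A
    r = 0.2 * beta * L                 # optimistic (small) initial proximal coefficient
    x, z, y = np.zeros(n), np.zeros(m), np.zeros(m)
    for _ in range(iters):
        # Linearized x-step: gradient of the augmented coupling term at x_k.
        grad = A.T @ (y + beta * (A @ x - z))
        x_new = (b + r * x - grad) / (1.0 + r)  # closed form for f = 0.5*||x - b||^2
        d = x_new - x
        # Curvature test: the proximal term (r/2)*||d||^2 must majorize the
        # dropped quadratic (beta/2)*||A d||^2; if it does not, grow r and retry.
        if beta * (A @ d) @ (A @ d) > r * (d @ d):
            r = min(2.0 * r, beta * L)  # capped at the classical safe bound
            continue
        x = x_new
        # z-step: exact proximal map of lam*||.||_1.
        z = soft_threshold(A @ x + y / beta, lam / beta)
        # Dual ascent on the multiplier of the constraint A x = z.
        y = y + beta * (A @ x - z)
    return x, z


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 50))
    b = rng.standard_normal(50)
    x, z = adaptive_ladmm(A, b)
    print("primal residual:", np.linalg.norm(A @ x - z))
```

Accepted iterations run with the smaller coefficient r, and hence a larger effective step size, while the test ensures the proximal term still dominates the linearization error, which is precisely the convergence-versus-step-size tension described above.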