Continuous DR-submodular functions are a class of generally non-convex/non-concave functions that satisfy the Diminishing Returns (DR) property, which implies that they are concave along non-negative directions. Existing work has studied monotone continuous DR-submodular maximization subject to a convex constraint and provided efficient algorithms with approximation guarantees. In many applications, such as computing the stability number of a graph, the monotone DR-submodular objective function has the additional property of being strongly concave along non-negative directions (i.e., strongly DR-submodular). In this paper, we consider a subclass of $L$-smooth monotone DR-submodular functions that are strongly DR-submodular and have a bounded curvature, and we show how to exploit such additional structure to obtain faster algorithms with stronger guarantees for the maximization problem. We propose a new algorithm that matches the provably optimal $1-\frac{c}{e}$ approximation ratio after only $\lceil\frac{L}{\mu}\rceil$ iterations, where $c\in[0,1]$ and $\mu\geq 0$ are the curvature and the strong DR-submodularity parameter, respectively. Furthermore, we study the Projected Gradient Ascent (PGA) method for this problem, and provide a refined analysis of the algorithm with an improved $\frac{1}{1+c}$ approximation ratio (compared to $\frac{1}{2}$ in prior works) and a linear convergence rate. Experimental results validate the efficiency and effectiveness of our proposed algorithms.
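To make the setting concrete, the following is a minimal illustrative sketch (not the paper's algorithm) of PGA on a toy monotone DR-submodular instance. The quadratic objective, the matrix `A`, vector `h`, step size, and box constraint are all hypothetical choices made for illustration: a quadratic $f(x) = h^\top x - \frac{1}{2}x^\top A x$ with entrywise nonnegative $A$ has an entrywise nonpositive Hessian $-A$, hence is DR-submodular, and `h` is chosen so the gradient stays nonnegative on the box (monotonicity).

```python
import numpy as np

# Hypothetical toy instance: a monotone DR-submodular quadratic
# f(x) = h^T x - 0.5 x^T A x. A is entrywise nonnegative, so the
# Hessian -A is entrywise nonpositive, i.e. f is DR-submodular.
rng = np.random.default_rng(0)
n = 5
A = rng.uniform(0.0, 0.2, size=(n, n))
A = 0.5 * (A + A.T)           # symmetric, entrywise nonnegative
h = A @ np.ones(n) + 0.1      # ensures grad f(x) = h - A x >= 0.1 on [0,1]^n (monotone)

def f(x):
    return h @ x - 0.5 * x @ A @ x

def grad_f(x):
    return h - A @ x

def pga(step=0.1, iters=200):
    """Projected gradient ascent over the box [0,1]^n.

    Projection onto a box is coordinate-wise clipping; for a general
    convex constraint set one would substitute a Euclidean projection.
    """
    x = np.zeros(n)
    for _ in range(iters):
        x = np.clip(x + step * grad_f(x), 0.0, 1.0)
    return x

x_hat = pga()
```

On this toy instance the gradient is strictly positive everywhere on the box, so the iterates climb to the upper corner; in general, PGA only guarantees the approximation ratios discussed above, not the exact maximizer.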