This work studies a novel subset selection problem called max-min diversification with monotone submodular utility ($\textsf{MDMS}$), which has a wide range of applications in machine learning, e.g., data sampling and feature selection. Given a set of points in a metric space, the goal of $\textsf{MDMS}$ is to maximize $f(S) = g(S) + \lambda \cdot \texttt{div}(S)$ subject to a cardinality constraint $|S| \le k$, where $g(S)$ is a monotone submodular function and $\texttt{div}(S) = \min_{u,v \in S : u \ne v} \text{dist}(u,v)$ is the max-min diversity objective. We propose the $\texttt{GIST}$ algorithm, which gives a $\frac{1}{2}$-approximation guarantee for $\textsf{MDMS}$ by approximating a series of maximum independent set problems with a bicriteria greedy algorithm. We also prove that $\textsf{MDMS}$ is NP-hard to approximate within a factor of $0.5584$. Finally, we show in our empirical study that $\texttt{GIST}$ outperforms state-of-the-art benchmarks for a single-shot data sampling task on ImageNet.
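To make the objective concrete, below is a minimal Python sketch of $f(S) = g(S) + \lambda \cdot \texttt{div}(S)$ on a toy instance. The example setup is entirely hypothetical: the points, the coverage-style choice of $g$, the $\lambda$ value, and the convention that $\texttt{div}(S) = +\infty$ for $|S| < 2$ are all illustrative assumptions, and the brute-force maximization shown here is only for tiny instances; it is not the $\texttt{GIST}$ algorithm.

```python
import itertools
import math

# Hypothetical toy instance: points in the plane with Euclidean distance,
# and a coverage function g (number of distinct groups covered), which is
# a standard example of a monotone submodular function.
points = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (5.0, 5.0)}
groups = {0: {"a"}, 1: {"a", "b"}, 2: {"b"}, 3: {"c"}}

def dist(u, v):
    (x1, y1), (x2, y2) = points[u], points[v]
    return math.hypot(x1 - x2, y1 - y2)

def g(S):
    # Coverage utility: count of distinct groups covered by S.
    return len(set().union(*(groups[u] for u in S))) if S else 0

def div(S):
    # Max-min diversity: minimum pairwise distance over distinct points in S.
    # The value for |S| < 2 is a modeling choice; we assume +inf here.
    if len(S) < 2:
        return math.inf
    return min(dist(u, v) for u, v in itertools.combinations(S, 2))

def f(S, lam=0.5):
    # Combined MDMS objective f(S) = g(S) + lam * div(S).
    return g(S) + lam * div(S)

# Brute-force search over subsets with 2 <= |S| <= k (illustration only;
# the paper's GIST algorithm avoids this exponential enumeration).
k = 2
best = max(
    (frozenset(S)
     for r in range(2, k + 1)
     for S in itertools.combinations(points, r)),
    key=f,
)
print(best, f(best))
```

The trade-off controlled by $\lambda$ is visible even in this toy instance: a larger $\lambda$ favors the far-apart pair containing point 3, while a smaller $\lambda$ favors pairs that cover more groups.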