We study whether submodular functions of random variables satisfying various notions of negative dependence obey Chernoff-like concentration inequalities. We prove such a concentration inequality for the lower tail when the random variables satisfy negative association or negative regression, partially resolving an open problem raised by Qiu and Singla [QS22]. Previous work established such concentration results for random variables produced by specific dependent-rounding algorithms (Chekuri, Vondrak, and Zenklusen [CVZ10]; Harvey and Olver [HO14]). We discuss some applications of our results to combinatorial optimization and beyond. We also show applications to the concentration of read-k families [Gav+15] under certain forms of negative dependence, and we give a simplified proof of the entropy-method approach of [Gav+15].
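Concretely, the Chernoff-like lower-tail bounds we have in mind take, roughly, the following shape: for a monotone submodular function $f$ with marginal values in $[0,1]$, with $\mu = \mathbb{E}[f(X_1,\dots,X_n)]$ and $\delta \in (0,1)$, one asks for a bound of the form

\[
  \Pr\bigl[f(X_1,\dots,X_n) \le (1-\delta)\,\mu\bigr] \;\le\; \exp\bigl(-c\,\mu\,\delta^2\bigr)
\]

for some absolute constant $c > 0$; the precise hypotheses and constants differ among the results cited above. The point of departure here is that the bound is required to hold under distributional assumptions on $(X_1,\dots,X_n)$ (negative association or negative regression) rather than for the output of a particular rounding algorithm.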