Submodular functions and variants, through their ability to characterize diversity and coverage, have emerged as a key tool for data selection and summarization. Many recent approaches to learning submodular functions suffer from limited expressiveness. In this work, we propose FLEXSUBNET, a family of flexible neural models for both monotone and non-monotone submodular functions. To fit a latent submodular function from (set, value) observations, FLEXSUBNET applies a concave function to modular functions in a recursive manner. We do not draw the concave function from a restricted family, but rather learn it from data using a highly expressive neural network that implements a differentiable quadrature procedure. Such an expressive neural model for concave functions may be of independent interest. Next, we extend this setup to provide a novel characterization of monotone \alpha-submodular functions, a recently introduced notion of approximate submodular functions. We then use this characterization to design a novel neural model for such functions. Finally, we consider learning submodular set functions under distant supervision in the form of (perimeter-set, high-value-subset) pairs. This yields a novel subset selection method based on an order-invariant, yet greedy sampler built around the above neural set functions. Our experiments on synthetic and real data show that FLEXSUBNET outperforms several baselines.
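The recursive construction above builds on a classical fact: composing a concave function with a nonnegative modular (additive) function yields a monotone submodular set function. The snippet below is a minimal illustrative sketch of that building block, not the paper's FLEXSUBNET model; the weights `w` and the choice of `sqrt` as the concave function are arbitrary assumptions for demonstration. It numerically verifies the diminishing-returns property f(A ∪ {x}) − f(A) ≥ f(B ∪ {x}) − f(B) for all A ⊆ B and x ∉ B.

```python
import math
import itertools

# Nonnegative modular weights over a small ground set (hypothetical example data).
w = {0: 1.0, 1: 2.0, 2: 0.5, 3: 3.0}
phi = math.sqrt  # a concave function; FLEXSUBNET learns this from data instead

def f(S):
    # f(S) = phi(sum_{i in S} w_i): concave-of-modular, hence submodular.
    return phi(sum(w[i] for i in S))

def subsets(s):
    s = list(s)
    for r in range(len(s) + 1):
        yield from map(set, itertools.combinations(s, r))

ground = set(w)
# Check diminishing returns exhaustively on this small ground set.
ok = all(
    f(A | {x}) - f(A) >= f(B | {x}) - f(B) - 1e-12
    for B in subsets(ground)
    for A in subsets(B)
    for x in ground - B
)
print(ok)
```

FLEXSUBNET replaces the fixed `phi` with a learned neural concave function and applies the composition recursively, which is what lifts the expressiveness beyond this single concave-of-modular form.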