We propose a new method for training a supervised source separation system that aims to learn the interdependent relationships between all combinations of sources in a mixture. Rather than independently estimating each source from the mix, we reframe the source separation problem as an Orderless Neural Autoregressive Density Estimator (NADE), and estimate each source from both the mix and a random subset of the other sources. We adapt a standard source separation architecture, Demucs, adding an input for each individual source alongside the input mixture. We randomly mask these input sources during training so that the network learns the conditional dependencies between the sources. By pairing this training method with a block Gibbs sampling procedure at inference time, we demonstrate that the network can iteratively improve its separation performance by conditioning each source estimate on the estimates from earlier steps. Experiments on two source separation datasets show that training a Demucs model with an Orderless NADE approach and using Gibbs sampling (up to 512 steps) at inference time strongly outperforms a Demucs baseline that uses a standard regression loss and direct (one-step) estimation of sources.
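The masking and sampling procedure described above can be illustrated with a toy NumPy sketch. This is not the paper's Demucs model: `toy_estimator` is a hypothetical stand-in that fills in each masked source with an equal share of the residual left after subtracting the visible conditioning sources from the mix. It shows the two mechanisms the abstract names: randomly masking conditioning sources (orderless-NADE-style training inputs) and block Gibbs sampling at inference, where a random block of sources is re-estimated conditioned on the current estimates of the rest.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: a mix that is the sum of n_sources waveforms.
n_sources, n_samples = 4, 16
true_sources = rng.standard_normal((n_sources, n_samples))
mix = true_sources.sum(axis=0)

def toy_estimator(mix, conditioning, mask):
    """Hypothetical stand-in for the Demucs-style network: each masked
    (unknown) source gets an equal share of the residual after the
    visible conditioning sources are subtracted from the mix."""
    est = conditioning.copy()
    residual = mix - conditioning[~mask].sum(axis=0)
    est[mask] = residual / max(mask.sum(), 1)
    return est

# Training-style input construction: hide a random subset of sources,
# so the network must predict them from the mix plus the visible ones.
train_mask = rng.random(n_sources) < 0.5      # True = source is masked
conditioning = np.where(train_mask[:, None], 0.0, true_sources)

# Block Gibbs sampling at inference: start from all-zero estimates,
# then repeatedly re-estimate a random block conditioned on the rest.
estimates = np.zeros_like(true_sources)
for _ in range(512):
    block = rng.random(n_sources) < 0.5
    if not block.any():
        continue
    cond = np.where(block[:, None], 0.0, estimates)
    estimates = toy_estimator(mix, cond, block)

# Any step of this toy estimator leaves estimates that sum to the mix.
assert np.allclose(estimates.sum(axis=0), mix)
```

With a learned conditional model in place of `toy_estimator`, the same loop lets each source estimate be refined using the others, which is the iterative improvement the abstract reports over one-step direct estimation.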