We introduce a new optimization algorithm, termed \emph{contrastive adjustment}, for learning Markov transition kernels whose stationary distribution matches the data distribution. Contrastive adjustment is not restricted to a particular family of transition distributions and can be used to model data in both continuous and discrete state spaces. Inspired by recent work on noise-annealed sampling, we propose a particular transition operator, the \emph{noise kernel}, that can trade mixing speed for sample fidelity. We show that contrastive adjustment is highly valuable in human-computer design processes, as the stationarity of the learned Markov chain enables local exploration of the data manifold and makes it possible to iteratively refine outputs by human feedback. We compare the performance of noise kernels trained with contrastive adjustment to current state-of-the-art generative models and demonstrate promising results on a variety of image synthesis tasks.