We showed how to use trained neural networks to perform Bayesian reasoning, solving tasks outside their initial scope. Deep generative models provide prior knowledge, while classification and regression networks impose constraints. The tasks at hand were formulated as Bayesian inference problems, which we solved approximately with variational or sampling techniques. The approach builds on top of already trained networks, and the set of addressable questions grows super-exponentially with the number of available networks. In its simplest form, the approach yields conditional generative models; imposing multiple constraints simultaneously poses more elaborate questions. We compared the approach against specifically trained generators, showed how to solve riddles, and demonstrated its compatibility with state-of-the-art architectures.
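The simplest instance described above, combining a pretrained generator (prior) with a pretrained classifier (constraint) into a conditional generative model, can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the "generator" is a fixed linear map, the "classifier" a fixed logistic model, and inference is reduced to a MAP estimate by gradient ascent, a crude stand-in for the variational or sampling techniques mentioned.

```python
import numpy as np

# Toy stand-ins (assumptions for illustration): a linear "generator"
# mapping latent z to data x, and a logistic "classifier" scoring a constraint c.
rng = np.random.default_rng(0)
G = rng.normal(size=(4, 2))   # generator weights: x = G @ z
w = rng.normal(size=4)        # classifier weights: p(c=1|x) = sigmoid(w @ x)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def log_posterior(z):
    # log p(z | c=1) up to a constant: standard-normal latent prior
    # plus the log-likelihood of the classifier constraint.
    x = G @ z
    return -0.5 * z @ z + np.log(sigmoid(w @ x) + 1e-12)

def grad_log_posterior(z):
    p = sigmoid(w @ (G @ z))
    # d/dz [log sigmoid(w @ G @ z)] = (1 - p) * G^T w
    return -z + (1.0 - p) * (G.T @ w)

# MAP estimate by gradient ascent over the latent code.
z = np.zeros(2)
for _ in range(200):
    z += 0.1 * grad_log_posterior(z)

x_cond = G @ z  # output of the induced conditional generator
print(sigmoid(w @ x_cond))  # constraint satisfaction, should exceed 0.5
```

Additional pretrained constraint networks would simply contribute further log-likelihood terms to `log_posterior`, which is how combining networks multiplies the questions one can pose.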