A Markov chain, named after Andrey Markov (A. A. Markov, 1856–1922), is a discrete-event stochastic process with the Markov property: given the present state, the past (the history of states before the present) is irrelevant for predicting the future (the states after the present). At each step of a Markov chain, the system either moves to another state or stays in its current state, according to a probability distribution. A change of state is called a transition, and the probabilities associated with the different state changes are called transition probabilities. A random walk is an example of a Markov chain: the state at each step is a vertex of a graph, and at every step the walk moves to one of the adjacent vertices, each chosen with equal probability, regardless of the path taken so far.
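
A minimal sketch of such a random walk as a Markov chain (the toy graph and node names are illustrative, not taken from the text above):

```python
import random

# Hypothetical adjacency list for a small 4-node graph; node names are illustrative.
graph = {
    "A": ["B", "D"],
    "B": ["A", "C"],
    "C": ["B", "D"],
    "D": ["A", "C"],
}

def random_walk(start, steps):
    """Return the sequence of visited states: the next state depends only on
    the current state, never on the earlier history (the Markov property)."""
    state = start
    path = [state]
    for _ in range(steps):
        state = random.choice(graph[state])  # uniform over the adjacent vertices
        path.append(state)
    return path

print(random_walk("A", 10))
```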

VIP Content

Title: Probabilistic Logic Neural Networks for Reasoning

Abstract:

Knowledge graph reasoning, which aims to predict missing facts by reasoning over the observed facts, plays a crucial role in many applications. It has been studied extensively both by traditional logic-rule-based methods and by recent knowledge graph embedding methods. The Markov Logic Network (MLN) is a principled rule-based logic approach that can exploit first-order-logic domain knowledge while handling uncertainty. However, inference in MLNs is usually very difficult because of their complicated graph structure. In contrast, knowledge graph embedding methods (e.g., TransE, DistMult) learn effective entity and relation embeddings for reasoning, which is more effective and efficient, but they are unable to exploit domain knowledge. In this paper, we propose the probabilistic Logic Neural Network (pLogicNet), which combines the advantages of both approaches. pLogicNet defines the joint distribution over all possible triplets with a Markov logic network over first-order logic, which can be optimized efficiently with the variational EM algorithm. A knowledge graph embedding model is used to infer the missing triplets, and the weights of the logic rules are updated based on both the observed and the predicted triplets. Experiments on multiple knowledge graphs demonstrate the effectiveness of pLogicNet over many competitive baselines.
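
As an illustration of the embedding side of this pipeline, below is a minimal sketch of the TransE-style triplet scoring that such models use to infer missing triplets. The entity/relation names, the dimension, and the untrained random vectors are hypothetical, and this is not the pLogicNet implementation; in practice the embeddings are learned, e.g. with a margin-based ranking loss.

```python
import numpy as np

# Illustrative TransE-style scoring: f(h, r, t) = -||h + r - t||.
# Embeddings here are random placeholders; real systems learn them from data.
rng = np.random.default_rng(0)
dim = 50
entities = {e: rng.normal(size=dim) for e in ["Paris", "France", "Berlin", "Germany"]}
relations = {r: rng.normal(size=dim) for r in ["capital_of"]}

def transe_score(head, rel, tail):
    """Higher (less negative) score means the triplet (head, rel, tail) is more plausible."""
    return -np.linalg.norm(entities[head] + relations[rel] - entities[tail])

# Rank candidate tails for the incomplete triplet (Paris, capital_of, ?).
candidates = ["France", "Germany"]
print(sorted(candidates, key=lambda t: transe_score("Paris", "capital_of", t), reverse=True))
```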

Author:

Meng Qu (瞿锰) is a first-year PhD student at the Montreal Institute for Learning Algorithms. Before that, he received a master's degree from the University of Illinois at Urbana-Champaign and a bachelor's degree from Peking University. His main research interests are machine learning, Bayesian deep learning, data mining, and natural language processing.


Latest Content

Hamiltonian Monte Carlo (HMC) is a popular Markov Chain Monte Carlo (MCMC) algorithm to sample from an unnormalized probability distribution. A leapfrog integrator is commonly used to implement HMC in practice, but its performance can be sensitive to the choice of mass matrix used therein. We develop a gradient-based algorithm that allows for the adaptation of the mass matrix by encouraging the leapfrog integrator to have high acceptance rates while also exploring all dimensions jointly. In contrast to previous work that adapts the hyperparameters of HMC using some form of expected squared jumping distance, the adaptation strategy suggested here aims to increase sampling efficiency by maximizing an approximation of the proposal entropy. We illustrate that using multiple gradients in the HMC proposal can be beneficial compared to a single gradient step in Metropolis-adjusted Langevin proposals. Empirical evidence suggests that the adaptation method can outperform different versions of HMC schemes by adjusting the mass matrix to the geometry of the target distribution and by providing some control on the integration time.
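
To make the role of the leapfrog integrator and the mass matrix concrete, here is a minimal sketch of a leapfrog trajectory for an HMC proposal, assuming a standard-normal target and a diagonal mass matrix; the function and variable names are illustrative and the adaptation scheme described in the abstract is not shown.

```python
import numpy as np

def leapfrog(q, p, grad_log_prob, step_size, n_steps, inv_mass_diag):
    """Leapfrog integration of Hamiltonian dynamics.

    q: position, p: momentum, inv_mass_diag: diagonal of the inverse mass matrix.
    A poorly scaled mass matrix makes these trajectories inefficient, which is
    what mass-matrix adaptation tries to avoid.
    """
    q, p = q.copy(), p.copy()
    p = p + 0.5 * step_size * grad_log_prob(q)    # initial half step for momentum
    for _ in range(n_steps - 1):
        q = q + step_size * inv_mass_diag * p     # full step for position
        p = p + step_size * grad_log_prob(q)      # full step for momentum
    q = q + step_size * inv_mass_diag * p         # last full position step
    p = p + 0.5 * step_size * grad_log_prob(q)    # final half step for momentum
    return q, p

# One proposal for a 2-D standard normal target, log p(q) = -0.5 * q @ q.
grad_log_prob = lambda q: -q
rng = np.random.default_rng(1)
q0 = np.zeros(2)
p0 = rng.normal(size=2)                           # momentum drawn from N(0, I)
q1, p1 = leapfrog(q0, p0, grad_log_prob, step_size=0.1, n_steps=20,
                  inv_mass_diag=np.ones(2))
print(q1, p1)
```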

