Making adaptive predictions based on the input is an important capability for general artificial intelligence. In this work, we take a step in this direction and propose Meta-Neighborhoods, a semi-parametric method in which predictions are made adaptively to the neighborhood of the input. We show that Meta-Neighborhoods generalizes $k$-nearest-neighbors. Because the manifold structure within a local neighborhood is simpler, Meta-Neighborhoods represents the predictive distribution $p(y \mid x)$ more accurately. To reduce memory and computation overhead, we propose induced neighborhoods, which summarize the training data into a much smaller dictionary. A meta-learning-based training mechanism is then exploited to jointly learn the induced neighborhoods and the model. Extensive studies demonstrate the superiority of our method.
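To make the neighborhood-adaptive idea concrete, here is a minimal sketch of prediction from a small learned dictionary of induced neighbors. This is not the paper's implementation: the dictionary size, the Gaussian distance weighting, and all names (`keys`, `values`, `predict`) are illustrative assumptions; it only shows how a soft, dictionary-based generalization of $k$-nearest-neighbors can produce $p(y \mid x)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: M induced neighbors (key vectors with class labels)
# stand in for the full training set. In the paper these entries would be
# learned jointly with the model via meta-learning; here they are random.
M, D, C = 16, 2, 3                  # dictionary size, input dim, classes
keys = rng.normal(size=(M, D))      # induced-neighbor locations
values = rng.integers(0, C, size=M) # induced-neighbor labels

def predict(x, tau=1.0):
    """Soft kNN over the dictionary: distance-weighted class probabilities."""
    d2 = np.sum((keys - x) ** 2, axis=1)  # squared distances to each key
    w = np.exp(-d2 / tau)                 # closer entries get larger weight
    w /= w.sum()                          # normalize to a distribution
    probs = np.zeros(C)
    for j in range(M):
        probs[values[j]] += w[j]          # accumulate weight per class
    return probs

x = rng.normal(size=D)
p = predict(x)
```

As the temperature `tau` shrinks, the weights concentrate on the single nearest entry and the rule reduces to hard 1-nearest-neighbor, which is the sense in which such a scheme generalizes $k$-NN.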