This paper discusses a particular simple yet potentially powerful algorithm, called single-kernel Gradraker (SKG), an adaptive learning method that predicts unknown nodal values in a network from known nodal values and the network structure. We aim to determine how to configure this model when applying the algorithm. More specifically, we focus on SKG with a Gaussian kernel and show how to find a suitable variance for the kernel. To this end, we introduce two variables with which we can set requirements on the variance of the Gaussian kernel to achieve (near-)optimal performance and better understand how SKG works. Our contributions are that we introduce these two variables as analysis tools, illustrate how predictions are affected under different Gaussian kernels, and provide an algorithm that finds a suitable Gaussian kernel for SKG given knowledge of the training network. Simulation results on real datasets are provided.
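For reference, a minimal sketch of the Gaussian kernel is given below, assuming the standard (RBF) form; the variance parameter $\sigma^2$ is the quantity whose selection this paper studies.

% Standard Gaussian (RBF) kernel between feature vectors x and x';
% sigma^2 is the variance (bandwidth) parameter to be selected.
\[
  \kappa(\mathbf{x}, \mathbf{x}') = \exp\!\left(-\frac{\lVert \mathbf{x} - \mathbf{x}' \rVert^{2}}{2\sigma^{2}}\right)
\]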