While hyper-parameters (HPs) are important for knowledge graph (KG) learning, existing methods fail to search them efficiently. To solve this problem, we first analyze the properties of different HPs and measure how well configurations transfer from a small subgraph to the full graph. Based on this analysis, we propose KGTuner, an efficient two-stage search algorithm that explores HP configurations on a small subgraph in the first stage and transfers the top-performing configurations to the large full graph for fine-tuning in the second stage. Experiments show that our method consistently finds better HPs than baseline algorithms within the same time budget, achieving a {9.1\%} average relative improvement for four embedding models on the large-scale KGs in the Open Graph Benchmark.
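To make the two-stage procedure concrete, here is a minimal Python sketch of the search flow described above. It is not the authors' implementation: the configuration set and the `eval_on_subgraph` / `eval_on_full_graph` callables (which would train an embedding model under a given HP configuration and return a validation metric such as MRR) are hypothetical placeholders, and the budget and top-k values are illustrative.

```python
import random


def two_stage_hp_search(configs, eval_on_subgraph, eval_on_full_graph,
                        stage1_budget=100, top_k=10):
    """Sketch of a two-stage HP search in the spirit of KGTuner.

    `configs` is a list of candidate HP configurations; the two eval
    callables are assumed to return a validation metric (higher is better).
    """
    # Stage 1: cheaply evaluate many HP configurations on a small subgraph.
    sampled = random.sample(configs, min(stage1_budget, len(configs)))
    scored = [(cfg, eval_on_subgraph(cfg)) for cfg in sampled]

    # Keep only the top-performing configurations; the premise is that
    # their relative ranking transfers from the subgraph to the full graph.
    scored.sort(key=lambda pair: pair[1], reverse=True)
    candidates = [cfg for cfg, _ in scored[:top_k]]

    # Stage 2: fine-tune the transferred candidates on the large full graph
    # and return the best one under the expensive full-graph evaluation.
    return max(candidates, key=eval_on_full_graph)
```

The design point this sketch captures is the budget split: most evaluations run on the cheap subgraph, and only a shortlist pays the cost of full-graph training.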