We propose a general Variational Embedding Learning Framework (VELF) to alleviate the severe cold-start problem in CTR prediction. VELF addresses cold start by mitigating the over-fitting caused by data sparsity in two ways: it learns probabilistic embeddings, and it incorporates trainable, regularized priors that exploit the rich side information of cold-start users and advertisements (Ads). The two techniques are naturally integrated into a variational inference framework, forming an end-to-end training process. Extensive empirical tests on benchmark datasets demonstrate the advantages of the proposed VELF. In addition, extended experiments confirm that our parameterized and regularized priors provide stronger generalization than traditional fixed priors.
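To make the two ideas concrete, the following is a minimal sketch (in PyTorch, not the authors' implementation) of probabilistic ID embeddings trained by variational inference together with a trainable prior parameterized by side information; the KL term acts as the regularizer. All module names, dimensions, and the KL weight are illustrative assumptions.

```python
# Hedged sketch: Gaussian ID embeddings + side-information-parameterized prior,
# jointly trained with the CTR loss. Not the paper's exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VariationalEmbedding(nn.Module):
    """Per-ID Gaussian posterior q(e_i) = N(mu_i, diag(sigma_i^2))."""

    def __init__(self, num_ids: int, dim: int):
        super().__init__()
        self.mu = nn.Embedding(num_ids, dim)
        self.log_var = nn.Embedding(num_ids, dim)
        nn.init.constant_(self.log_var.weight, -4.0)  # start with small variance

    def forward(self, ids: torch.Tensor):
        mu, log_var = self.mu(ids), self.log_var(ids)
        eps = torch.randn_like(mu)
        sample = mu + eps * torch.exp(0.5 * log_var)  # reparameterization trick
        return sample, mu, log_var


class SideInfoPrior(nn.Module):
    """Trainable prior p(e | side info) = N(mu_p, diag(sigma_p^2))."""

    def __init__(self, side_dim: int, dim: int):
        super().__init__()
        self.net = nn.Linear(side_dim, 2 * dim)

    def forward(self, side_feats: torch.Tensor):
        mu_p, log_var_p = self.net(side_feats).chunk(2, dim=-1)
        return mu_p, log_var_p


def gaussian_kl(mu_q, log_var_q, mu_p, log_var_p):
    """KL(q || p) between diagonal Gaussians, summed over embedding dims."""
    return 0.5 * (
        log_var_p - log_var_q
        + (log_var_q.exp() + (mu_q - mu_p) ** 2) / log_var_p.exp()
        - 1.0
    ).sum(-1)


# Toy end-to-end step: user/ad IDs plus side features -> sampled embeddings -> CTR logit.
num_users, num_ads, dim, side_dim = 1000, 500, 16, 8
user_emb, ad_emb = VariationalEmbedding(num_users, dim), VariationalEmbedding(num_ads, dim)
user_prior, ad_prior = SideInfoPrior(side_dim, dim), SideInfoPrior(side_dim, dim)
ctr_head = nn.Linear(2 * dim, 1)

batch = 32
user_ids = torch.randint(0, num_users, (batch,))
ad_ids = torch.randint(0, num_ads, (batch,))
user_side, ad_side = torch.randn(batch, side_dim), torch.randn(batch, side_dim)
clicks = torch.randint(0, 2, (batch,)).float()

u, mu_u, lv_u = user_emb(user_ids)
a, mu_a, lv_a = ad_emb(ad_ids)
logits = ctr_head(torch.cat([u, a], dim=-1)).squeeze(-1)

# CTR loss plus KL regularization toward the side-information priors.
kl = (
    gaussian_kl(mu_u, lv_u, *user_prior(user_side))
    + gaussian_kl(mu_a, lv_a, *ad_prior(ad_side))
).mean()
loss = F.binary_cross_entropy_with_logits(logits, clicks) + 1e-3 * kl  # KL weight is a tunable assumption
loss.backward()
```

For a cold-start ID with few or no interactions, the posterior stays close to the side-information prior, so the prediction falls back on content features rather than an under-trained ID embedding.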