Learning feature interactions is crucial for click-through rate (CTR) prediction in recommender systems. In most existing deep learning models, feature interactions are either manually designed or simply enumerated. However, enumerating all feature interactions incurs large memory and computation costs. Even worse, useless interactions may introduce noise and complicate the training process. In this work, we propose a two-stage algorithm called Automatic Feature Interaction Selection (AutoFIS). AutoFIS can automatically identify important feature interactions for factorization models at a computational cost equivalent to training the target model to convergence. In the \emph{search stage}, instead of searching over a discrete set of candidate feature interactions, we relax the choices to be continuous by introducing architecture parameters. By applying a regularized optimizer to the architecture parameters, the model can automatically identify and remove redundant feature interactions during training. In the \emph{re-train stage}, we retain the architecture parameters, which serve as an attention unit to further boost performance. Offline experiments on three large-scale datasets (two public benchmarks, one private) demonstrate that AutoFIS can significantly improve various FM-based models. AutoFIS has been deployed onto the training platform of Huawei App Store recommendation service, where a 10-day online A/B test demonstrated that AutoFIS improved the DeepFM model by 20.3\% and 20.1\% in terms of CTR and CVR respectively.
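The core idea of the search stage can be sketched as follows: each candidate pairwise interaction is scaled by a learnable architecture parameter (a "gate"), and interactions whose gates are driven to (near) zero by a sparsity-inducing regularized optimizer are pruned. This is a minimal toy sketch, not the paper's implementation; the field count, embedding values, threshold, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

num_fields, embed_dim = 4, 3
# Toy embedding vector per feature field (stand-ins for learned embeddings).
embeddings = rng.normal(size=(num_fields, embed_dim))

# One architecture parameter (gate) per candidate pairwise interaction.
pairs = [(i, j) for i in range(num_fields) for j in range(i + 1, num_fields)]
alpha = rng.normal(size=len(pairs))

def interaction_logit(embeddings, alpha, pairs, threshold=0.1):
    """FM-style sum of pairwise inner products, each scaled by its gate.

    Gates whose magnitude fell below `threshold` (e.g. pushed there by a
    sparsity-inducing optimizer during the search stage) are treated as
    removed, mimicking interaction selection.
    """
    logit = 0.0
    for a, (i, j) in zip(alpha, pairs):
        if abs(a) < threshold:
            continue  # this interaction was pruned in the search stage
        logit += a * (embeddings[i] @ embeddings[j])
    return logit
```

In the re-train stage, the surviving gates are kept as multiplicative attention weights on their interactions, so the same `alpha * <e_i, e_j>` form applies with the pruned pairs dropped from `pairs` entirely.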