Despite the effectiveness of utilizing the BERT model for document ranking, the high computational cost of such approaches limits their use. To this end, this paper first empirically investigates the effectiveness of two knowledge distillation models on the document ranking task. In addition, two simplifications are proposed on top of the recently proposed TinyBERT model. Evaluations on two different and widely-used benchmarks demonstrate that the resulting Simplified TinyBERT not only boosts the original TinyBERT, but also significantly outperforms BERT-Base while providing a 15$\times$ speedup.
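To make the knowledge distillation setting concrete, below is a minimal sketch of a response-based distillation loss for a pointwise ranker, assuming PyTorch. It is not the paper's exact objective (TinyBERT additionally uses layer-wise losses, which are not reproduced here), and the function name, argument names, and default hyperparameters are all illustrative assumptions.

```python
# Minimal sketch of a distillation loss: a temperature-scaled soft loss
# against the teacher's logits blended with a hard loss on gold labels.
# All names and defaults are illustrative, not the paper's exact method.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable across T.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: standard cross-entropy on the relevance labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```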