Recent advances in cross-lingual commonsense reasoning (CSR) are facilitated by the development of multilingual pre-trained models (mPTMs). While mPTMs show the potential to encode commonsense knowledge for different languages, transferring commonsense knowledge learned from large-scale English corpora to other languages remains challenging. To address this problem, we propose the attention-based Cross-LIngual Commonsense Knowledge transfER (CLICKER) framework, which minimizes the performance gap between English and non-English languages on commonsense question-answering tasks. CLICKER effectively improves commonsense reasoning for non-English languages by differentiating non-commonsense knowledge from commonsense knowledge. Experimental results on public benchmarks demonstrate that CLICKER achieves remarkable improvements in the cross-lingual CSR task for languages other than English.