Recently, there has been significant progress in studying neural networks that translate text descriptions into SQL queries. Despite achieving good performance on some public benchmarks, existing text-to-SQL models typically rely on lexical matching between words in natural language (NL) questions and tokens in table schemas, which may render the models vulnerable to attacks that break the schema linking mechanism. In this work, we investigate the robustness of text-to-SQL models to synonym substitution. In particular, we introduce Spider-Syn, a human-curated dataset based on the Spider benchmark for text-to-SQL translation. NL questions in Spider-Syn are modified from Spider by replacing their schema-related words with manually selected synonyms that reflect real-world question paraphrases. We observe that accuracy drops dramatically when such explicit correspondence between NL questions and table schemas is eliminated, even though the synonyms are not adversarially selected to conduct worst-case adversarial attacks. Finally, we present two categories of approaches to improve model robustness. The first category utilizes additional synonym annotations for table schemas by modifying the model input, while the second category is based on adversarial training. We demonstrate that both categories of approaches significantly outperform their counterparts without the defense, and that the first category is more effective.
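To make the attack concrete, here is a minimal sketch (not the authors' pipeline) of the kind of synonym substitution Spider-Syn applies: schema-related words in an NL question are replaced with hand-picked synonyms, removing the lexical overlap that schema linking relies on. The synonym map and the example question below are illustrative, not drawn from the dataset.

```python
# Hypothetical curated synonym map for schema-related words.
# In Spider-Syn these substitutions are manually selected by humans.
SYNONYMS = {
    "singer": "vocalist",
    "concert": "show",
    "stadium": "arena",
}

def substitute_schema_words(question: str, synonyms: dict) -> str:
    """Replace each schema-related token with its curated synonym,
    leaving all other tokens unchanged."""
    out = []
    for tok in question.split():
        # Normalize lightly so trailing punctuation does not block a match.
        key = tok.lower().strip("?,.")
        out.append(synonyms.get(key, tok))
    return " ".join(out)

original = "How many singer are in each concert ?"
paraphrased = substitute_schema_words(original, SYNONYMS)
print(paraphrased)  # "singer"/"concert" no longer match the table names literally
```

A model that links "singer" to the `singer` table purely by string overlap has no such anchor once the question says "vocalist", which is the failure mode the abstract describes; the first category of defenses counters this by attaching such synonym lists to the schema representation in the model input.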

