The ACL 2020 acceptance results came out on May 17. Congratulations to, and a bit of envy for, everyone whose paper got in. Below is a roundup of 37 accepted long papers related to word embeddings, tentatively grouped by topic. Let's take a look at this year's new developments in word-embedding research.
About the author: Zhang Zheng, based in Paris; NLP by day, word embeddings by night.
#monolingual
#cross-lingual
#contextualized
#unsupervised
#BERT
#bias
#word-sense
#distillation
#overview
#mid-resource
#rare-word
#domain-adaptation
Title: A Monolingual Approach to Contextualized Word Embeddings for Mid-Resource Languages
Keywords: #monolingual #contextualized #mid-resource #cross-lingual
Title: Analysing Lexical Semantic Change with Contextualised Word Representations
Keywords: #monolingual #contextualized #word-sense
Link: https://arxiv.org/abs/2004.14118
Title: Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics
Keywords: #monolingual #contextualized
Link: https://arxiv.org/abs/2005.02991
Functional Distributional Semantics provides a linguistically interpretable framework for distributional semantics, by representing the meaning of a word as a function (a binary classifier), instead of a vector.
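To make the abstract's core idea concrete, here is a minimal sketch, assuming PyTorch; the class name, dimensionality, and the linear-plus-sigmoid classifier are my illustrative assumptions, not the paper's actual model. It contrasts the usual "word as a vector" view with the Functional Distributional Semantics view of a word as a binary classifier over entity representations ("pixies").

```python
# Illustrative sketch only (not the paper's model): in Functional
# Distributional Semantics, a word's meaning is a binary classifier
# over entity representations ("pixies"), not a point in vector space.
import torch
import torch.nn as nn

DIM = 50  # assumed dimensionality of an entity representation ("pixie")

class WordAsClassifier(nn.Module):
    """The meaning of one word: P(word's predicate is true of entity x)."""
    def __init__(self, dim: int = DIM):
        super().__init__()
        self.linear = nn.Linear(dim, 1)  # simplest possible classifier

    def forward(self, pixie: torch.Tensor) -> torch.Tensor:
        # Probability that this word's predicate holds of the entity.
        return torch.sigmoid(self.linear(pixie))

cat = WordAsClassifier()    # "cat" as a function, not a vector
entity = torch.randn(DIM)   # a candidate entity representation
print(float(cat(entity)))   # truth probability of cat(entity)
```

Per its title, the paper's contribution is a training method for such models (amortised variational inference with graph convolutions); the sketch above only shows the representational idea that motivates it.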
Title: BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance
Keywords: #monolingual #contextualized #rare-word #BERT
Link: https://arxiv.org/abs/1910.07181
Title: CamemBERT: a Tasty French Language Model
Keywords: #monolingual #contextualized #BERT
Link: https://arxiv.org/abs/1911.03894
Title: Don't Stop Pretraining: Adapt Language Models to Domains and Tasks
Keywords: #monolingual #contextualized #BERT #domain-adaptation
Link: https://arxiv.org/abs/2004.10964
Title: Fast and Accurate Deep Bidirectional Language Representations for Unsupervised Learning
Keywords: #monolingual #contextualized #BERT #unsupervised
Link: https://arxiv.org/abs/2004.08097
Title: FastBERT: a Self-distilling BERT with Adaptive Inference Time
Keywords: #monolingual #contextualized #BERT #distillation
Link: https://arxiv.org/pdf/2004.02178.pdf
Title: Improving Transformer Models by Reordering their Sublayers
Keywords: #monolingual #contextualized
Link: https://arxiv.org/abs/1911.03864
Title: Interpreting Pretrained Contextualized Representations via Reductions to Static Embeddings
Keywords: #monolingual #contextualized
Title: Investigating Word-Class Distributions in Word Vector Spaces
Title: MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices
Keywords: #monolingual #contextualized #BERT #distillation
Link: https://arxiv.org/abs/2004.02984
Title: schuBERT: Optimizing Elements of BERT
Keywords: #monolingual #contextualized #BERT
Link: https://arxiv.org/abs/2005.06628
Title: SenseBERT: Driving Some Sense into BERT
Keywords: #monolingual #contextualized #BERT #word-sense
Link: https://arxiv.org/abs/1908.05646
Title: Spying on your neighbors: Fine-grained probing of contextual embeddings for information about surrounding words
Keywords: #monolingual #contextualized #BERT
Link: https://arxiv.org/abs/2005.01810
Title: Double-Hard Debias: Tailoring Word Embeddings for Gender Bias Mitigation
Keywords: #monolingual #bias
Link: https://arxiv.org/abs/2005.00965
Title: Gender Bias in Multilingual Embeddings and Cross-Lingual Transfer
Keywords: #monolingual #bias #BERT
Link: https://arxiv.org/abs/2005.00699
Title: Multidirectional Associative Optimization of Function-Specific Word Representations
We present a neural framework for learning associations between interrelated groups of words such as the ones found in Subject-Verb-Object (SVO) structures.
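Since only this one abstract sentence survives here, a hypothetical sketch may help picture what "learning associations between interrelated groups of words" could mean in practice. Everything below (SVOScorer, the per-role embedding tables, the concatenate-and-score design) is my own illustrative assumption, not the paper's architecture.

```python
# Hypothetical sketch (not the paper's model): give each syntactic
# role its own embedding table and score how plausibly a given
# subject, verb, and object associate as an SVO triple.
import torch
import torch.nn as nn

class SVOScorer(nn.Module):
    def __init__(self, vocab_size: int, dim: int = 100):
        super().__init__()
        # Separate ("function-specific") embeddings per syntactic role.
        self.subj = nn.Embedding(vocab_size, dim)
        self.verb = nn.Embedding(vocab_size, dim)
        self.obj = nn.Embedding(vocab_size, dim)
        self.score = nn.Linear(3 * dim, 1)

    def forward(self, s, v, o):
        # Higher output = more plausible word association.
        x = torch.cat([self.subj(s), self.verb(v), self.obj(o)], dim=-1)
        return self.score(x).squeeze(-1)

model = SVOScorer(vocab_size=10_000)
s, v, o = torch.tensor([0]), torch.tensor([1]), torch.tensor([2])
print(model(s, v, o))  # untrained score for one toy triple
```

A scorer like this would typically be trained with observed triples as positives against corrupted ones (e.g., a randomly swapped object), via a logistic or margin loss.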
Title: What are the Goals of Distributional Semantics?
Keywords: #monolingual
Link: https://arxiv.org/abs/2005.02982
Title: When do Word Embeddings Accurately Reflect Surveys on our Beliefs About People?
Keywords: #monolingual #bias
Link: https://arxiv.org/abs/2004.12043
Title: CluBERT: A Cluster-Based Approach for Learning Sense Distributions in Multiple Languages
Keywords: #cross-lingual #contextualized #BERT #word-sense
Link: https://www.researchgate.net/publication/341151563_CluBERT_A_Cluster-Based_Approach_for_Learning_Sense_Distributions_in_Multiple_Languages
Title: Emerging Cross-lingual Structure in Pretrained Language Models
Keywords: #cross-lingual #contextualized #BERT
Link: https://arxiv.org/abs/1911.01464
Title: Finding Universal Grammatical Relations in Multilingual BERT
Keywords: #cross-lingual #contextualized #BERT
Link: https://arxiv.org/pdf/2005.04511.pdf
Title: On the Cross-lingual Transferability of Monolingual Representations
Keywords: #cross-lingual #contextualized #BERT #unsupervised
Link: https://arxiv.org/abs/1910.11856
Title: Perturbed Masking: Parameter-free Probing for Analyzing and Interpreting BERT
Keywords: #cross-lingual #contextualized #BERT
Link: https://arxiv.org/abs/2004.14786
Title: Similarity Analysis of Contextual Word Representation Models
Keywords: #cross-lingual #contextualized
Link: https://arxiv.org/abs/2005.01172
Title: Unsupervised Cross-lingual Representation Learning at Scale
Keywords: #cross-lingual #contextualized #unsupervised
Link: https://arxiv.org/abs/1911.02116
Title: Unsupervised Domain Clusters in Pretrained Language Models
Keywords: #cross-lingual #contextualized #BERT #word-sense
Link: https://arxiv.org/abs/2004.02105
Title: XtremeDistil: Multi-stage Distillation for Massive Multilingual Models
Keywords: #cross-lingual #contextualized #BERT #distillation
Link: https://arxiv.org/abs/2004.05686
Title: A Call for More Rigor in Unsupervised Cross-lingual Learning
Keywords: #cross-lingual #unsupervised #overview
Link: https://arxiv.org/abs/2004.14958
Title: Revisiting the Context Window for Cross-lingual Word Embeddings
Keywords: #cross-lingual #unsupervised
Link: https://arxiv.org/abs/2004.10813
Title: Should All Cross-Lingual Embeddings Speak English?
Keywords: #cross-lingual
Link: https://arxiv.org/abs/1911.03058
Other accepted long papers related to word embeddings:
[1] A Comprehensive Analysis of Preprocessing for Word Representation Learning in Affective Tasks
[2] A Graph-based Coarse-to-fine Method for Unsupervised Bilingual Lexicon Induction
[3] Adaptive Compression of Word Embeddings
[4] Connecting Embeddings for Knowledge Graph Entity Typing