Emergent communication research often focuses on optimizing task-specific utility as a driver for communication. However, human languages appear to evolve under pressure to efficiently compress meanings into communication signals by optimizing the Information Bottleneck tradeoff between informativeness and complexity. In this work, we study how trading off these three factors -- utility, informativeness, and complexity -- shapes emergent communication, including in comparison to human communication. To this end, we propose Vector-Quantized Variational Information Bottleneck (VQ-VIB), a method for training neural agents to compress inputs into discrete signals embedded in a continuous space. We train agents via VQ-VIB and compare their performance to previously proposed neural architectures in grounded environments and in a Lewis reference game. Across all neural architectures and settings, taking communicative informativeness into account improves communication convergence rates, and penalizing communicative complexity leads to human-like lexicon sizes while maintaining high utility. Additionally, we find that VQ-VIB outperforms other discrete communication methods. This work demonstrates how fundamental principles that are believed to characterize human language evolution may inform emergent communication in artificial agents.
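The core mechanism named in the abstract -- compressing a continuous encoder output into a discrete signal that still lives in a continuous embedding space -- can be illustrated with a minimal vector-quantization sketch. This is not the paper's implementation; the codebook size, dimensionality, and function names below are illustrative assumptions.

```python
import numpy as np

# Illustrative codebook: 16 discrete "words", each a 4-dim continuous embedding.
# (Sizes are hypothetical, not taken from the paper.)
rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 4))

def quantize(z):
    """Snap a continuous latent z to its nearest codebook entry.

    Returns the discrete index (the signal's identity) and the
    corresponding continuous embedding (the signal's vector form).
    """
    dists = np.linalg.norm(codebook - z, axis=1)
    idx = int(np.argmin(dists))
    return idx, codebook[idx]

z = rng.normal(size=4)      # stand-in for an encoder's continuous output
idx, z_q = quantize(z)      # discrete symbol + its continuous embedding
```

In VQ-VIB-style training, a variational information-bottleneck objective would additionally regularize the encoder's latent distribution, trading off how informative the chosen symbol is about the input against the complexity of the lexicon; the snippet above shows only the discretization step.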