The explosion of conference paper submissions in AI and related fields has underscored the need to improve many aspects of the peer-review process, especially the matching of papers to reviewers. Recent work argues that the key to improving this matching is to modify the \emph{bidding phase} itself, so as to ensure that the set of bids over papers is balanced and, in particular, to avoid \emph{orphan papers}, i.e., papers that receive no bids. To understand and mitigate this problem, we developed a flexible bidding platform for testing adaptations to the bidding process. Using this platform, we conducted a field experiment during the bidding phase of a medium-sized international workshop, comparing two bidding methods. We further examined, via controlled experiments on Amazon Mechanical Turk, various factors that affect bidding, in particular the order in which papers are presented \cite{cabanac2013capitalizing,fiez2020super} and information on paper demand \cite{meir2021market}. Our results suggest that several simple adaptations, which can be added to any existing platform, may significantly reduce the skew in bids, thereby improving the allocation for both reviewers and conference organizers.