Multi-agent reinforcement learning (MARL) suffers from the non-stationarity problem: the learning target keeps shifting at every iteration because multiple agents update their policies simultaneously. Starting from first principles, in this paper we address the non-stationarity problem by proposing bidirectional action-dependent Q-learning (ACE). Central to ACE is a sequential decision-making process in which only one agent is allowed to act at a time. Within this process, at the inference stage each agent maximizes its value function conditioned on the actions already taken by the preceding agents. In the learning phase, each agent minimizes a TD error that depends on how the subsequent agents have reacted to its chosen action. Given this bidirectional dependency, ACE effectively turns a multi-agent MDP into a single-agent MDP. We implement the ACE framework by identifying a suitable network representation for the action dependency, so that the sequential decision process is computed implicitly in a single forward pass. To validate ACE, we compare it with strong baselines on two MARL benchmarks. Experiments demonstrate that ACE outperforms state-of-the-art algorithms on Google Research Football and the StarCraft Multi-Agent Challenge by a large margin. In particular, on SMAC tasks, ACE achieves a 100% success rate on almost all the hard and super-hard maps. We further study extensive research problems regarding ACE, including extension, generalization, and practicability. Code is made available to facilitate further research.
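The bidirectional dependency described above can be illustrated with a minimal sketch: at inference, agents act one at a time, each maximizing a Q-value conditioned on the preceding agents' actions; in learning, each agent's TD target is bootstrapped from the next agent within the same step, with the last agent bootstrapping on the next state. The toy linear Q-function, the dimensions, and the weight tensor `w` below are illustrative assumptions, not the paper's actual network architecture.

```python
import numpy as np

N_AGENTS, N_ACTIONS, STATE_DIM = 3, 4, 5
FEAT_DIM = STATE_DIM + N_AGENTS * N_ACTIONS  # state + one-hot of preceding actions

def q_value(state, prev_actions, agent, action, w):
    """Toy action-dependent Q: a linear score over the state and the
    one-hot actions already chosen by the preceding agents (assumption)."""
    feat = np.concatenate([state] + [np.eye(N_ACTIONS)[a] for a in prev_actions])
    feat = np.pad(feat, (0, FEAT_DIM - feat.size))  # fixed-size input for every agent
    return float(w[agent, action] @ feat)

def sequential_argmax(state, w):
    """Inference (forward dependency): each agent maximizes its Q given
    the actions taken by all preceding agents."""
    actions = []
    for i in range(N_AGENTS):
        q = [q_value(state, actions, i, a, w) for a in range(N_ACTIONS)]
        actions.append(int(np.argmax(q)))
    return actions

def td_errors(state, next_state, reward, gamma, w):
    """Learning (backward dependency): each agent's target depends on how
    the subsequent agent reacts; the last agent bootstraps on the next state."""
    actions = sequential_argmax(state, w)
    errors = []
    for i in range(N_AGENTS):
        q_i = q_value(state, actions[:i], i, actions[i], w)
        if i < N_AGENTS - 1:
            # intra-step bootstrap from the next agent's best response
            target = max(q_value(state, actions[:i + 1], i + 1, a, w)
                         for a in range(N_ACTIONS))
        else:
            # the joint action is complete: use reward and the next state's first agent
            target = reward + gamma * max(q_value(next_state, [], 0, a, w)
                                          for a in range(N_ACTIONS))
        errors.append(target - q_i)
    return errors

rng = np.random.default_rng(0)
w = rng.normal(size=(N_AGENTS, N_ACTIONS, FEAT_DIM))
s, s_next = rng.normal(size=STATE_DIM), rng.normal(size=STATE_DIM)
joint_action = sequential_argmax(s, w)
deltas = td_errors(s, s_next, reward=1.0, gamma=0.99, w=w)
```

This is how the sequential process reduces the multi-agent MDP to a single-agent one: the joint step unrolls into per-agent sub-steps, each with a well-defined, stationary target.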