Mean-field games (MFGs) are a modeling framework for systems with a large number of interacting agents. They have applications in economics, finance, and game theory. Normalizing flows (NFs) are a family of deep generative models that compute data likelihoods through an invertible mapping, typically parameterized by neural networks. They are useful for density modeling and data generation. While both models have been actively researched, little work has noted the relationship between the two. In this work, we unravel the connections between MFGs and NFs by contextualizing the training of an NF as solving an MFG. This is achieved by reformulating the MFG problem in terms of agent trajectories and parameterizing a discretization of the resulting MFG with flow architectures. With this connection, we explore two research directions. First, we employ expressive NF architectures to accurately solve high-dimensional MFGs, sidestepping the curse of dimensionality that afflicts traditional numerical methods. Compared with other deep learning approaches, our trajectory-based formulation encodes the continuity equation in the neural network, resulting in a better approximation of the population dynamics. Second, we regularize the training of NFs with transport costs and demonstrate its effectiveness in controlling the model's Lipschitz bound, resulting in better generalization performance. We present numerical results through comprehensive experiments on a variety of synthetic and real-life datasets.
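The two ingredients the abstract combines can be illustrated concretely. Below is a minimal sketch (an assumed toy setup, not the paper's architecture): a single invertible affine flow layer with the change-of-variables log-likelihood that NFs optimize, plus an L2 transport-cost regularizer of the kind described above, which penalizes how far the flow moves each sample and thereby constrains the map's displacement.

```python
import numpy as np

def flow_forward(x, a, b):
    """Invertible affine flow f(x) = a*x + b (requires a != 0)."""
    return a * x + b

def log_likelihood(x, a, b):
    """Change of variables: log p_X(x) = log p_Z(f(x)) + log|det df/dx|,
    with a standard normal base density p_Z."""
    z = flow_forward(x, a, b)
    log_base = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)
    log_det = np.log(np.abs(a))  # Jacobian of the affine map is constant
    return log_base + log_det

def transport_cost(x, a, b):
    """L2 transport cost: mean squared displacement of the samples."""
    return np.mean((flow_forward(x, a, b) - x) ** 2)

def regularized_loss(x, a, b, lam=0.1):
    """Negative log-likelihood plus transport-cost regularization."""
    return -np.mean(log_likelihood(x, a, b)) + lam * transport_cost(x, a, b)

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=1000)

# The identity map (a=1, b=0) has zero transport cost but poor fit; shifting
# the data onto the base distribution (a=1, b=-2) trades transport cost for
# a much better likelihood, lowering the regularized loss overall.
print(regularized_loss(x, 1.0, 0.0), regularized_loss(x, 1.0, -2.0))
```

In the trajectory-based view, this displacement penalty plays the role of the MFG's transport cost, and the regularization weight `lam` (a hypothetical name here) controls the trade-off between data fit and the smoothness of the learned map.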