Federated learning methods, that is, methods that train models on data distributed across different sources without the data ever leaving those sources, are of increasing interest in a number of fields. However, despite this interest, the classes of models for which easily applicable and sufficiently general approaches are available are limited, excluding many structured probabilistic models. We present a general yet elegant resolution to this issue. The approach is based on adapting structured variational inference, a technique widely used in Bayesian machine learning, to the federated setting. Additionally, we explore a communication-efficient variant analogous to the canonical FedAvg algorithm. The effectiveness of the proposed algorithms is demonstrated, and their performance is compared, on Bayesian multinomial regression, topic modelling, and mixed model examples.