Making predictions robust is an important challenge. A separate challenge in federated learning (FL) is to reduce the number of communication rounds, particularly since doing so typically degrades performance in heterogeneous data settings. To tackle both issues, we take a Bayesian perspective on the problem of learning a global model. We show how the global predictive posterior can be approximated using client predictive posteriors. This is unlike other works, which aggregate the local model-space posteriors into a global model-space posterior and are susceptible to high approximation errors due to the posterior's high-dimensional, multimodal nature. In contrast, our method performs the aggregation on the predictive posteriors, which are typically easier to approximate owing to the low dimensionality of the output space. We present an algorithm based on this idea, which performs MCMC sampling at each client to obtain an estimate of the local posterior, and then aggregates these in one round to obtain a global ensemble model. Through empirical evaluation on several classification and regression tasks, we show that, despite using only one round of communication, the method is competitive with other FL techniques and outperforms them in heterogeneous settings. The code is publicly available at https://github.com/hasanmohsin/FedPredSpace_1Round.
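The pipeline the abstract describes can be made concrete: each client runs MCMC over its own model parameters, turns the resulting samples into a predictive distribution, and the server combines those predictive distributions in a single communication round. The sketch below illustrates this with a toy two-feature logistic model, a random-walk Metropolis sampler, and a uniform mixture as the server-side aggregation rule; all of these concrete choices (the model, the sampler, the mixture rule, and sending raw predictive probabilities rather than a distilled model) are illustrative assumptions for exposition, not the repository's actual implementation.

```python
# Minimal sketch of one-round predictive-space aggregation (illustrative
# assumptions throughout; the repo's model, sampler, and aggregation rule
# may differ).
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(theta, X, y, prior_var=10.0):
    """Unnormalized log posterior for a logistic model: N(0, prior_var)
    prior on the weights plus the Bernoulli log likelihood of local data."""
    logits = X @ theta
    loglik = np.sum(y * logits - np.logaddexp(0.0, logits))
    logprior = -0.5 * theta @ theta / prior_var
    return loglik + logprior

def metropolis(X, y, n_samples=2000, step=0.3):
    """Client-side MCMC: random-walk Metropolis over the model parameters."""
    theta = np.zeros(X.shape[1])
    lp = log_posterior(theta, X, y)
    samples = []
    for _ in range(n_samples):
        prop = theta + step * rng.normal(size=theta.shape)
        lp_prop = log_posterior(prop, X, y)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples[n_samples // 2:])  # discard burn-in half

def client_predictive(samples, X_test):
    """Local predictive posterior: average p(y=1 | x, theta) over MCMC draws,
    i.e. the Monte Carlo estimate of the client's predictive distribution."""
    probs = 1.0 / (1.0 + np.exp(-(X_test @ samples.T)))  # (n_test, n_samples)
    return probs.mean(axis=1)

# Toy federation: 3 clients with heterogeneous (shifted) local data.
X_test = rng.normal(size=(5, 2))
client_preds = []
for k in range(3):
    Xk = rng.normal(loc=k - 1.0, size=(50, 2))
    yk = (Xk.sum(axis=1) + 0.5 * rng.normal(size=50) > 0).astype(float)
    client_preds.append(client_predictive(metropolis(Xk, yk), X_test))

# One communication round: each client ships its predictive probabilities;
# the server takes a uniform mixture as one simple aggregation rule.
global_pred = np.mean(np.stack(client_preds), axis=0)
print("global p(y=1 | x) on test points:", np.round(global_pred, 3))
```

Because the aggregation happens on per-input class probabilities rather than on high-dimensional weight posteriors, the server-side step stays low-dimensional regardless of model size, which is the key property the abstract highlights.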