We develop the theory and algorithmic toolbox for networked federated learning in decentralized collections of local datasets with an intrinsic network structure. This network structure arises from domain-specific notions of similarity between local datasets; such notions of similarity can be induced by spatio-temporal proximity, statistical dependencies, or functional relations. Our main conceptual contribution is to formulate networked federated learning as generalized total variation (GTV) minimization. This formulation unifies and considerably extends existing federated multi-task learning methods. It is highly flexible and can be combined with a broad range of parametric models, ranging from the Lasso to deep neural networks. Our main algorithmic contribution is a novel networked federated learning algorithm that is well suited for distributed computing environments such as edge computing over wireless networks. This algorithm is robust against inexact computations arising from limited computational resources. For local models that result in convex problems, we derive precise conditions on the local models and their network structure under which our algorithm learns nearly optimal local models. Our analysis reveals an interesting interplay between the convex geometry of the local models and the (cluster) geometry of their network structure.
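To make the conceptual contribution concrete, the GTV minimization referred to above can be sketched as follows. This is a minimal illustration, assuming an empirical graph $\mathcal{G} = (\mathcal{V}, \mathcal{E})$ with edge weights $A_{i,j}$, local loss functions $L_i$, and local model parameters $\mathbf{w}^{(i)}$; the notation here is illustrative and not necessarily the paper's exact symbols:
\[
\min_{\mathbf{w}^{(1)},\dots,\mathbf{w}^{(n)}} \; \sum_{i \in \mathcal{V}} L_i\big(\mathbf{w}^{(i)}\big) \;+\; \lambda \sum_{\{i,j\} \in \mathcal{E}} A_{i,j}\, \phi\big(\mathbf{w}^{(i)} - \mathbf{w}^{(j)}\big)
\]
The first term is the sum of local empirical risks; the second is a GTV penalty that couples the parameters of connected local datasets, with $\phi$ some discrepancy measure (e.g., $\|\cdot\|_2$ or $\|\cdot\|_2^2$) and $\lambda > 0$ trading off local fit against parameter similarity across the network. Large edge weights $A_{i,j}$ push similar local datasets toward nearly identical local models, which is the mechanism underlying the interplay with the cluster geometry of the network.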