The design of decentralized learning algorithms is increasingly important in a fast-growing world in which data are distributed over participants with limited local computation and communication resources. In this direction, we propose an online algorithm that minimizes non-convex loss functions aggregated from individual data/models distributed over a network. We provide a theoretical performance guarantee for our algorithm and demonstrate its utility on a real-life smart building.
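To make the setting concrete, the following is a minimal sketch of one common family of decentralized online methods: each node takes a gradient step on its own local (non-convex) loss and then averages its model with its neighbors via a doubly stochastic mixing matrix. All names, the ring topology, the step size, and the specific losses below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

# Hedged sketch of decentralized online gradient descent with gossip
# averaging (illustrative; not the paper's specific algorithm).
rng = np.random.default_rng(0)
n_nodes, dim, T = 4, 2, 200
step = 0.05

# Ring topology: each node mixes with its two neighbors.
# W is doubly stochastic, so the network-average model follows plain
# gradient descent on the aggregate loss.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

# Each node's local loss is smooth but non-convex (assumed for illustration):
# f_i(x) = ||x - c_i||^2 + 0.1 * sum(sin(3 x)).
centers = rng.normal(size=(n_nodes, dim))

def local_grad(x, c):
    # Gradient of the illustrative non-convex local loss f_i.
    return 2.0 * (x - c) + 0.3 * np.cos(3.0 * x)

X = rng.normal(size=(n_nodes, dim))  # one model per node
for t in range(T):
    grads = np.stack([local_grad(X[i], centers[i]) for i in range(n_nodes)])
    # Local gradient step on each node's own loss, then one gossip round.
    X = W @ (X - step * grads)

# How far the nodes are from consensus after T rounds.
consensus_gap = np.max(np.linalg.norm(X - X.mean(axis=0), axis=1))
```

The gossip step only requires each node to communicate with its immediate neighbors, which is what makes such schemes attractive when communication is the bottleneck; the consensus gap shrinks at a rate governed by the spectral gap of `W`.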