In this paper we investigate the limit performance of Floating Gossip, a new, fully distributed Gossip Learning scheme that relies on Floating Content to implement location-based probabilistic evolution of machine learning models in an infrastructure-less manner. We consider dynamic scenarios in which continuous learning is necessary, and we adopt a mean field approach to characterize the limit performance of Floating Gossip in terms of the amount of data that users can incorporate into their models, as a function of the main system parameters. Unlike existing approaches, which analyze and optimize either the communication or the computing aspects of Gossip Learning, our approach accounts for the compound impact of both. We validate our results through detailed simulations, which confirm their good accuracy. Our model shows that Floating Gossip can be very effective in implementing continuous, cooperative training and updating of machine learning models, based on opportunistic exchanges among moving users.