Federated learning makes it possible to learn over heterogeneous user data in a distributed manner while preserving user privacy. However, its current client selection technique is a source of bias because it discriminates against slow clients. To begin with, it selects only clients that satisfy certain network- and system-specific criteria, excluding slow clients from the outset. Even when such clients are included in a training round, they either straggle the round or are dropped from it altogether for being too slow. Our proposed idea seeks a sweet spot between fast convergence and heterogeneity through smarter client selection and scheduling techniques, illustrated by the sketch below.
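As a minimal, hypothetical sketch of what such a policy could look like, the sampler below reserves a fixed share of each round for slow clients instead of filtering them out. The function name, the `est_round_time` field, and the 80/20 fast/slow split are illustrative assumptions, not the actual selection algorithm proposed here.

```python
import random

def select_clients(clients, num_selected, slow_fraction=0.2, rng=random):
    """Hypothetical fairness-aware sampler: reserve part of each round
    for slow clients rather than excluding them entirely.

    `clients` is a list of dicts such as {"id": 3, "est_round_time": 42.0},
    where `est_round_time` is an assumed per-client latency estimate.
    """
    # Rank clients by estimated time to finish a round (fastest first).
    ranked = sorted(clients, key=lambda c: c["est_round_time"])
    cutoff = int(len(ranked) * 0.8)          # assumption: top 80% count as "fast"
    fast, slow = ranked[:cutoff], ranked[cutoff:]

    # Reserve a quota of the round for slow clients, fill the rest with fast ones.
    num_slow = min(len(slow), int(num_selected * slow_fraction))
    num_fast = min(len(fast), num_selected - num_slow)

    # Mostly fast clients keep the round short; the slow quota keeps their
    # (possibly distinct) data represented in training.
    return rng.sample(fast, num_fast) + rng.sample(slow, num_slow)
```

Under this kind of scheme, the scheduler trades a small amount of per-round speed for coverage of slow clients' data, which is the convergence/heterogeneity trade-off the abstract refers to.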