In this paper, we consider the federated learning (FL) problem in the presence of communication errors. We model the link between the devices and the central node (CN) as a packet erasure channel, where the local parameters from the devices are either erased or received correctly by the CN with probability $e$ and $1-e$, respectively. We provide a mathematical proof of the convergence of the FL algorithm in the presence of communication errors, where the CN reuses past local updates when fresh updates are not received from some devices. We show via simulations that, by reusing past local updates, the FL algorithm can converge in the presence of communication errors. We also show that when the dataset is uniformly distributed among devices, the FL algorithm that uses only fresh updates and discards missing updates may converge faster than the FL algorithm that reuses past local updates.
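The aggregation rule described above can be illustrated with a minimal sketch (not the authors' code): with erasure probability $e$ on each device-to-CN link, the CN substitutes a device's most recently received local update whenever the fresh one is lost. The function name, the per-device cache, and the plain averaging aggregator are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate_with_memory(fresh_updates, last_received, e):
    """One FL aggregation round at the CN over a packet erasure channel.

    fresh_updates: list of local parameter vectors sent by the devices this round.
    last_received: per-device cache of the last update the CN actually received.
    e: erasure probability of each device-to-CN link.
    """
    used = []
    for k, update in enumerate(fresh_updates):
        if rng.random() < e:
            # Packet erased with probability e: fall back to the stale update.
            used.append(last_received[k])
        else:
            # Received correctly with probability 1 - e: refresh the cache.
            last_received[k] = update
            used.append(update)
    # Global update as a simple average over all devices (illustrative choice).
    return np.mean(used, axis=0), last_received
```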