Recently, blockchain-based federated learning (BFL) has attracted intensive research attention because its training process is auditable and its serverless architecture avoids the single point of failure of the parameter server in vanilla federated learning (VFL). Nevertheless, BFL tremendously escalates the communication traffic volume because all local model updates (i.e., changes of model parameters) obtained by BFL clients are transmitted to all miners for verification and to all clients for aggregation. In contrast, the parameter server and clients in VFL only retain aggregated model updates. Consequently, the heavy communication traffic in BFL inevitably impairs training efficiency and hinders the deployment of BFL in practice. To improve the practicality of BFL, we are among the first to propose BCFL, a fast and communication-efficient blockchain-based federated learning framework that compresses the communications in BFL. Meanwhile, we derive the convergence rate of BCFL with non-convex loss. To maximize the final model accuracy, we further formulate a problem that minimizes the training loss implied by the convergence rate, subject to a limited training time, with respect to the compression rate and the block generation rate; this is a bi-convex optimization problem and can be solved efficiently. Finally, to demonstrate the efficiency of BCFL, we carry out extensive experiments on the standard CIFAR-10 and FEMNIST datasets. The experimental results not only verify the correctness of our analysis but also show that BCFL remarkably reduces the communication traffic by 95-98% or shortens the training time by 90-95% compared with BFL.
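The abstract does not specify which compressor BCFL uses; purely as a minimal sketch of why compressing local model updates shrinks the traffic broadcast to every miner and client, the snippet below assumes a top-k sparsification compressor whose compression rate controls the fraction of parameters transmitted per round. The function names and the choice of top-k are illustrative assumptions, not the paper's method.

```python
import numpy as np

def topk_compress(update: np.ndarray, compression_rate: float):
    """Keep only the largest-magnitude fraction `compression_rate` of entries.

    Returns (indices, values), so only the retained coordinates need to be
    broadcast to miners and clients, which is the source of the traffic savings.
    """
    flat = update.ravel()
    k = max(1, int(compression_rate * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the k largest magnitudes
    return idx, flat[idx]

def topk_decompress(idx, values, shape):
    """Rebuild a dense update with zeros everywhere except the retained entries."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# Example: a 1M-parameter local update compressed at rate 0.05 (keep 5% of entries),
# roughly the regime in which a 95%+ traffic reduction would be expected.
rng = np.random.default_rng(0)
update = rng.normal(size=1_000_000)
idx, vals = topk_compress(update, 0.05)
print(f"transmitted entries: {idx.size} / {update.size}")
```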
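Likewise, the bi-convex structure of the problem over the compression rate and the block generation rate can be exploited by alternating convex minimization: fix one variable, solve the convex subproblem in the other, and repeat. The surrogate objective below is a hypothetical, separable stand-in (the actual convergence bound and the training-time constraint come from the paper's analysis and are not given in the abstract); the box bounds on each variable stand in for that constraint here.

```python
from scipy.optimize import minimize_scalar

def bound(gamma, lam):
    # Hypothetical stand-in for the convergence-rate bound, convex in each
    # argument separately: gamma = compression rate, lam = block generation rate.
    return 1.0 / (gamma + 1e-3) + 0.5 * lam**2 + 2.0 / (lam + 1e-3) + gamma

def alternating_minimization(gamma=0.5, lam=1.0, rounds=20):
    """Alternately minimize the surrogate bound over each variable with the other fixed."""
    for _ in range(rounds):
        gamma = minimize_scalar(lambda g: bound(g, lam),
                                bounds=(1e-3, 1.0), method="bounded").x
        lam = minimize_scalar(lambda l: bound(gamma, l),
                              bounds=(1e-3, 10.0), method="bounded").x
    return gamma, lam

print(alternating_minimization())
```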