Cross-device Federated Learning (FL) faces a significant challenge: low-end clients that could potentially make unique contributions are excluded from training large models due to their resource bottlenecks. Recent research efforts have focused on model-heterogeneous FL, which extracts reduced-size models from the global model and assigns them to local clients according to their capacities. Despite its empirical success, a general theoretical guarantee of convergence for this approach remains an open question. In this paper, we present a unifying framework for heterogeneous FL algorithms with online model extraction and provide a general convergence analysis. In particular, we prove that under certain sufficient conditions, and for both IID and non-IID data, these algorithms converge to a stationary point of standard FL for general smooth cost functions. Moreover, we identify two key factors governing convergence, the model-extraction noise and the minimum coverage index, and advocate a joint design of local model extraction for efficient heterogeneous FL.
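To make the setting concrete, here is a minimal sketch of one round of heterogeneous FL with masked model extraction. It is illustrative only: the random masking, the helper names `extract_submodel` and `aggregate`, and the toy local update are assumptions, not the paper's algorithm; the framework described above covers arbitrary online extraction schemes. The sketch does show the two quantities the analysis highlights: dropped coordinates act as model-extraction noise, and the per-coordinate coverage count yields the minimum coverage index.

```python
import numpy as np

def extract_submodel(global_params, keep_ratio, rng):
    """Sample a binary mask keeping a fraction of the global parameters.

    Masked-out entries are what the analysis treats as model-extraction
    noise. (Illustrative random masking; real schemes may be structured.)
    """
    mask = rng.random(global_params.shape) < keep_ratio
    return global_params * mask, mask

def aggregate(global_params, client_updates, client_masks):
    """Average each coordinate over only the clients whose submodel covers it."""
    coverage = np.sum(client_masks, axis=0)            # per-coordinate coverage
    summed = np.sum(client_updates * client_masks, axis=0)
    covered = coverage > 0
    new_params = global_params.copy()
    new_params[covered] = summed[covered] / coverage[covered]
    return new_params, int(coverage.min())             # minimum coverage index

rng = np.random.default_rng(0)
w = rng.normal(size=10)                                # toy global model
subs, masks = zip(*(extract_submodel(w, r, rng) for r in (0.5, 0.8, 1.0)))
# Pretend each client took one local SGD step on its own submodel:
updates = np.stack([s - 0.1 * rng.normal(size=w.shape) * m
                    for s, m in zip(subs, masks)])
w_new, min_cover = aggregate(w, updates, np.stack(masks).astype(float))
print(min_cover)  # how many clients cover the least-covered coordinate
```

Under the sufficient conditions referenced above, a larger minimum coverage index means every coordinate of the global model is updated by more clients per round, which intuitively is what keeps the aggregate trajectory close to standard FL.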