Fully decentralised federated learning enables the collaborative training of individual machine learning models on devices distributed across a communication network while keeping the training data localised. This approach enhances data privacy and eliminates both the single point of failure and the need for central coordination. Our research highlights that the effectiveness of decentralised federated learning is significantly influenced by the network topology of the connected devices. We propose a strategy for uncoordinated initialisation of the artificial neural networks that leverages the distribution of eigenvector centralities of the nodes of the underlying communication network, leading to radically improved training efficiency. Additionally, our study explores the scaling behaviour and the choice of environmental parameters under our proposed initialisation strategy. This work paves the way for more efficient and scalable artificial neural network training in a distributed and uncoordinated environment, offering a deeper understanding of the intertwining roles of network structure and learning dynamics.
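To make the centrality-aware initialisation concrete, below is a minimal Python sketch, not the paper's exact method: it computes the eigenvector centrality of each node in the communication graph and, as an illustrative assumption, scales the standard deviation of each device's initial weights by its centrality relative to the mean. The helper name `centrality_scaled_init`, the proportional scaling rule, the toy linear model, and the Watts-Strogatz topology are all hypothetical choices made for illustration.

```python
# Hypothetical sketch of centrality-aware uncoordinated initialisation.
# The exact scaling rule used in the paper is not reproduced here.
import networkx as nx
import torch.nn as nn

def centrality_scaled_init(graph: nx.Graph, base_std: float = 0.02):
    """Return one freshly initialised model per node, with the initial
    weight spread scaled by that node's eigenvector centrality
    (an assumed, illustrative scaling choice)."""
    centrality = nx.eigenvector_centrality(graph, max_iter=1000)
    mean_c = sum(centrality.values()) / len(centrality)
    models = {}
    for node, c in centrality.items():
        model = nn.Linear(32, 10)  # stand-in for each device's network
        std = base_std * (c / mean_c)  # more central node -> wider init (assumption)
        nn.init.normal_(model.weight, mean=0.0, std=std)
        nn.init.zeros_(model.bias)
        models[node] = model
    return models

# Example: initialise models over a small-world communication topology.
models = centrality_scaled_init(nx.watts_strogatz_graph(n=20, k=4, p=0.1))
```

The key point of the sketch is that each device can compute its own initialisation from locally available centrality information, so no central coordinator is needed.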