Vertical distributed learning exploits the local features collected by multiple learning workers to form a better global model. However, the exchange of data between the workers and the model aggregator for parameter training incurs a heavy communication burden, especially when the learning system is built upon capacity-constrained wireless networks. In this paper, we propose a novel hierarchical distributed learning framework, in which each worker separately learns a low-dimensional embedding of its locally observed data. The workers then perform communication-efficient distributed max-pooling to transmit the synthesized input to the aggregator. For data exchange over a shared wireless channel, we propose an opportunistic carrier-sensing-based protocol that implements the max-pooling operation over the output data of all the learning workers. Our simulation experiments show that the proposed framework achieves almost the same model accuracy as a model trained on the concatenation of all the raw worker outputs, while requiring a communication load that is independent of the number of workers.
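The abstract does not spell out the carrier-sensing mechanics, but the standard opportunistic carrier-sensing idea it invokes can be illustrated with a minimal sketch: each worker maps its local output value to a backoff delay that decreases monotonically with the value, so the worker holding the maximum seizes the channel first and every other worker, sensing a busy channel, stays silent. The names below (`T_MAX`, `backoff_delay`, `opportunistic_max`) and the value range are illustrative assumptions, not details from the paper.

```python
import numpy as np

T_MAX = 1.0               # contention-window length (arbitrary time units)
V_MIN, V_MAX = 0.0, 1.0   # assumed dynamic range of the embedding values

def backoff_delay(v: float) -> float:
    """Monotonically decreasing map from a worker's value to its backoff time."""
    return T_MAX * (V_MAX - v) / (V_MAX - V_MIN)

def opportunistic_max(values: np.ndarray) -> float:
    """One contention round: the earliest backoff wins the channel.

    Only the winner transmits, so exactly one value is sent per round,
    regardless of how many workers participate.
    """
    delays = np.array([backoff_delay(v) for v in values])
    winner = int(np.argmin(delays))   # first worker to sense the channel idle
    return float(values[winner])      # the transmitted value equals the max

# Example: 20 workers contend; the protocol recovers the max with one transmission.
workers = np.random.uniform(V_MIN, V_MAX, size=20)
assert np.isclose(opportunistic_max(workers), workers.max())
```

For a vector-valued embedding, one contention round per coordinate suffices, so the total communication load scales with the embedding dimension rather than the number of workers, consistent with the abstract's claim.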