In this work, we study the problem of distributed mean estimation under a $1$-bit communication constraint when the variance is unknown. We focus on the setting where each user observes one i.i.d. sample drawn from a distribution belonging to a location-scale family and may send only a single bit to a central server, whose goal is to estimate the mean. We propose simple non-adaptive and adaptive protocols and show that they are asymptotically normal. We derive bounds on the asymptotic (in the number of users) Mean Squared Error (MSE) achieved by these protocols. For a class of symmetric log-concave distributions, we derive matching lower bounds on the MSE achievable by adaptive protocols, proving the optimality of our scheme. Furthermore, by means of a refined squared Hellinger distance analysis, we develop a lower bound on the MSE of non-adaptive protocols that applies to any symmetric strictly log-concave distribution. As a consequence, we show that for many common distributions, including a subclass of the generalized Gaussian family, the asymptotic minimax MSE achieved by the best non-adaptive protocol is higher than that achieved by our simple adaptive protocol. Our simulation results confirm a positive gap between the adaptive and non-adaptive settings, in agreement with the theoretical bounds.
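To make the "one bit per user" setting concrete, the sketch below simulates two simple schemes for Gaussian data. It is not the paper's protocol, only a minimal illustration under assumed choices: a non-adaptive scheme in which each user compares its sample to one of two fixed thresholds and the server inverts the Gaussian CDF to recover both the unknown mean and scale, and an adaptive scheme in which the server maintains a running threshold that each successive user's single bit refines. The function names `nonadaptive_estimate` and `adaptive_estimate`, the thresholds, and the step-size choice are all hypothetical.

```python
# Illustrative simulation of 1-bit distributed mean estimation for a Gaussian
# location-scale family. This is NOT the paper's protocol -- it is a minimal
# sketch, assuming Gaussian data, meant only to make the setting concrete.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def nonadaptive_estimate(x, t0=-1.0, t1=1.0):
    """Non-adaptive sketch: each user compares its sample to one of two fixed
    thresholds and sends a single bit; the server inverts the Gaussian CDF at
    both thresholds to recover (mu, sigma)."""
    n = len(x)
    p0 = (x[: n // 2] > t0).mean()          # fraction of 1-bits at threshold t0
    p1 = (x[n // 2 :] > t1).mean()          # fraction of 1-bits at threshold t1
    # Clip to avoid infinite quantiles when a frequency hits 0 or 1.
    p0, p1 = np.clip([p0, p1], 1e-6, 1 - 1e-6)
    z0, z1 = norm.ppf(1 - p0), norm.ppf(1 - p1)   # (t - mu)/sigma at each threshold
    sigma_hat = (t1 - t0) / (z1 - z0)
    mu_hat = t0 - sigma_hat * z0
    return mu_hat

def adaptive_estimate(x, step=2.0):
    """Adaptive sketch: users are processed in sequence; user i sends the bit
    1[X_i > theta], where theta is a running threshold updated by the server
    (a Robbins-Monro style recursion tracking the median, which equals the
    mean for symmetric distributions)."""
    theta = 0.0
    for i, xi in enumerate(x, start=1):
        bit = float(xi > theta)              # the single transmitted bit
        theta += (step / i) * (bit - 0.5)    # server-side update; no raw samples needed
    return theta

mu, sigma, n = 0.3, 2.0, 200_000
x = rng.normal(mu, sigma, size=n)
print("non-adaptive:", nonadaptive_estimate(x))
print("adaptive:    ", adaptive_estimate(x))
```

In this toy version the adaptive server exploits the bits it has already received to place the next comparison near the current mean estimate, which is the qualitative advantage the paper quantifies; the actual protocols, their estimators, and the MSE guarantees are developed in the body of the paper.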