One-Bit Distributed Mean Estimation with Unknown Variance

TMLR Paper 6155 Authors

09 Oct 2025 (modified: 16 Jan 2026) · Decision pending for TMLR · CC BY 4.0
Abstract: In this work, we study the problem of distributed mean estimation with $1$-bit communication constraints when the variance is unknown. We focus on the setting where each user has access to one i.i.d. sample drawn from a distribution belonging to a \emph{location–scale family}, and is limited to sending just a single bit of information to a central server whose goal is to estimate the mean. We propose simple non-adaptive and adaptive protocols and show that both achieve asymptotic normality. We derive bounds on the asymptotic (in the number of users) Mean Squared Error (MSE) achieved by these protocols. For a class of symmetric log-concave distributions, we derive matching lower bounds for the MSE of adaptive protocols, establishing the optimality of our scheme. Furthermore, we develop a lower bound on the MSE for non-adaptive protocols that applies to any symmetric strictly log-concave distribution, using a refined squared Hellinger distance analysis. Through this, we show that for many common distributions, including a subclass of the generalized Gaussian family, the asymptotic minimax MSE achieved by the best non-adaptive protocol is strictly larger than that achieved by our simple adaptive protocol. We also demonstrate that increasing the number of bits per user can only marginally reduce the asymptotic MSE of adaptive protocols. Our simulation results confirm a positive gap between the adaptive and non-adaptive settings, aligning with the theoretical bounds.
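To illustrate the flavor of a non-adaptive one-bit protocol in a location–scale family, the sketch below simulates a simple two-threshold scheme under a Gaussian model: each user sends the single bit $\mathbf{1}\{X_i > \tau\}$ for one of two fixed thresholds, and the server inverts the empirical bit frequencies through the Gaussian quantile function to recover both the mean and the scale. This is a minimal, hypothetical construction for intuition only; the thresholds `tau1`, `tau2` and the estimator are illustrative assumptions, not the paper's proposed protocol or its optimized parameters.

```python
import random
from statistics import NormalDist

def one_bit_mean_estimate(samples, tau1=-1.0, tau2=1.0):
    """Illustrative two-threshold, non-adaptive 1-bit scheme (Gaussian model).

    Half the users compare their sample to tau1, half to tau2; each sends
    one bit. The server solves for (mu, sigma) from the two bit frequencies.
    """
    n = len(samples)
    half = n // 2
    # Each user transmits a single bit 1{X_i > tau}.
    bits1 = [x > tau1 for x in samples[:half]]
    bits2 = [x > tau2 for x in samples[half:2 * half]]
    p1 = sum(bits1) / half
    p2 = sum(bits2) / half
    # Clamp away from 0 and 1 so the quantile inverse stays finite.
    eps = 1.0 / half
    p1 = min(max(p1, eps), 1 - eps)
    p2 = min(max(p2, eps), 1 - eps)
    nd = NormalDist()
    # P(X > tau) = 1 - Phi((tau - mu)/sigma)  =>  (tau - mu)/sigma = Phi^{-1}(1 - p)
    z1 = nd.inv_cdf(1 - p1)
    z2 = nd.inv_cdf(1 - p2)
    # tau1 = mu + sigma*z1, tau2 = mu + sigma*z2: solve the 2x2 linear system.
    sigma_hat = (tau2 - tau1) / (z2 - z1)
    mu_hat = tau1 - sigma_hat * z1
    return mu_hat, sigma_hat

random.seed(0)
mu, sigma = 0.3, 1.5  # unknown to the server in the actual problem
samples = [random.gauss(mu, sigma) for _ in range(200_000)]
mu_hat, sigma_hat = one_bit_mean_estimate(samples)
```

An adaptive protocol could instead place later users' thresholds near the mean estimate formed from earlier users' bits, which is the kind of refinement whose asymptotic MSE advantage the paper quantifies.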
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Yingbin_Liang1
Submission Number: 6155