Abstract: The mean of an unknown variance-σ² distribution f can be estimated from n samples with variance σ²/n and nearly corresponding subgaussian rate. When f is known up to translation, this can be improved asymptotically to 1/(nI), where I is the Fisher information of the distribution. Such an improvement is not possible for general unknown f, but [Stone, 1975] showed that this asymptotic convergence is possible if f is symmetric about its mean. Stone's bound is asymptotic, however: the n required for convergence depends in an unspecified way on the distribution f and the failure probability δ. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every f, n, δ with n > log(1/δ), we get convergence close to a subgaussian with variance 1/(nI_r), where I_r is the r-smoothed Fisher information with smoothing radius r that decays polynomially in n. Such a bound essentially matches the finite-sample guarantees in the known-f setting.
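As a rough illustration of the claimed rate (a sketch of the general shape of such a subgaussian-type guarantee with variance proxy 1/(nI_r), not the paper's exact statement; the slack term ε_n and the decay exponent c are placeholders introduced here), the bound can be read as:

% Illustrative only: ε_n denotes a slack term vanishing with n, and c > 0
% a constant governing the polynomial decay of the smoothing radius r;
% both are assumptions for this sketch, not quantities stated in the abstract.
\Pr\!\left[\, |\hat{\mu}_n - \mu| \;\ge\; (1+\varepsilon_n)\sqrt{\frac{2\log(2/\delta)}{n\, I_r}} \,\right] \;\le\; \delta,
\qquad r = n^{-c},

where \hat{\mu}_n is the estimate built from the n samples and μ is the true mean; a variance-1/(nI_r) subgaussian tail of this form recovers the classical σ²/n rate when I_r is close to 1/σ², and improves on it whenever the smoothed Fisher information is larger.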