Abstract: We obtain a lower bound on the loss of any algorithm predicting finite-dimensional distributions (i.e., points of a simplex) under Kullback-Leibler loss. The bound is stated relative to the class of softmax linear predictors. We then show that it is asymptotically matched by the Bayesian universal algorithm.
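For reference, a minimal sketch of the two objects the abstract names, assuming the usual definitions (the symbols $p$, $q$, $k$, $W$, and $x$ below are illustrative and not taken from the paper): the Kullback-Leibler loss incurred by predicting $q$ when the target distribution is $p$, and a softmax linear predictor driven by a weight matrix $W = (w_1, \dots, w_k)$ applied to side information $x$.

```latex
% KL loss of a prediction q against a target p, both in the simplex \Delta_k
% (standard definition; assumed, not quoted from the paper)
D(p \,\|\, q) \;=\; \sum_{i=1}^{k} p_i \log \frac{p_i}{q_i},
\qquad
% softmax linear predictor with rows w_1,\dots,w_k of W acting on features x
q_i(x; W) \;=\; \frac{\exp\!\left(w_i^{\top} x\right)}{\sum_{j=1}^{k} \exp\!\left(w_j^{\top} x\right)},
\quad i = 1, \dots, k.
```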