Abstract: Robust forecast aggregation combines the predictions of multiple information sources to
perform well in the worst case across all possible information structures. Previous work largely
focuses on settings with a known binary state space, where the state is either 0 or 1. We
study prior-agnostic robust forecast aggregation in which the aggregator observes only experts’
reports, yet is ignorant of both the underlying joint information structure and the full prior,
including the underlying state space. Unlike the standard model that fixes the binary state
space {0, 1}, we allow the (binary) unknown state values to be arbitrary numbers in [0, 1], so
the same reported probability may correspond to very different realized outcome frequencies
across environments.
Our main contribution is a simple, explicit, closed-form log-odds aggregator that linearly
pools forecasts in logit space, together with (nearly-)tight minimax-regret guarantees across
three knowledge regimes. We first show that under conditionally independent (CI) signals,
robust aggregation with an unknown state space is strictly harder than in the known-state
setting by establishing a larger lower bound; moreover, our aggregation rule achieves worst-case
regret of 0.0255. Along the way, we also characterize tight regret bounds for Blackwell-ordered
structures and for general information structures. In the classical setting with known state space
{0, 1}, our aggregator achieves regret strictly below 0.0226 for CI structures. To the best of our
knowledge, this is the first explicit closed-form aggregator that achieves a regret upper bound
strictly less than 0.0226. Finally, we extend the model to allow the aggregator to additionally
know each expert’s marginal forecast distribution; in this setting, under CI structures, we show
that a generalized log-odds rule achieves regret of 0.0228, and we complement this guarantee
with a lower bound of 0.0225.
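The core mechanism of the aggregator, linear pooling in logit space, can be sketched as follows. This is a minimal illustration of the generic technique only: the `logit` clipping constant and the uniform weights are illustrative assumptions, not the paper's regime-specific coefficients.

```python
import math

def logit(p, eps=1e-12):
    # Log-odds transform; clip to (eps, 1 - eps) to avoid infinities at 0 or 1.
    p = min(max(p, eps), 1 - eps)
    return math.log(p / (1 - p))

def log_odds_pool(forecasts, weights=None):
    # Linearly pool the experts' forecasts in logit space, then map the
    # pooled log-odds back to a probability via the logistic function.
    # Uniform weights are a placeholder; the paper's closed-form rule
    # specifies its own coefficients per knowledge regime.
    if weights is None:
        weights = [1.0 / len(forecasts)] * len(forecasts)
    z = sum(w * logit(p) for w, p in zip(weights, forecasts))
    return 1.0 / (1.0 + math.exp(-z))
```

With uniform weights, agreeing experts are left unchanged (e.g. `log_odds_pool([0.8, 0.8])` returns 0.8), while symmetric disagreement pools to 0.5.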