Peirce in the Machine: How Mixture of Experts Models Perform Hypothesis Construction

Published: 03 Sept 2025 · Last Modified: 01 Apr 2026 · Philosophy of Science · CC BY 4.0
Abstract: Mixture of experts is a machine learning method that aggregates the predictions of specialized expert models. It often outperforms Bayesian methods despite the latter's stronger inductive guarantees. We argue that this is due to the greater functional capacity of mixture of experts. We prove that, in a limiting case, mixture of experts has greater capacity than equivalent Bayesian methods, and we confirm this through experiments on non-limiting cases. Finally, we conclude that mixture of experts is a form of abductive reasoning in the Peircean sense of hypothesis construction.
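As a toy illustration of the contrast the abstract draws, the sketch below compares input-dependent gating (mixture of experts) with fixed-weight averaging (Bayesian model averaging). The expert functions and the gate are hypothetical examples chosen for illustration, not the paper's actual models; the point is only that the gate's weights vary with the input while the Bayesian weights do not.

```python
import numpy as np

# Two hypothetical "experts", each accurate on one half of the input space.
def expert_neg(x):
    # Specialized for x < 0 (illustrative choice).
    return np.sin(x)

def expert_pos(x):
    # Specialized for x >= 0 (illustrative choice).
    return x ** 2

def moe_predict(x):
    # Mixture of experts: a gate assigns INPUT-DEPENDENT weights,
    # so the combined model can route each input to the expert
    # best suited to it.
    g = 1.0 / (1.0 + np.exp(-5.0 * x))  # soft gate toward expert_pos
    return (1.0 - g) * expert_neg(x) + g * expert_pos(x)

def bayes_predict(x, w=(0.5, 0.5)):
    # Bayesian model averaging: weights are posterior probabilities,
    # fixed across all inputs once the data has been seen.
    return w[0] * expert_neg(x) + w[1] * expert_pos(x)
```

For a large positive input the gated mixture tracks the well-suited expert almost exactly, while the fixed average is pulled toward the poorly-suited one; this is the capacity gap the paper formalizes.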