Model Successor Functions

Published: 2025 · Last Modified: 28 Jan 2026 · CoRR 2025 · CC BY-SA 4.0
Abstract: The notion of generalization has moved away from the classical one defined in statistical learning theory towards an emphasis on out-of-domain generalization (OODG). There has been a growing focus on generalization from easy to hard, where a progression of difficulty implicitly governs the direction of domain shifts. This emerging regime has appeared in the literature under different names, such as length, logical, or algorithmic extrapolation, but a formal definition is lacking. We argue that the unifying theme is induction: based on finite samples observed in training, a learner should infer an inductive principle that applies in an unbounded manner. This work formalizes the notion of inductive generalization along a difficulty progression and argues that progress requires transforming the learning paradigm itself. We take a first step by proposing a novel learning paradigm, Inductive Learning, built around a central concept called model successors. We outline practical steps to adapt well-established techniques towards learning model successors. This work calls for restructuring the research discussion around induction and generalization, moving from fragmented task-centric communities towards a more unified effort focused on universal properties of learning and computation.