Beyond Turing: Topological Closure as a Foundation for Cognitive Computation

ICLR 2026 Conference Submission 13565 Authors

18 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: Cognitive computation; topological closure; cycle formation; memory-amortized inference; order invariance
Abstract: Classical models of computation, epitomized by the Turing machine, are grounded in \emph{enumeration}: syntactic manipulation of discrete symbols according to formal rules. While powerful, such systems are intrinsically vulnerable to Gödelian incompleteness and Turing undecidability, since truth and meaning are sought through potentially endless symbolic rewriting. We propose an alternative foundation for non-enumerative computation based on \emph{topological closure} of semantic structures. In this view, cognition operates by promoting transient fragments into closed cycles, where $\partial^2=0$ ensures that only invariants persist. This shift reframes computation from \emph{syntax} to \emph{structure}: memory and reasoning arise not by enumerating all possibilities, but by stabilizing relational invariants that survive perturbations and generalize across contexts. We formalize this principle through the dot–cycle dichotomy: dots or trivial cycles ($H_0$) serve as high-entropy scaffolds for exploration, while nontrivial cycles ($H_1$ and higher) encode low-entropy invariants that persist as memory. Extending this perspective, we show how Memory-Amortized Inference (MAI) implements an anti-enumerative principle by storing homological equivalence classes rather than symbolic traces, yielding robust generalization, energy efficiency, and structural completeness beyond Turing-style models. We conclude that \emph{topological closure} provides a unifying framework for perception, memory, and action, and a candidate foundation for cognitive computation that transcends the limits of enumeration.
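To make the homological vocabulary of the abstract concrete, here is a minimal sketch, not taken from the submission itself: it builds integer boundary matrices for a small simplicial complex, verifies the closure condition $\partial \circ \partial = 0$ invoked above, and reads off the Betti numbers that separate "dots" ($H_0$, connected components) from "cycles" ($H_1$, loops). The particular complex, the variable names, and the rank-based computation are illustrative assumptions, not the authors' implementation of MAI.

```python
# Illustrative sketch only: boundary matrices and Betti numbers for a toy
# simplicial complex, to ground the H0/H1 ("dot" vs. "cycle") terminology.
import numpy as np

# Complex: vertices 0..3, a loop 0-1-2-0 plus a dangling edge 2-3,
# and one filled triangle (0,1,2) that closes off the loop.
vertices = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
triangles = [(0, 1, 2)]

# d1: boundary of edges (rows = vertices, cols = edges), with d(u,v) = v - u.
d1 = np.zeros((len(vertices), len(edges)), dtype=int)
for j, (u, v) in enumerate(edges):
    d1[u, j] = -1
    d1[v, j] = 1

# d2: boundary of triangles (rows = edges, cols = triangles),
# with d(a,b,c) = (b,c) - (a,c) + (a,b).
edge_index = {e: i for i, e in enumerate(edges)}
d2 = np.zeros((len(edges), len(triangles)), dtype=int)
for j, (a, b, c) in enumerate(triangles):
    d2[edge_index[(b, c)], j] += 1
    d2[edge_index[(a, c)], j] -= 1
    d2[edge_index[(a, b)], j] += 1

# The closure condition from the abstract: boundaries of boundaries vanish.
assert np.all(d1 @ d2 == 0)  # partial^2 = 0

# Betti numbers over Q: beta_k = dim ker(d_k) - dim im(d_{k+1}).
r1, r2 = np.linalg.matrix_rank(d1), np.linalg.matrix_rank(d2)
beta0 = len(vertices) - r1    # H0: one connected component ("dots")
beta1 = len(edges) - r1 - r2  # H1: the filled loop is now a trivial cycle
print(beta0, beta1)           # -> 1 0

# Without the filling 2-simplex (d2 = 0), the loop persists as a nontrivial
# H1 class -- a "low-entropy invariant" in the abstract's terminology.
beta1_open = len(edges) - r1
print(beta1_open)             # -> 1
```

Rank over the rationals suffices for Betti numbers here; recovering torsion in integral homology would require Smith normal form rather than `matrix_rank`.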
Primary Area: learning theory
Submission Number: 13565