Searching Latent Program Spaces

Published: 14 Jun 2025, Last Modified: 19 Jul 2025, ICML 2025 Workshop PRAL Oral, CC BY 4.0
Keywords: Test-Time Compute, Latent Program Search, Deep Learning, Meta-Learning
TL;DR: We introduce the Latent Program Network (LPN), a new neural architecture that builds test-time search directly into inference by learning and searching through a latent space of implicit programs before generating outputs.
Track: Long Paper (up to 9 pages)
Abstract: General intelligence requires systems that acquire new skills efficiently and generalize beyond their training distributions. Program synthesis approaches generalize strongly, but they face scaling issues: their combinatorial search spaces quickly become impractical and require human-designed DSLs or pre-trained priors to narrow the search. Deep learning methods, on the other hand, have been highly successful, but they lack structured test-time adaptation and rely on heavy stochastic sampling or expensive gradient updates for fine-tuning. In this work, we propose the Latent Program Network (LPN), a new architecture that builds test-time search directly into neural models. LPN learns a latent space of implicit programs---neural mappings from inputs to outputs---through which it can search using gradients at test time. LPN combines the adaptability of symbolic approaches with the scalability of neural methods: it searches a compact latent space at test time and bypasses the need for pre-defined domain-specific languages. On a range of programming-by-examples tasks, LPN matches or outperforms in-context learning and test-time training methods. On the ARC-AGI benchmark, we demonstrate that LPN can both learn a compact program space and search through it at test time to adapt to novel tasks, doubling its performance on out-of-distribution tasks when test-time search is switched on.
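To make the mechanism concrete, below is a minimal sketch (not the authors' implementation) of gradient-based test-time search over a latent program vector: an encoder proposes a latent from the demonstration pairs, and that latent is then refined by gradient descent on the decoder's reconstruction loss while the network weights stay frozen. All module names, dimensions, and hyperparameters (io_dim, latent_dim, steps, lr) are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of test-time latent search, assuming a simple MLP encoder/decoder.
import torch
import torch.nn as nn

class LatentProgramNetwork(nn.Module):
    def __init__(self, io_dim=32, latent_dim=16, hidden=64):
        super().__init__()
        # Encoder: maps an (input, output) demonstration pair to a latent program proposal.
        self.encoder = nn.Sequential(nn.Linear(2 * io_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, latent_dim))
        # Decoder: "executes" the latent program z on a new input to produce an output.
        self.decoder = nn.Sequential(nn.Linear(io_dim + latent_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, io_dim))

    def decode(self, x, z):
        # Broadcast the single latent program across all inputs in the batch.
        return self.decoder(torch.cat([x, z.expand(x.shape[0], -1)], dim=-1))

def test_time_search(model, xs, ys, steps=50, lr=0.1):
    """Refine the latent z by gradient descent on reconstruction loss over the
    demonstration pairs (xs, ys); only z is updated, the network stays frozen."""
    with torch.no_grad():
        # Initialize from the encoder's proposals, averaged over demonstrations.
        z = model.encoder(torch.cat([xs, ys], dim=-1)).mean(dim=0, keepdim=True)
    z = z.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model.decode(xs, z), ys)
        loss.backward()
        opt.step()
    return z.detach()

# Usage: adapt to a novel task from a few demonstrations, then predict on a query input.
model = LatentProgramNetwork()
xs, ys = torch.randn(3, 32), torch.randn(3, 32)   # toy demonstration pairs
z_star = test_time_search(model, xs, ys)
y_pred = model.decode(torch.randn(1, 32), z_star)
```

The key design point this sketch mirrors is that adaptation happens in a compact latent space rather than over model weights or a symbolic program space, which is what keeps the test-time search cheap.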
Format: We have read the camera-ready instructions, and our paper is formatted with the provided template.
Supplementary Material: zip
De-Anonymization: This submission has been de-anonymized.
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 20