Automated Architecture Synthesis for Arbitrarily Structured Neural Networks

ICLR 2026 Conference Submission23443 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Neural Network Structure
Abstract: This paper proposes a novel perspective on the architecture of Artificial Neural Networks (ANNs). Conventional ANNs often adopt predefined tree-like or Directed Acyclic Graph (DAG) structures for simplicity; however, these structures limit network collaboration and capability because they lack horizontal and backward communication. In contrast, biological neural systems consist of billions of neural units with highly complex connections, enabling each neuron to establish connections with others according to specific contexts. Inspired by biological neural systems, this study presents a new framework that automatically learns to construct arbitrary graph structures during training. It also introduces the concept of "Neural Modules" to organize neural units, which facilitates communication between any pair of nodes and collaboration across modules. Unlike traditional ANNs that rely on DAGs, the proposed framework evolves from complete graphs, allowing free communication between neurons---mimicking the behavior of biological neural networks. Furthermore, we develop a method to compute over these arbitrary graph structures and a regularization technique that organizes them into multiple independent, balanced Neural Modules. This approach reduces overfitting and enhances efficiency through parallel computation. Overall, our method enables ANNs to learn effective arbitrary structures analogous to biological neural systems. It adapts to various tasks and remains compatible across different scenarios, with experimental results verifying its potential.
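The abstract's two central ideas---starting from a complete graph so signals can flow in any direction, and regularizing connections into balanced "Neural Modules"---can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all names, sizes, and the specific regularizer (cross-module weight mass plus a module-size variance penalty) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, K, T = 12, 3, 4  # units, modules, message-passing steps (hypothetical sizes)

# Start from a complete graph: every unit may connect to every other,
# including horizontal and backward links (no DAG constraint).
A = rng.normal(scale=0.1, size=(N, N))
np.fill_diagonal(A, 0.0)  # no self-loops in this sketch

def forward(x, A, steps=T):
    """Iterate synchronous state updates so signals can flow in any direction,
    unlike a single feed-forward pass through a DAG."""
    for _ in range(steps):
        x = np.tanh(A @ x)
    return x

# Soft assignment of units to K Neural Modules (learned in the real method;
# random here purely for illustration).
M = rng.normal(size=(N, K))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def module_regularizer(A, M):
    """Penalize (1) edge weight crossing module boundaries, pushing toward
    independent modules, and (2) variance in module sizes, pushing toward
    balanced modules."""
    P = softmax(M)                        # unit-to-module probabilities, (N, K)
    same = P @ P.T                        # prob. two units share a module, (N, N)
    cross = np.abs(A) * (1.0 - same)      # weight mass on cross-module edges
    sizes = P.sum(axis=0)                 # expected number of units per module
    return cross.sum() + np.var(sizes)

x = rng.normal(size=(N, 1))
y = forward(x, A)
reg = module_regularizer(A, M)
```

Under this sketch, training would minimize the task loss plus `reg`, gradually pruning cross-module edges so the complete graph decomposes into nearly independent, similarly sized modules that can be evaluated in parallel.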
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 23443