Keywords: Emergence, training dynamics, pruning, landscape
Abstract: Emergence, where complex behaviors develop from the interactions of simpler components within a network, plays a crucial role in enhancing neural network capabilities. We introduce a quantitative framework to measure emergence as structural nonlinearity, study the dynamics of this measure during the training process, and examine its impact on network performance, particularly in relation to pruning and training dynamics. Our hypothesis posits that the degree of emergence—evaluated from the distribution and connectivity of active nodes—can predict the development of emergent behaviors in the network. We demonstrate that higher emergence correlates with improved training performance. We further explore the relationship between network complexity and the loss landscape, suggesting that higher emergence indicates a greater concentration of local minima and a more rugged loss landscape. We show that this framework can be applied to explain the impact of pruning on the training dynamics. These findings provide new insights into the interplay between emergence, complexity, and performance in neural networks, offering implications for designing and optimizing architectures.
Primary Area: learning theory
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5671