Trajectory-Based Neural Darwinism in Convolutional Neural Networks: Variation, Competition, and Selective Retention
Abstract: Understanding how neural networks develop and stabilize internal representations remains a central challenge. Inspired by Edelman’s Neural Darwinism, we introduce the Neuron Darwinian Dynamics System (NDDS), a trajectory-based framework that treats neurons as evolving agents under both local and global selective pressures. We define the Global Darwinian Pressure (GDP) as the population-average neuron fitness, capturing system-wide selection dynamics. Layer-wise analyses show that selective pressure intensifies over training, particularly in deeper layers, reflecting progressive consolidation of high-fitness neurons. Ablation experiments further reveal that removing surviving neurons leads to substantial accuracy loss, whereas eliminating low-fitness neurons causes minimal degradation, demonstrating NDDS’s ability to identify functionally critical units. Dynamic trajectory analyses show that surviving neurons maintain coherent activity, stronger weights, and higher global Darwinian pressure, while eliminated neurons stagnate. Overall, our results support a Darwinian view of representation learning: networks achieve early-stage redundancy and later-stage specialization, enabling robust and stable task-relevant representations.
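The abstract defines the Global Darwinian Pressure (GDP) as the population-average neuron fitness. A minimal sketch of that quantity is shown below; the per-neuron fitness values and the function name are illustrative placeholders, since the paper's actual fitness definition is not given in this excerpt.

```python
import numpy as np

def global_darwinian_pressure(fitness: np.ndarray) -> float:
    """GDP as defined in the abstract: the mean fitness over the neuron population."""
    return float(np.mean(fitness))

# Placeholder per-neuron fitness scores for four neurons (hypothetical values).
fitness = np.array([0.2, 0.8, 0.5, 0.9])
gdp = global_darwinian_pressure(fitness)  # 0.6
```

In practice the fitness vector would be computed per layer or per network from the trajectory statistics the paper describes; this sketch only fixes the averaging step.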
Submission Type: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Stefano_Sarao_Mannelli1
Submission Number: 7449