Evolve to Adapt, Not Guess: A Gradient-Free and Robust Framework for Layer-Wise Fine-Tuning via Evolutionary Learning Rate Optimization

18 Sept 2025 (modified: 14 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Deep Neural Networks, Evolutionary Optimization, Fine-Tuning, Layer-Wise Optimization
TL;DR: REVO-Tune: an evolutionary algorithm that automatically optimizes layer-specific learning rates during neural network fine-tuning, outperforming traditional methods, especially in low-data scenarios.
Abstract: Fine-tuning pretrained neural networks for domain adaptation requires careful adjustment of layer-specific learning rates, yet existing strategies often rely on manual heuristics or global schedules that fail to capture diverse adaptation patterns. This challenge is amplified in data-scarce settings, where gradient-based hyperparameter optimization suffers from high variance. To address this, we present REVO-Tune, which systematically employs evolutionary optimization to automatically discover layer-specific learning rates during fine-tuning. Our approach introduces two encoding strategies: a binary representation that selects which layers to adapt under a shared global rate, for computational efficiency, and a continuous representation that assigns each layer its own learning rate, for fine-grained control. Both strategies use gradient-free, population-based search to explore the space of configurations. Across diverse datasets and architectures, REVO-Tune consistently improves fine-tuning performance, yielding 2-4% higher accuracy and 1-3% higher AUC than standard fine-tuning approaches. The continuous encoding excels in performance-critical scenarios, while the binary encoding offers a favorable efficiency-accuracy trade-off. Our empirical analysis demonstrates that evolutionary optimization can effectively complement modern adaptive optimizers, providing practical improvements for automated fine-tuning in resource-constrained environments where manual hyperparameter tuning is impractical. Code is provided as supplementary material.
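The authors' implementation is in the supplementary zip; the sketch below is only a rough illustration of the idea behind the continuous encoding: a gradient-free, population-based search over per-layer learning rates, scored by fine-tuning outcome. The toy model, random data, train-set fitness, and the elitist Gaussian-mutation loop are all assumptions for brevity, not necessarily REVO-Tune's actual operators.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-ins for a pretrained backbone and a small fine-tuning set.
def make_model():
    return nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))

X, y = torch.randn(128, 16), torch.randint(0, 2, (128,))

def fitness(log_lrs, steps=20):
    """Fine-tune a fresh model with per-layer learning rates; return accuracy.

    A real run would score on held-out validation data, not the training set.
    """
    model = make_model()
    layers = [m for m in model if isinstance(m, nn.Linear)]
    groups = [{"params": l.parameters(), "lr": 10.0 ** lr.item()}
              for l, lr in zip(layers, log_lrs)]
    opt = torch.optim.Adam(groups)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    with torch.no_grad():
        return (model(X).argmax(1) == y).float().mean().item()

# Continuous encoding: one log10 learning rate per trainable layer.
n_layers, pop_size, generations = 2, 8, 5
pop = [torch.empty(n_layers).uniform_(-5.0, -2.0) for _ in range(pop_size)]

for _ in range(generations):
    elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
    # Gradient-free variation: keep the elite and add Gaussian mutations of it.
    pop = elite + [(e + 0.3 * torch.randn(n_layers)).clamp(-6.0, -1.0)
                   for e in elite]

best = max(pop, key=fitness)
print("best per-layer learning rates:", [f"{10.0 ** v.item():.1e}" for v in best])
```

Under the binary encoding, each genome would instead be a boolean mask over layers plus one shared rate, with masked-out layers frozen; this shrinks the search space at the cost of per-layer granularity.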
Supplementary Material: zip
Primary Area: optimization
Submission Number: 10225