AdaCubic: An Adaptive Cubic Regularization Optimizer for Deep Learning

TMLR Paper6482 Authors

12 Nov 2025 (modified: 06 Jan 2026) · Under review for TMLR · CC BY 4.0
Abstract: We propose AdaCubic, a novel optimizer that adaptively sets the weight of the cubic term in Newton's cubic regularized method. At the heart of AdaCubic is an auxiliary optimization problem with cubic constraints that dynamically adjusts this weight. We use Hutchinson's method to approximate the Hessian, reducing computational cost, and we show that AdaCubic inherits the local convergence guarantees of the cubically regularized Newton method. Experiments on computer vision, natural language processing, and signal processing tasks demonstrate that AdaCubic outperforms or is competitive with several widely used optimizers. Unlike other adaptive algorithms, which require hyperparameter fine-tuning, AdaCubic is evaluated with a single fixed set of hyperparameters, making it an attractive option for researchers and practitioners in settings where tuning is infeasible. To our knowledge, AdaCubic is the first optimizer to leverage cubic regularization in scalable deep learning applications. The code of AdaCubic will be publicly released upon paper acceptance.
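For context, Hutchinson's method as typically applied in second-order deep learning optimizers estimates the Hessian diagonal from Hessian-vector products with random Rademacher probes. The sketch below illustrates that estimator in PyTorch; it assumes the diagonal variant, and the function name hutchinson_diag_hessian and its n_samples parameter are illustrative, not the authors' released code.

```python
import torch

def hutchinson_diag_hessian(loss, params, n_samples=1):
    """Estimate diag(H) via Hutchinson's method: E_z[z * (H z)], z Rademacher."""
    # First-order gradients with create_graph=True so they can be differentiated again.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    diag = [torch.zeros_like(p) for p in params]
    for _ in range(n_samples):
        # Rademacher probes: entries are +1 or -1 with equal probability.
        zs = [torch.randint_like(p, high=2) * 2.0 - 1.0 for p in params]
        # Hessian-vector products H z via a second backward pass.
        hvps = torch.autograd.grad(grads, params, grad_outputs=zs, retain_graph=True)
        for d, z, hz in zip(diag, zs, hvps):
            d.add_(z * hz / n_samples)  # running average over probes
    return diag

# Usage sketch (model, criterion, x, y are assumed to exist):
# loss = criterion(model(x), y)
# diag_h = hutchinson_diag_hessian(loss, list(model.parameters()), n_samples=2)
```

Such a diagonal estimate is a common way to make curvature information affordable at deep learning scale; how AdaCubic uses the approximation inside its cubic-regularized step is specified in the paper itself.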
Submission Type: Long submission (more than 12 pages of main content)
Changes Since Last Submission: All changes have been highlighted in color in the revised manuscript.
Assigned Action Editor: ~Yi_Zhou2
Submission Number: 6482