Keywords: convex neural networks, deep learning, ADMM, alternating minimization, GPU acceleration
Abstract: We introduce the CRONOS algorithm for convex optimization of two-layer neural networks.
CRONOS is the first algorithm capable of scaling to high-dimensional datasets such as ImageNet, which are ubiquitous in modern deep learning.
This significantly improves upon prior work, which has been restricted to downsampled versions of MNIST and CIFAR-10.
Building on CRONOS as a primitive, we then develop CRONOS-AM, which combines CRONOS with alternating minimization to train multi-layer networks with arbitrary architectures.
Our theoretical analysis proves that CRONOS converges to the global minimum of the convex reformulation under mild assumptions.
In addition, we validate the efficacy of CRONOS and CRONOS-AM through extensive large-scale numerical experiments with GPU acceleration in JAX.
Our results show that CRONOS-AM obtains validation accuracy comparable to or better than that of widely used, carefully tuned deep learning optimizers on vision and language tasks with benchmark datasets such as ImageNet and IMDb.
To the best of our knowledge, CRONOS is the first algorithm that uses the convex reformulation to improve performance on large-scale learning tasks.
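For readers unfamiliar with the ADMM primitive the abstract refers to, the sketch below illustrates classic ADMM on a simple convex problem (the lasso). This is a generic textbook illustration, not the CRONOS algorithm itself; the problem setup, variable names, and hyperparameters are illustrative assumptions.

```python
# Generic ADMM sketch for the lasso: min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# Illustrative only: shows the ADMM splitting pattern, not CRONOS itself.
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=500):
    n = A.shape[1]
    x = z = u = np.zeros(n)          # primal, split, and scaled dual variables
    M = A.T @ A + rho * np.eye(n)    # system matrix reused by every x-update
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: solve the smooth quadratic subproblem
        x = np.linalg.solve(M, Atb + rho * (z - u))
        # z-update: proximal step for the l1 term (soft-thresholding)
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # u-update: dual ascent on the consensus constraint x = z
        u = u + x - z
    return z

# Tiny synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.normal(size=50)
x_hat = admm_lasso(A, b)
```

ADMM alternates cheap subproblem solves against a consensus constraint, which is the structure that makes the convex reformulation amenable to GPU acceleration.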
Supplementary Material: zip
Primary Area: Optimization (convex and non-convex, discrete, stochastic, robust)
Submission Number: 8652