ADSO: Adaptive Data Mixture & Scale Optimization. A Multi-Scale Multi-Fidelity Bayesian Optimization Approach.

Published: 06 Mar 2025 · Last Modified: 30 Apr 2025 · ICLR 2025 Workshop on Data Problems (Poster) · License: CC BY 4.0
Keywords: Data Mixture, Adaptive Optimization, Bayesian Optimization.
TL;DR: Multi-fidelity, multi-scale Bayesian optimization over data mixtures, showing 2.7x to 6x speedups compared to multi-fidelity BO and random search.
Abstract:

LLM pre-training requires careful curation of data sources, a process that currently relies heavily on intuition or costly trial-and-error. Since existing ad hoc approaches are unlikely to transfer across domains or data types, we present a unifying framework for data mixture optimization where (mixtures, model scale, training steps) are chosen to balance cost and potential information gain. Going beyond the canonical deterministic extrapolation in scaling laws, we present a sequential decision-making framework in which uncertainty in outcomes is explicitly modeled and sharpened as more measurements are gathered. In particular, we formulate a multi-scale, multi-fidelity Bayesian Optimization (BO) problem where information from smaller-scale experiments can systematically inform larger-scale training decisions. We design an adaptive algorithm that accounts for the different measurement fidelities provided by model scale and training steps, and empirically demonstrate it on a predictor built from 472 pre-training runs with varying data compositions. Compared to standard BO baselines, instantiating our approach with even simple kernels and acquisition functions allows principled decisions when training models from 20M to 1B parameters, achieving 2.7x and 6x speedups over multi-fidelity BO and random search baselines in finding the best data mixture for downstream performance under fixed compute budgets. In sum, our adaptive framework underscores the efficiency gains achievable by developing principled and transferable data mixture optimization methods. Our code is publicly available at https://github.com/anonWAEWA/ADSO.
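To make the formulation concrete, below is a minimal sketch of a cost-aware, multi-fidelity BO loop over data-mixture weights, where each candidate pairs a mixture on the simplex with a fidelity level (relative model scale, relative training steps). The toy objective, RBF kernel, cost model, and improvement-per-cost acquisition are illustrative assumptions for exposition, not the kernels, acquisition functions, or predictor used in the paper; real evaluations would be training runs.

```python
# Hypothetical sketch of cost-aware multi-fidelity BO over data mixtures.
# The toy loss, kernel, cost model, and acquisition are assumptions made for
# illustration; they are not the paper's actual implementation.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel on concatenated (mixture, log-fidelity) inputs."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xq, noise=1e-3):
    """Standard GP regression posterior mean/std at query points Xq."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Kq = rbf_kernel(Xq, X)
    mu = Kq @ alpha
    v = np.linalg.solve(L, Kq.T)
    var = np.clip(rbf_kernel(Xq, Xq).diagonal() - (v**2).sum(0), 1e-9, None)
    return mu, np.sqrt(var)

def toy_loss(mix, scale, steps):
    """Stand-in for downstream loss; in practice this is a pre-training run."""
    opt = np.array([0.6, 0.3, 0.1])          # assumed (unknown) best mixture
    gap = ((mix - opt) ** 2).sum()
    return gap + 0.3 / (scale * steps) + 0.01 * rng.normal()

def run_bo(budget=10.0, max_evals=80):
    fidelities = [(0.02, 0.25), (0.1, 0.5), (1.0, 1.0)]  # (rel. scale, rel. steps)
    cost = lambda s, t: s * t                             # assumed cost model
    X, y, spent = [], [], 0.0
    while spent < budget and len(y) < max_evals:
        # Candidate mixtures on the simplex, paired with every fidelity level.
        mixes = rng.dirichlet(np.ones(3), size=64)
        cands = [(m, s, t) for m in mixes for (s, t) in fidelities]
        Xc = np.array([np.concatenate([m, [np.log(s), np.log(t)]]) for m, s, t in cands])
        costs = np.array([cost(s, t) for _, s, t in cands])
        if X:
            mu, sd = gp_posterior(np.array(X), np.array(y), Xc)
            # Optimistic improvement over the incumbent, per unit of compute cost.
            gain = np.maximum(min(y) - mu + 2.0 * sd, 1e-9)
            score = gain / costs
        else:
            score = rng.random(len(cands))
        m, s, t = cands[int(score.argmax())]
        X.append(np.concatenate([m, [np.log(s), np.log(t)]]))
        y.append(toy_loss(m, s, t))
        spent += cost(s, t)
    best = int(np.argmin(y))
    return X[best][:3], y[best]

if __name__ == "__main__":
    mix, loss = run_bo()
    print("best mixture found:", np.round(mix, 3), "loss:", round(loss, 4))
```

The key design choice this sketch illustrates is that model scale and training steps act as fidelity dimensions of the same surrogate: cheap low-fidelity runs shrink posterior uncertainty over mixtures, so the cost-aware acquisition only pays for large-scale runs once the candidate set has narrowed.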

Submission Number: 78