Not Every AI Problem is a Data Problem: We Should Be Intentional About Data Scaling

Published: 01 Jan 2025 · Last Modified: 13 May 2025 · CoRR 2025 · CC BY-SA 4.0
Abstract: As Large Language Models require ever more data to train and scale, rather than acquiring whatever data is available, we should consider which types of tasks are most likely to benefit from data scaling. We should be intentional in our data acquisition. We argue that the topology of the data itself informs which tasks to prioritize for data scaling, and shapes the development of the next generation of compute paradigms for tasks where data scaling is inefficient, or even insufficient.