Neural Network Weights as a New Data Modality

Published: 03 Dec 2024, Last Modified: 03 Dec 2024
Venue: ICLR 2025 Workshop Proposals
License: CC BY 4.0
Keywords: Weight space learning, foundation models, representation learning, parameter symmetries
TL;DR: Bringing together the scattered sub-communities that already interface with model weights, with the ultimate goal of democratizing model weights as a proper data modality.
Abstract: The deep learning revolution of the last decade has brought about hundreds of millions of neural networks (NNs) trained on diverse datasets. At the same time, the recent rise of foundation models has led to a rapid increase in the number of publicly available models: on Hugging Face alone there are over a million, with thousands more added daily. As a result, the ample knowledge contained in the training data, the abstractions learned during training, and the trained models' behaviours themselves are all stored in the architectures and parameters of trained NNs. Despite this massive growth, little research has been conducted into processing model weights, and they are rarely considered a data modality in their own right. This workshop aims to create a community around Weight Space Learning by bringing together the scattered sub-communities that already interface with model weights, with the ultimate goal of democratizing model weights as a proper data modality.
Submission Number: 16