Modular, Collaborative and Decentralized Deep Learning

Published: 03 Dec 2024, Last Modified: 03 Dec 2024 · ICLR 2025 Workshop Proposals · CC BY 4.0
Keywords: modularity, mixture-of-experts, model merging, decentralized training
Abstract: The increasing complexity of modern machine learning models exposes the limitations of the traditional, monolithic approach to their development, raising concerns about cost and sustainability. This workshop challenges the monolithic approach by advocating for a new paradigm based on modular design and functional specialization. Inspired by principles from software engineering, we envision a future where models are composed of independently trainable modules, enabling asynchronous development, incremental updates, and cross-task generalization through composability. This shift towards modularity unlocks new possibilities for collaborative model development, where researchers can contribute specialized modules, combine existing models, and participate in decentralized training schemes. By embracing modularity, we can democratize deep learning research, enabling smaller teams and institutions to contribute to the development of powerful and efficient models. Furthermore, modularity promises to enhance model interpretability and maintainability, paving the way for more robust and efficient AI systems. This workshop aims to accelerate this transition towards a more collaborative and sustainable future for deep learning.
Submission Number: 81