Emergent Modularity in Pre-trained Transformers

22 Sept 2022 (modified: 22 Oct 2023) · ICLR 2023 Conference Withdrawn Submission
Abstract: Pre-trained Transformers have shown the potential to realize the dream of general intelligence, encouraging researchers to explore the analogy between Transformers and human brains. These advances raise the question of whether Transformers have a modular structure similar to brain regions, in which closely related neurons specialize in a certain function. In this work, we analyze the modularity of Transformers by studying expert networks, which are clusters of neurons, in Mixture-of-Experts (MoE) Transformers. To evaluate the functional specialization of experts, we propose a novel framework to identify the functionality of both neurons and experts. We conduct empirical analyses on two representative pre-trained Transformers and find that (1) Transformer neurons are functionally specialized, which provides a necessary condition for modularity; (2) Transformer experts are modularized: there are functional experts, in which neurons specialized in a certain function are clustered together; (3) the modular structure stabilizes at an early stage of pre-training, earlier than the neurons themselves. This reveals a coarse-to-fine mechanism of pre-training, which first constructs the coarse modular structure and then refines the fine-grained neuron functions. In summary, we explore the emergent modularity in pre-trained Transformers and hope to help the community better understand the working mechanism of Transformers. Our code and data will be released to facilitate future research.
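To make the object of study concrete, the following is a minimal toy sketch of the kind of MoE feed-forward layer the abstract refers to: the FFN neurons are partitioned into expert groups, and a router assigns each token to one expert. All sizes, weight names, and the top-1 routing rule here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper):
d_model, d_expert, n_experts = 8, 16, 4

W_router = rng.standard_normal((d_model, n_experts))        # gating weights
W_in = rng.standard_normal((n_experts, d_model, d_expert))  # per-expert input proj
W_out = rng.standard_normal((n_experts, d_expert, d_model)) # per-expert output proj

def moe_ffn(x):
    """Top-1 MoE feed-forward: x (n_tokens, d_model) -> (output, expert index per token)."""
    logits = x @ W_router                  # (n_tokens, n_experts) router scores
    expert_idx = logits.argmax(axis=-1)    # each token routed to its best expert
    out = np.empty_like(x)
    for e in range(n_experts):
        mask = expert_idx == e
        if mask.any():
            h = np.maximum(x[mask] @ W_in[e], 0.0)  # expert-local ReLU FFN
            out[mask] = h @ W_out[e]
    return out, expert_idx

tokens = rng.standard_normal((5, d_model))
y, routed = moe_ffn(tokens)
```

In this framing, each expert is a fixed cluster of FFN neurons, so asking whether experts are "functional" amounts to asking whether the neurons grouped into one expert share a specialization.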
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Applications (e.g., speech processing, computer vision, NLP)