Network Transplanting for the Functionally Modular Architecture

Published: 01 Jan 2023 · Last Modified: 14 May 2025 · PRCV (3) 2023 · CC BY-SA 4.0
Abstract: This paper addresses the problem of transplanting category- and task-specific neural networks into a generic, modular network without strong supervision. Unlike traditional deep neural networks (DNNs) with black-box representations, we design a functionally modular network architecture that divides the entire DNN into several functionally meaningful modules. Like assembling LEGO blocks, we can teach the proposed DNN a new object category by directly transplanting the module corresponding to that category from another DNN, with few or even no sample annotations. Our method incrementally adds new categories to the DNN without affecting the representations of existing categories; this strategy of incremental network transplanting avoids the catastrophic-forgetting problem typical of continual learning. We further develop a back-distillation method to overcome the optimization challenges in network transplanting. In experiments, our method outperformed baselines while using far fewer training samples.
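To make the transplanting idea concrete, the following is a minimal PyTorch-style sketch, not the authors' implementation: a host network keeps a dictionary of per-category modules on top of a shared backbone, and teaching it a new category amounts to copying the corresponding module from another network. All names and dimensions here (ModularNet, add_category, the 784/128-dimensional layers) are hypothetical, and the sketch glosses over aligning the host's feature space with the transplanted module, which is where the paper's back-distillation optimization comes in.

```python
import copy

import torch
import torch.nn as nn


class ModularNet(nn.Module):
    """A generic network: a shared backbone plus per-category modules."""

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        # Shared feature extractor (stands in for the generic part of the DNN).
        self.backbone = nn.Sequential(nn.Linear(784, feat_dim), nn.ReLU())
        # One functionally meaningful module per object category.
        self.category_modules = nn.ModuleDict()

    def add_category(self, name: str, module: nn.Module) -> None:
        # "Transplant": copy a module taken from another network and freeze it,
        # so existing representations are untouched (no catastrophic forgetting).
        transplanted = copy.deepcopy(module)
        for p in transplanted.parameters():
            p.requires_grad = False
        self.category_modules[name] = transplanted

    def forward(self, x: torch.Tensor, category: str) -> torch.Tensor:
        return self.category_modules[category](self.backbone(x))


# A donor network assumed to have been trained only for the "cat" category.
donor = ModularNet()
donor.category_modules["cat"] = nn.Linear(128, 1)

# Transplant the "cat" module into a generic host network, LEGO-style.
host = ModularNet()
host.add_category("cat", donor.category_modules["cat"])

logits = host(torch.randn(4, 784), category="cat")
print(logits.shape)  # torch.Size([4, 1])
```

Because each category lives in its own frozen module, adding a new one only extends the dictionary; in the paper, the remaining optimization problem is adapting the host's features to the transplanted module, which the back-distillation method is designed to solve.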