Factor-Wise Homogeneity of Slot-Attention for Continual Object-Centric Learning

19 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Object-Centric Learning, Representation Learning, Continual Learning
TL;DR: We reveal that Slot Attention induces factor-wise homogeneous representations, which offer significant advantages for continual object-centric learning.
Abstract: Can current AI models continually learn object-centric representations? Object-Centric Learning (OCL) and Continual Learning are both critical areas of AI research, yet their intersection remains underexplored. In this work, we observe that Slot Attention, a popular OCL method, exhibits a distinctive behavior: it organizes latent representations into small, separated regions, each of which preserves the same factor states, a property we refer to as \textit{factor-wise homogeneity}. This phenomenon emerges not only on previously trained data but also on upcoming data with unseen factor states, offering significant advantages for continual learning settings that incrementally expand factor states, such as novel shapes. To harness this property, we propose a simple and effective method, \textit{Decoder-only Post Replay}, which freezes the encoder and the Slot Attention module as a generator of factor-wise homogeneous representations and fine-tunes only the decoder after training on the novel task is complete. Although Slot Attention has been widely studied, its representational behavior has been largely overlooked; this paper highlights its unique strengths in continual object-centric learning. We also introduce a novel validation and analysis environment for Continual Object-Centric Learning, establishing a strong baseline for future research.
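To make the Decoder-only Post Replay idea concrete, below is a minimal PyTorch sketch of the training loop it describes. This is an illustration, not the authors' released code: the module names `encoder`, `slot_attention`, and `decoder`, the `replay_loader`, and the reconstruction loss are all assumptions, chosen to match the abstract's description of freezing the representation pathway and fine-tuning only the decoder on replayed data after the novel task is learned.

```python
import torch
import torch.nn.functional as F

def decoder_only_post_replay(model, replay_loader, epochs=1, lr=1e-4):
    """Sketch of Decoder-only Post Replay (module names are hypothetical).

    Freezes the encoder and Slot Attention so they keep emitting
    factor-wise homogeneous slot representations, then fine-tunes
    only the decoder on replayed data after novel-task training.
    """
    # Freeze the representation pathway (encoder + Slot Attention).
    for module in (model.encoder, model.slot_attention):
        for p in module.parameters():
            p.requires_grad_(False)

    # Optimize decoder parameters only.
    optimizer = torch.optim.Adam(model.decoder.parameters(), lr=lr)

    model.train()
    for _ in range(epochs):
        for images in replay_loader:
            # Frozen pathway: compute slots without tracking gradients.
            with torch.no_grad():
                features = model.encoder(images)
                slots = model.slot_attention(features)
            # Decode slots and fit the decoder to reconstruct the input.
            recon = model.decoder(slots)
            loss = F.mse_loss(recon, images)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```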
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 14853