Track: Full Paper (8 pages)
Keywords: Feature learning, Average Gradient Outer Product, Precision Matrix, Subspace Clustering, Kernel methods
Abstract: In recent studies, the *Average Gradient Outer Product* (AGOP) has emerged as a powerful tool for understanding feature learning in deep neural networks, particularly in supervised tasks such as image classification. In this work, we extend this perspective to unsupervised learning, specifically the task of subspace clustering. Building on existing kernel-based subspace clustering approaches, we introduce a feature learning mechanism that iteratively projects the training data onto an averaged precision matrix. Notably, the feature learning matrix we derive is the inverse of the traditional AGOP matrix. We explain this from the viewpoint of isotropic variance control in the latent domain, and illustrate that the proposed projection mechanism refines the data distribution and orthogonalizes the data in the latent space. Empirically, on a toy example we visualize the evolution of the projected data distribution, the kernel matrix, and the emergence of a pronounced block-diagonal structure in the affinity matrix. Furthermore, our approach outperforms the state-of-the-art kernel-based subspace clustering method KTRR [Zhen et al., 2020] on the Extended Yale B dataset [Lee et al., 2005]. The full experimental implementation is available on [GitHub](https://github.com/HaohanZou/AGOP_subspace_clustering).
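To make the projection mechanism concrete, here is a minimal NumPy sketch of one iteration: form the AGOP from per-sample gradients of a learned predictor, invert it (with a small ridge term), and project the data. The function names, the regularization, and the choice of an inverse *square root* (a whitening map, one natural reading of "isotropic variance control") are illustrative assumptions, not the paper's exact update rule.

```python
import numpy as np

def agop(gradients):
    """Average Gradient Outer Product: (1/n) * sum_i g_i g_i^T.

    gradients: (n, d) array whose rows are gradients of the learned
    predictor with respect to each input sample.
    """
    n = gradients.shape[0]
    return gradients.T @ gradients / n

def precision_projection(X, gradients, reg=1e-6):
    """Project data (rows of X) with the regularized inverse of the AGOP.

    The ridge term keeps the matrix invertible; the inverse square
    root whitens the data, shrinking directions of large average
    gradient variance and expanding the rest.
    """
    M = agop(gradients) + reg * np.eye(X.shape[1])
    evals, evecs = np.linalg.eigh(M)               # M is symmetric PSD
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    return X @ inv_sqrt
```

In an iterative scheme of this kind, one would refit the kernel predictor on the projected data and repeat, so the affinity matrix is recomputed at every round; see the linked repository for the authors' actual implementation.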
Supplementary Material: zip
Submission Number: 32