Tensor self-representation network for subspace clustering via alternating direction method of multipliers

Published: 01 Jan 2025 · Last Modified: 19 Feb 2025 · Knowl. Based Syst. 2025 · License: CC BY-SA 4.0
Abstract: Deep subspace clustering methods based on data self-representation have become highly popular owing to their ability to automatically discover more suitable feature spaces. However, many existing subspace clustering methods satisfy the matrix-based self-expressiveness requirement by directly flattening the feature tensor into a matrix, which discards the spatial structure information inherent in the tensor. To overcome this problem, we propose the tensor self-representation network (TSRN). TSRN introduces the tensor mode-d product to apply matrix-based self-expressiveness directly to the tensor, thereby preserving the spatial information of the feature tensor. This method fully utilizes the channel, height, and width information within the feature tensor, resulting in a self-expressive coefficient matrix that is better suited for clustering. To capture the intrinsic representation in the low-dimensional space, we impose low-rank constraints on each mode. This results in a deeper network and therefore a harder training problem. To optimize TSRN effectively, we introduce auxiliary variables and leverage the alternating direction method of multipliers (ADMM) to decompose the problem into two more manageable subproblems. This enables us to obtain a deep low-rank representation by efficiently training a shallower network. TSRN outperforms other traditional and deep subspace clustering methods on several benchmark datasets. Notably, on the COIL20 dataset, TSRN achieves an accuracy of 100%, demonstrating its effectiveness. Ablation studies show that the tensor mode-d product is effective in obtaining a good representation of the data and that ADMM is well suited to the variable-coupling problem introduced by the tensor mode-d product.
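To make the core idea concrete, below is a minimal sketch of a tensor mode-d product used for self-expression over the sample mode, written in NumPy. It is not the authors' implementation: the function name `mode_d_product`, the tensor shape (N, C, H, W), and the random coefficient matrix are illustrative assumptions; the actual TSRN learns the coefficient matrix and features jointly under low-rank constraints.

```python
import numpy as np

def mode_d_product(tensor, matrix, mode):
    """Tensor mode-d product: contract `matrix` with `tensor` along axis `mode`.

    The size of `tensor` along `mode` must equal matrix.shape[1]; the result
    has matrix.shape[0] in that position, all other modes unchanged.
    """
    # Move the contracted mode to the front, contract, then move it back.
    t = np.moveaxis(tensor, mode, 0)                # (I_d, ...)
    out = np.tensordot(matrix, t, axes=([1], [0]))  # (J, ...)
    return np.moveaxis(out, 0, mode)

# Hypothetical feature tensor: N samples, C channels, H x W spatial grid.
N, C, H, W = 8, 4, 5, 5
Z = np.random.randn(N, C, H, W)

# Self-expressive coefficient matrix over the sample mode (mode 0):
# each sample is expressed as a combination of the other samples.
Ccoef = np.random.randn(N, N)
np.fill_diagonal(Ccoef, 0.0)  # forbid trivial self-reconstruction

# Mode-0 product keeps the (C, H, W) spatial structure intact,
# unlike flattening Z into an (N, C*H*W) matrix first.
Z_reconstructed = mode_d_product(Z, Ccoef, mode=0)  # shape (N, C, H, W)
print(Z_reconstructed.shape, np.linalg.norm(Z - Z_reconstructed))
```

In the full method, such mode-wise products over the sample, channel, and spatial modes couple several coefficient variables, which is the coupling that the ADMM splitting with auxiliary variables is designed to handle.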