Disentangled Latent Spaces Facilitate Data-Driven Auxiliary Learning

Published: 07 May 2025 · Last Modified: 29 May 2025 · VisCon 2025 Poster · CC BY 4.0
Keywords: Representation Learning, Auxiliary Learning, Multi-task Learning, Disentanglement
TL;DR: We introduce a representation learning framework that relies on weakly supervised disentanglement to create an auxiliary task unrelated to the primary one; used downstream with an arbitrary Multi-Task Learning model, the discovered task enhances performance.
Abstract: Training a neural network on additional auxiliary tasks can improve learning when data is scarce or the principal task is highly complex. This idea draws on the improved generalization induced by solving multiple tasks simultaneously, which leads to a more robust shared representation. However, selecting effective auxiliary tasks often requires hand-crafted solutions or costly meta-learning approaches. We propose Detaux, a novel framework that discovers an auxiliary classification task through weakly supervised disentanglement in a product manifold. Our method isolates the variation in the data related to the principal task in a dedicated subspace while producing orthogonal subspaces with high separability. A clustering procedure in the most disentangled of the remaining subspaces generates discrete auxiliary labels that, together with the original data, can be used within any Multi-Task Learning (MTL) framework. Theoretical evidence on the linear independence of task representations, together with experiments on synthetic and real data, demonstrates the potential of linking disentangled representations and MTL.
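To make the pipeline concrete, below is a minimal sketch (not the authors' code) of the two stages the abstract describes: clustering a disentangled latent subspace to produce discrete auxiliary labels, then feeding both label sets to a generic hard-parameter-sharing MTL model. It assumes a latent space already factored into subspaces by a pretrained disentangled encoder; the subspace slice, cluster count, and all names here are hypothetical placeholders.

```python
# Sketch of a Detaux-style pipeline under the assumptions stated above.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

# --- Stage 1: derive auxiliary labels by clustering one latent subspace ---
def make_auxiliary_labels(latents: torch.Tensor, subspace: slice, k: int) -> torch.Tensor:
    """Cluster the chosen (most disentangled) subspace; cluster IDs act as labels."""
    z = latents[:, subspace].cpu().numpy()
    ids = KMeans(n_clusters=k, n_init=10).fit_predict(z)
    return torch.from_numpy(ids).long()

# --- Stage 2: any MTL model with a shared trunk and two task heads ---
class TwoHeadMTL(nn.Module):
    def __init__(self, in_dim: int, hidden: int, n_primary: int, n_aux: int):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.primary_head = nn.Linear(hidden, n_primary)
        self.aux_head = nn.Linear(hidden, n_aux)

    def forward(self, x):
        h = self.shared(x)
        return self.primary_head(h), self.aux_head(h)

# Toy usage: random tensors stand in for encoder outputs and raw inputs.
latents = torch.randn(256, 16)                       # pretend disentangled latents
aux_y = make_auxiliary_labels(latents, slice(8, 16), k=5)
model = TwoHeadMTL(in_dim=32, hidden=64, n_primary=10, n_aux=5)
x = torch.randn(256, 32)
primary_y = torch.randint(0, 10, (256,))
logits_p, logits_a = model(x)
loss = (nn.functional.cross_entropy(logits_p, primary_y)
        + nn.functional.cross_entropy(logits_a, aux_y))
loss.backward()
```

Because the auxiliary labels are ordinary class indices, the second stage can be swapped for any off-the-shelf MTL framework, which is the model-agnostic property the abstract emphasizes.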
Submission Number: 5