Parameter Space Representation Learning on Mixed-type Data

26 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Representation learning; Parameter space; Diffusion model; Bayesian flow networks
Abstract: A significant challenge in representation learning is to capture latent semantics in data that mixes continuous, discrete, and even discretized observations (called mixed-type data), which raises issues such as inconsistent discoveries and redundant modeling. Recently, Bayesian flow networks (BFNs) have offered a unified strategy for representing such mixed-type data in the parameter space, but they cannot learn low-dimensional latent semantics since BFNs assume the parameters have the same size as the observations. This raises a new and important question: how can latent semantics be learned in the parameter spaces, rather than the observation spaces, of mixed-type data? Accordingly, we propose ParamReL, a novel unified parameter-space representation learning framework that extracts progressive latent semantics from the parameter spaces of mixed-type data. In ParamReL, a self-encoder learns latent semantics from intermediate parameters rather than from observations. The learned semantics are then integrated into BFNs to efficiently learn unified representations of mixed-type data. Additionally, a reverse-sampling procedure empowers BFNs for tasks including input reconstruction and interpolation. Extensive experiments verify the effectiveness of ParamReL in learning parameter-space representations for latent interpolation, disentanglement, time-varying conditional reconstruction, and conditional generation. The code is available at https://anonymous.4open.science/r/ICLR25-F087/README.md.
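To make the abstract's central idea concrete, the sketch below illustrates what "a self-encoder that learns latent semantics from intermediate parameters rather than observations" could look like. This is a minimal illustration only, not the paper's implementation: the class name SelfEncoder, the dimensions, the Gaussian reparameterization, and the network shape are all assumptions; the only detail taken from the abstract is that the encoder's input is an intermediate BFN parameter vector theta_t (the same size as an observation) and its output is a low-dimensional latent.

```python
import torch
import torch.nn as nn

class SelfEncoder(nn.Module):
    """Hypothetical sketch of a ParamReL-style self-encoder.

    Maps an intermediate BFN parameter vector theta_t (flattened,
    same dimensionality as an observation) to a low-dimensional
    latent z via a Gaussian posterior, as in a VAE-style encoder.
    """

    def __init__(self, param_dim: int, latent_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(param_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 2 * latent_dim),  # mean and log-variance
        )

    def forward(self, theta_t: torch.Tensor):
        # theta_t: (batch, param_dim) intermediate parameters at step t.
        mu, logvar = self.net(theta_t).chunk(2, dim=-1)
        # Reparameterization trick so z stays differentiable.
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return z, mu, logvar

# Usage sketch: encode intermediate parameters into a latent that
# would then condition the BFN's output network (not shown here).
encoder = SelfEncoder(param_dim=784, latent_dim=16)
theta_t = torch.rand(8, 784)  # placeholder for BFN parameters
z, mu, logvar = encoder(theta_t)
```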
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6334