Conditional Permutation Invariant Flows

Published: 15 May 2023 · Last Modified: 17 Sept 2024 · Accepted by TMLR · CC BY 4.0
Abstract: We present a conditional generative probabilistic model of set-valued data with a tractable log density. This model is a continuous normalizing flow governed by permutation equivariant dynamics. These dynamics are driven by a learnable per-set-element term and pairwise interactions, both parametrized by deep neural networks. We illustrate the utility of this model via applications including (1) complex traffic scene generation conditioned on visually specified map information, and (2) object bounding box generation conditioned directly on images. We train our model by maximizing the expected likelihood of labeled conditional data under our flow, with the aid of a penalty that ensures the dynamics are smooth and hence efficiently solvable. Our method significantly outperforms non-permutation invariant baselines in terms of log likelihood and domain-specific metrics (offroad, collision, and combined infractions), yielding realistic samples that are difficult to distinguish from data.
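The abstract describes dynamics built from a learnable per-set-element term plus pairwise interactions, arranged so the vector field is permutation equivariant. Below is a minimal NumPy sketch of such a vector field; the function names, signatures, and the mean aggregation over pairs are illustrative assumptions, not the paper's actual parametrization (which uses deep networks and conditioning inputs).

```python
import numpy as np

def equivariant_dynamics(x, per_element, pairwise):
    """Permutation-equivariant vector field on a set x of shape (n, d).

    dx_i/dt = per_element(x_i) + mean_j pairwise(x_i, x_j)

    Because each output row depends on its own element plus a symmetric
    aggregation over all elements, permuting the rows of x permutes the
    output rows identically (equivariance).
    """
    n, _ = x.shape
    # Per-element contribution, applied independently to each set member.
    single = np.stack([per_element(x[i]) for i in range(n)])
    # Pairwise contribution, aggregated symmetrically over the set.
    inter = np.stack([
        np.mean([pairwise(x[i], x[j]) for j in range(n)], axis=0)
        for i in range(n)
    ])
    return single + inter
```

Integrating this field with an ODE solver yields a flow whose density transforms by the usual continuous change-of-variables formula, which is what makes the log density tractable.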
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission:
- The manuscript has been de-anonymized.
- A link to the codebase for data generation and metric calculation has been added.
- The text at the bottom of page 2 has been rewritten to more accurately reflect the contributions of the paper.
- A sentence has been added at the bottom of page 10 explaining why a tractable density would be beneficial for object detection models.
Code: https://github.com/inverted-ai/conditional-permutation-invariant-flows-datasets
Assigned Action Editor: ~Laurent_Dinh1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 829