NeurInt: Learning to Interpolate through Neural ODEs

Published: 17 Oct 2021, Last Modified: 05 May 2023
DLDE Workshop -- NeurIPS 2021 Spotlight
Keywords: Neural ODEs, Non-Parametric Deep Generative Models
Abstract: A range of applications require image generation models whose latent space effectively captures the high-level factors of variation in the data distribution, a property commonly judged by the model's ability to interpolate smoothly between images. However, most generative models that map a fixed prior to generated images yield interpolation trajectories that lack smoothness and contain images of reduced quality. We propose a novel generative model that learns a flexible non-parametric prior over interpolation trajectories, conditioned on a pair of source and target images. Instead of relying on deterministic interpolation methods such as linear or spherical interpolation in latent space, we devise a framework that learns a distribution over trajectories between two given images using Latent Second-Order Neural Ordinary Differential Equations. Through a hybrid combination of reconstruction and adversarial losses, the generator is trained to map points sampled from these trajectories to sequences of realistic images of improved quality that smoothly transition from the source to the target image.
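To make the idea of a latent second-order neural ODE concrete, the sketch below (an assumption, not the authors' code) rewrites the second-order dynamics as a first-order system over a latent position and velocity and integrates it with torchdiffeq's odeint. The encoder, the learned distribution over initial velocities, and the generator/discriminator from the abstract are stubbed out with placeholder tensors (z_src, z_tgt, v0 are hypothetical names).

```python
# Minimal sketch of a latent second-order neural ODE for interpolation,
# assuming PyTorch and the torchdiffeq package.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class SecondOrderODEFunc(nn.Module):
    """dz/dt = v,  dv/dt = f([z, v]) with the acceleration f given by an MLP."""

    def __init__(self, latent_dim, hidden_dim=128):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(2 * latent_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, latent_dim),
        )

    def forward(self, t, state):
        z, v = state.chunk(2, dim=-1)          # split (position, velocity)
        dz = v
        dv = self.f(torch.cat([z, v], dim=-1))
        return torch.cat([dz, dv], dim=-1)


latent_dim = 64
func = SecondOrderODEFunc(latent_dim)

# Hypothetical latents for a source/target pair; in the paper these would come
# from an encoder, and v0 from a learned (stochastic) initial-velocity module,
# so that each sampled v0 yields a different trajectory between the two images.
z_src = torch.randn(1, latent_dim)
v0 = torch.randn(1, latent_dim)

t = torch.linspace(0.0, 1.0, steps=10)        # interpolation "times"
state0 = torch.cat([z_src, v0], dim=-1)
traj = odeint(func, state0, t)                # shape: (10, 1, 2 * latent_dim)
z_traj = traj[..., :latent_dim]               # latent points fed to the generator
```

Reformulating the second-order ODE as a first-order system over (z, v) is the standard trick that lets an off-the-shelf ODE solver handle it; sampling different initial velocities is one way such a model can produce a distribution of trajectories rather than a single deterministic path.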
Publication Status: This work is unpublished.