Flexible Tails for Normalizing Flows

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Normalizing flows are a flexible class of probability distributions, expressed as transformations of a simple base distribution. A limitation of standard normalizing flows is difficulty in representing heavy-tailed distributions, which arise in applications to both density estimation and variational inference. A popular current solution is to use a heavy-tailed base distribution. We argue this can lead to poor performance due to the difficulty of optimising neural networks, such as normalizing flows, under heavy-tailed input. We propose an alternative, the "tail transform flow" (TTF), which uses a Gaussian base distribution followed by a final transformation layer that can produce heavy tails. Experimental results show this approach outperforms current methods, especially when the target distribution has large dimension or tail weight.
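To make the idea concrete, below is a minimal sketch of one way a final heavy-tail transformation layer could look. It is not the authors' exact TTF parameterization (which is available in the linked repository); the class name, the erfc-based transform, and the softplus constraint on the tail parameter are illustrative assumptions. The sketch uses the fact that, for standard normal z, u = erfc(|z|/√2) is uniform on (0, 1), so applying an inverse generalized-Pareto map to u yields power-law tails of index 1/λ.

```python
import math
import torch
import torch.nn as nn


class TailTransform(nn.Module):
    """Elementwise, invertible map from a Gaussian-tailed input z to a
    heavy-tailed output x with tail index 1/lambda per dimension.
    Illustrative sketch only; see the linked tailnflows repository for
    the paper's actual TTF layer."""

    def __init__(self, dim: int):
        super().__init__()
        # Unconstrained parameter; softplus keeps lambda strictly positive.
        self.raw_lam = nn.Parameter(torch.zeros(dim))

    def forward(self, z: torch.Tensor):
        lam = nn.functional.softplus(self.raw_lam)
        # For z ~ N(0, 1), u = erfc(|z| / sqrt(2)) is Uniform(0, 1).
        u = torch.special.erfc(z.abs() / math.sqrt(2.0))
        # Guard against underflow far in the tails.
        u = u.clamp_min(torch.finfo(u.dtype).tiny)
        # Inverse generalized-Pareto CDF: produces tails P(X > x) ~ x^(-1/lam).
        x = torch.sign(z) * (u.pow(-lam) - 1.0) / lam
        # log|dx/dz| elementwise, summed over dimensions:
        # dx/dz = u^(-lam - 1) * sqrt(2/pi) * exp(-z^2 / 2).
        log_det = (
            (-lam - 1.0) * u.log()
            + 0.5 * math.log(2.0 / math.pi)
            - 0.5 * z.pow(2)
        ).sum(-1)
        return x, log_det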
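Composing a standard flow on a Gaussian base with a layer like this keeps the neural network body operating on light-tailed inputs, while the final transform alone is responsible for matching the target's tail weight via the learned per-dimension λ.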
Lay Summary: Modern machine learning methods often represent uncertainty by learning how to turn simple random inputs into outputs that resemble observed data. These methods are very flexible, but they systematically underestimate the chance of extreme outcomes. Yet in many real-world scenarios, extreme events are crucial: for example, severe weather or financial shocks. Some recent proposals solve this problem by including extreme events in the random input. But this can introduce new problems, as models struggle to handle extreme inputs effectively. We take a different approach: we add a dedicated step at the end of the process, designed to allow models to generate extreme events from non-extreme inputs. This simple change improves the modelling of data that contains extreme observations.
Link To Code: https://github.com/Tennessee-Wallaceh/tailnflows
Primary Area: Probabilistic Methods->Everything Else
Keywords: normalizing flows, extreme values, heavy tails, variational inference
Submission Number: 4908