What are Normalizing Flows?

Basics of normalizing flows, a technique used in machine learning to build up complex probability distributions by transforming simple ones

Ankur Dhuriya
3 min read · Jan 27, 2022

Normalizing flows are a technique used in machine learning to build complex probability distributions from simple ones. They have been applied in the context of generative modelling, and they have recently received a lot of attention (for example Glow, by OpenAI) because they can model rich probability distributions while keeping the likelihood exactly computable. Flows have also been widely used in speech processing, most notably through NVIDIA's WaveGlow and through Glow-TTS.

Flow-based Deep Generative Models

Let's start with the basic mathematical framework of normalizing flows. Suppose we have a continuous random variable z with some simple distribution, such as an isotropic Gaussian, that allows for easy sampling and density evaluation. The key idea is to transform this simple distribution with some function f into a more complicated one. We formulate f as a composition of a sequence of invertible transformations, so that the overall transformation is also invertible.
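In symbols, with K invertible steps:

```
x = f(z) = f_K \circ f_{K-1} \circ \cdots \circ f_1(z), \qquad z \sim p_z(z)
```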

Because f is invertible, for every x there is exactly one z that could have been sampled to produce it. What the density is at a given x then depends on the behaviour of f around that point.

To understand how, we have to look at the change of variables formula. Suppose we have an invertible and differentiable function f mapping some domain Z to X, with a distribution p(z) defined over z in Z.
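In symbols, the change of variables formula for x = f(z) reads:

```
p_x(x) = p_z\!\left(f^{-1}(x)\right)\left|\det\frac{\partial f^{-1}(x)}{\partial x}\right| = p_z(z)\left|\det\frac{\partial f(z)}{\partial z}\right|^{-1}
```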

A valid probability density function must always integrate to 1 over its domain. The magnitude of the Jacobian determinant, which indicates how much the transformation locally expands or contracts space, is exactly the correction factor needed so that the new density p(x) still satisfies this requirement.
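As a quick sanity check, here is a minimal sketch in Python (my own illustration, not code from the post) of the formula for a 1-D affine map x = a·z + b, where the Jacobian determinant is just the scale a:

```python
import numpy as np
from scipy.stats import norm

# Affine map x = f(z) = a*z + b applied to z ~ N(0, 1).
# Change of variables: p_x(x) = p_z(f^{-1}(x)) * |df^{-1}/dx| = p_z((x - b)/a) / |a|
a, b = 2.0, 1.0
x = 3.0

z = (x - b) / a               # f^{-1}(x)
p_x = norm.pdf(z) / abs(a)    # density after the transformation

# Sanity check: a*z + b with z ~ N(0, 1) is exactly N(b, a^2),
# so the two ways of computing the density must agree.
assert np.isclose(p_x, norm.pdf(x, loc=b, scale=abs(a)))
print(p_x)  # ~0.121
```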

Flow-based models can be trained and evaluated by exact likelihood computation. The path traversed by the random variable z through the successive transformations is the flow, and the full chain formed by the successive transformations f is called a normalizing flow, because the change of variables keeps the density properly normalized at every step.
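Chaining the change of variables formula across all K steps gives the exact log-likelihood that these models maximize during training:

```
\log p_x(x) = \log p_z(z_0) - \sum_{i=1}^{K} \log\left|\det\frac{\partial f_i(z_{i-1})}{\partial z_{i-1}}\right|, \qquad z_0 = z,\; z_i = f_i(z_{i-1}),\; z_K = x
```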

Generative modelling is all about approximating some observed data distribution, such as natural images or speech audio, so that we can sample new data points with similar characteristics, or infer latent variables that reduce the dimensionality of the data. One way to use normalizing flows in a latent variable generative model is to parameterise the likelihood as a flow.
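To make this concrete, below is a minimal sketch of density estimation with a stack of RealNVP-style affine coupling layers in PyTorch. The AffineCoupling module, the network sizes, and the toy correlated data are illustrative assumptions, not a prescribed implementation:

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling layer: one half of the dimensions passes
    through unchanged and conditions an affine transform of the other half.
    The Jacobian is triangular, so log|det J| is just a sum of log-scales."""
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * half),   # predicts log-scale and shift
        )

    def forward(self, x):
        """Map data x toward the base distribution; return z and log|det dz/dx|."""
        x1, x2 = x.chunk(2, dim=-1)
        if self.flip:
            x1, x2 = x2, x1
        log_s, t = self.net(x1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)          # bound the scales for stability
        z1, z2 = x1, (x2 - t) * torch.exp(-log_s)
        if self.flip:
            z1, z2 = z2, z1
        return torch.cat([z1, z2], dim=-1), -log_s.sum(dim=-1)

# Four coupling layers, alternating which half is transformed.
flows = nn.ModuleList([AffineCoupling(2, flip=(i % 2 == 1)) for i in range(4)])
base = torch.distributions.MultivariateNormal(torch.zeros(2), torch.eye(2))
opt = torch.optim.Adam(flows.parameters(), lr=1e-3)

# Toy target: correlated 2-D Gaussian samples standing in for real data.
data = torch.randn(512, 2) @ torch.tensor([[1.0, 0.8], [0.0, 0.6]])

for step in range(1000):
    z, log_det = data, torch.zeros(data.shape[0])
    for f in flows:
        z, ld = f(z)
        log_det = log_det + ld
    loss = -(base.log_prob(z) + log_det).mean()   # exact negative log-likelihood
    opt.zero_grad(); loss.backward(); opt.step()
```

Coupling layers are a popular choice because the triangular Jacobian makes the log-determinant a cheap sum of predicted log-scales; Glow and WaveGlow are built from the same ingredient.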

Reference

Lilian Weng, Flow-based Deep Generative Models: https://lilianweng.github.io/lil-log/2018/10/13/flow-based-deep-generative-models.html
