NormalizingFlows

Implementation of normalizing flows compatible with Bijectors.jl

https://github.com/TuringLang/NormalizingFlows.jl

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.8%) to scientific vocabulary
Last synced: 6 months ago

Repository

Implementation of normalizing flows compatible with Bijectors.jl

Basic Info
Statistics
  • Stars: 42
  • Watchers: 8
  • Forks: 5
  • Open Issues: 10
  • Releases: 3
Created over 2 years ago · Last pushed 6 months ago
Metadata Files
Readme License Citation

README.md

NormalizingFlows.jl


Last updated: 2025-Aug-08

A normalizing flow library for Julia.

This package provides a simple and flexible interface for variational inference (VI) with normalizing flows (NF), for Bayesian computation or generative modeling. The key focus is modularity and extensibility: users can easily construct flows (e.g., define customized flow layers) and combine components (e.g., choose different VI objectives or gradient estimators) to approximate general target distributions, without being tied to specific probabilistic programming frameworks or applications.

See the documentation for more.

We also provide several demos and examples in the example folder.

Installation

To install the package, run the following command in the Julia REPL:

```julia
] # enter Pkg mode
(@v1.11) pkg> add NormalizingFlows
```

Then simply run the following command to use the package:

```julia
using NormalizingFlows
```

Quick recap of normalizing flows

Normalizing flows transform a simple reference distribution $q_0$ (sometimes referred to as the base distribution) to a complex distribution $q$ using invertible functions.

In more detail, given the base distribution, usually a standard Gaussian $q_0 = \mathcal{N}(0, I)$, we apply a series of parameterized invertible transformations (called flow layers), $T_{1, \theta_1}, \cdots, T_{N, \theta_N}$, yielding

```math
Z_N = T_{N, \theta_N} \circ \cdots \circ T_{1, \theta_1} (Z_0), \quad Z_0 \sim q_0, \quad Z_N \sim q_{\theta},
```

where $\theta = (\theta_1, \dots, \theta_N)$ is the parameter to be learned, and $q_{\theta}$ is the variational distribution (flow distribution). This describes the sampling procedure of normalizing flows: drawing from $q_\theta$ amounts to sending draws from $q_0$ through a forward pass of these flow layers.
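As a toy illustration of this forward (sampling) pass, here is a minimal sketch in plain Python with hypothetical 1-D affine layers $T_n(z) = a_n z + b_n$; this is not the package's Julia API, just the composition of layers described above:

```python
import random

# Hypothetical 1-D affine flow layers T_n(z) = a_n * z + b_n
# (invertible whenever a_n != 0); the values are arbitrary.
layers = [(2.0, 1.0), (0.5, -3.0)]  # (a_n, b_n) for n = 1, 2

def forward(z0, layers):
    """Forward pass: Z_N = T_N ∘ ... ∘ T_1(Z_0)."""
    z = z0
    for a, b in layers:
        z = a * z + b
    return z

z0 = random.gauss(0.0, 1.0)   # draw Z_0 from the base distribution q_0 = N(0, 1)
zN = forward(z0, layers)      # a sample from the flow distribution q_theta
```

Each additional layer only appends one more step to the loop, which is what makes composing many simple invertible maps cheap at sampling time.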

Since all the transformations are invertible (technically diffeomorphic), we can evaluate the density of the flow distribution $q_{\theta}$ by the change-of-variable formula:

```math
q_\theta(x)=\frac{q_0\left(T_1^{-1} \circ \cdots \circ T_N^{-1}(x)\right)}{\prod_{n=1}^N J_n\left(T_n^{-1} \circ \cdots \circ T_N^{-1}(x)\right)}, \quad J_n(x)=\left|\det \nabla_x T_n(x)\right|.
```

Here we drop the subscripts $\theta_n, n = 1, \dots, N$ for simplicity. Density evaluation of a normalizing flow thus requires computing the inverse and the Jacobian determinant of each flow layer.
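The change-of-variable formula can be sketched the same way, again with hypothetical 1-D affine layers (for which $J_n = |a_n|$ is constant), inverting the layers in reverse order while accumulating the log-Jacobian terms:

```python
import math

# Hypothetical affine layers T_n(z) = a_n * z + b_n (same toy setup as above).
layers = [(2.0, 1.0), (0.5, -3.0)]

def log_q0(z):
    # log-density of the standard normal base distribution q_0 = N(0, 1)
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def log_density(x, layers):
    """log q_theta(x) via the change-of-variable formula."""
    log_det = 0.0
    z = x
    for a, b in reversed(layers):    # apply T_N^{-1} first, then T_{N-1}^{-1}, ...
        z = (z - b) / a              # T_n^{-1}
        log_det += math.log(abs(a))  # log J_n = log |det Jacobian| of T_n
    return log_q0(z) - log_det       # log q_0(T^{-1}(x)) - sum_n log J_n
```

For these two layers the flow pushes $\mathcal{N}(0,1)$ to $\mathcal{N}(-2.5, 1)$, so `log_density` can be checked against that closed form.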

Given the feasibility of i.i.d. sampling and density evaluation, normalizing flows can be trained by minimizing some statistical distance to the target distribution $p$. The typical choices are the reverse and forward Kullback-Leibler (KL) divergences, which lead to the following optimization problems:

```math
\begin{aligned}
\text{Reverse KL:}\quad &\arg\min_{\theta} \mathbb{E}_{q_{\theta}}\left[\log q_{\theta}(Z)-\log p(Z)\right] \\
&= \arg\min_{\theta} \mathbb{E}_{q_0}\left[\log \frac{q_\theta(T_N\circ \cdots \circ T_1(Z_0))}{p(T_N\circ \cdots \circ T_1(Z_0))}\right] \\
&= \arg\max_{\theta} \mathbb{E}_{q_0}\left[ \log p\left(T_N \circ \cdots \circ T_1(Z_0)\right)-\log q_0(Z_0)+\sum_{n=1}^N \log J_n\left(T_n \circ \cdots \circ T_1(Z_0)\right)\right]
\end{aligned}
```

and

```math
\begin{aligned}
\text{Forward KL:}\quad &\arg\min_{\theta} \mathbb{E}_{p}\left[\log p(Z)-\log q_{\theta}(Z)\right] \\
&= \arg\max_{\theta} \mathbb{E}_{p}\left[\log q_\theta(Z)\right]
\end{aligned}
```

Both problems can be solved via standard stochastic optimization algorithms, such as stochastic gradient descent (SGD) and its variants.
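A naive Monte Carlo estimate of the reverse-KL objective follows directly from the last display: draw $Z_0 \sim q_0$, push it through the forward pass while accumulating log-Jacobians, and average $\log q_\theta(Z) - \log p(Z)$. The sketch below uses a hypothetical single affine layer and an unnormalized $\mathcal{N}(2,1)$ target, purely for illustration:

```python
import math
import random

layers = [(2.0, 1.0)]                      # hypothetical flow: T_1(z) = 2z + 1

def log_p(x):
    # unnormalized target density: N(2, 1) up to an additive constant,
    # mimicking a posterior known only up to its normalizing constant
    return -0.5 * (x - 2.0) ** 2

def reverse_kl_estimate(layers, n_samples=10_000, seed=0):
    """Monte Carlo estimate of E_{q_theta}[log q_theta(Z) - log p(Z)]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        z = rng.gauss(0.0, 1.0)            # Z_0 ~ q_0 = N(0, 1)
        log_q0 = -0.5 * z * z - 0.5 * math.log(2 * math.pi)
        log_det = 0.0
        for a, b in layers:                # forward pass with log-Jacobian terms
            z = a * z + b
            log_det += math.log(abs(a))
        # log q_theta(Z) = log q_0(Z_0) - sum_n log J_n
        total += (log_q0 - log_det) - log_p(z)
    return total / n_samples
```

In a real training loop this estimate would be differentiated with respect to the layer parameters (e.g., via automatic differentiation) and minimized with SGD or a variant, as the text describes.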

Reverse KL minimization is typically used for Bayesian computation, where one wants to approximate a posterior distribution $p$ that is only known up to a normalizing constant. In contrast, forward KL minimization is typically used for generative modeling, where one wants to learn the underlying distribution of some data.

Current status and to-dos

  • [x] general interface development
  • [x] documentation
  • [x] including more NF examples/Tutorials PR#11
  • [x] GPU compatibility PR#25
  • [ ] integrating Lux.jl and Reactant.jl. This could potentially solve the GPU compatibility issue as well.
  • [ ] benchmarking

Related packages

Owner

  • Name: The Turing Language
  • Login: TuringLang
  • Kind: organization

Bayesian inference with probabilistic programming

Citation (CITATION.bib)

@misc{NormalizingFlows.jl,
	author  = {Zuheng Xu and Xianda Sun and Tor Erlend Fjelde and Hong Ge and contributors},
	title   = {NormalizingFlows.jl},
	url     = {https://github.com/TuringLang/NormalizingFlows.jl},
	version = {v0.1.0},
	year    = {2023},
	month   = {6}
}

GitHub Events

Total
  • Create event: 11
  • Commit comment event: 4
  • Release event: 2
  • Issues event: 9
  • Watch event: 8
  • Delete event: 10
  • Issue comment event: 69
  • Push event: 144
  • Pull request review comment event: 98
  • Pull request review event: 62
  • Pull request event: 22
  • Fork event: 2
Last Year
  • Create event: 11
  • Commit comment event: 4
  • Release event: 2
  • Issues event: 9
  • Watch event: 8
  • Delete event: 10
  • Issue comment event: 69
  • Push event: 144
  • Pull request review comment event: 98
  • Pull request review event: 62
  • Pull request event: 22
  • Fork event: 2

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 16
  • Total pull requests: 46
  • Average time to close issues: 10 months
  • Average time to close pull requests: 2 months
  • Total issue authors: 6
  • Total pull request authors: 6
  • Average comments per issue: 1.81
  • Average comments per pull request: 2.52
  • Merged pull requests: 26
  • Bot issues: 0
  • Bot pull requests: 25
Past Year
  • Issues: 4
  • Pull requests: 20
  • Average time to close issues: 9 days
  • Average time to close pull requests: about 2 months
  • Issue authors: 3
  • Pull request authors: 3
  • Average comments per issue: 2.25
  • Average comments per pull request: 1.65
  • Merged pull requests: 7
  • Bot issues: 0
  • Bot pull requests: 13
Top Authors
Issue Authors
  • zuhengxu (8)
  • yebai (2)
  • Red-Portal (1)
  • avik-pal (1)
  • wsmoses (1)
  • herluc (1)
  • JuliaTagBot (1)
Pull Request Authors
  • github-actions[bot] (28)
  • zuhengxu (14)
  • devmotion (3)
  • torfjelde (2)
  • shravanngoswamii (2)
  • sunxd3 (1)
  • itsdfish (1)
Top Labels
Issue Labels
enhancement (1)
Pull Request Labels
enhancement (2) documentation (1)

Packages

  • Total packages: 1
  • Total downloads:
    • julia 12 total
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 3
juliahub.com: NormalizingFlows

Implementation of normalizing flows compatible with Bijectors.jl

  • Versions: 3
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 12 Total
Rankings
Dependent repos count: 10.2%
Stargazers count: 28.3%
Average: 29.1%
Dependent packages count: 37.6%
Forks count: 40.5%
Last synced: 6 months ago