NonparametricVI
Particle-based and nonparametric variational methods for approximate Bayesian inference and Probabilistic Programming
Science Score: 67.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 3 DOI reference(s) in README
- ✓ Academic publication links: links to arxiv.org, zenodo.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (14.3%) to scientific vocabulary
Keywords
Repository
Particle-based and nonparametric variational methods for approximate Bayesian inference and Probabilistic Programming
Basic Info
- Host: GitHub
- Owner: BayesianRL
- License: MIT
- Language: Julia
- Default Branch: main
- Homepage: https://bayesianrl.github.io/NonparametricVI.jl/
- Size: 13.1 MB
Statistics
- Stars: 14
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 4
Topics
Metadata Files
README.md
NonparametricVI.jl
NonparametricVI.jl is a collection of particle-based and nonparametric variational methods for approximate Bayesian inference in Julia. You can use it either with the Turing.jl probabilistic programming language or with custom sampling problems defined through LogDensityProblems.jl. The package can also be used to improve the quality of samples obtained by other methods: for example, most MCMC methods tend to produce correlated samples, leading to a low effective sample size. In such cases, the samples can be decorrelated using any suitable particle-based approach.
From Parametric to Nonparametric Variational Inference
The idea of Variational Inference (VI) is to approximate a target posterior density with a parametric family of probability distributions, choosing the best-fitting member of this family by solving an optimization problem. This approach is often more scalable than MCMC, especially for models with a large number of latent variables. However, the main challenge with standard VI is choosing a suitable parametric family: very simple densities can underestimate the posterior, while more complex choices can be computationally infeasible. Nonparametric or particle-based VI methods, on the other hand, do not require a parametric family; instead they approximate the posterior by arranging the positions of a set of particles so that the particles resemble samples from the target density.
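Concretely, parametric VI turns inference into optimization: it picks the member of the family closest to the posterior in KL divergence, which is equivalent to maximizing the evidence lower bound (ELBO). This is the standard textbook formulation, stated here only for context:

```latex
\theta^\star
= \arg\min_{\theta}\; \mathrm{KL}\!\left(q_\theta \,\middle\|\, p(\cdot \mid \mathcal{D})\right)
= \arg\max_{\theta}\; \mathbb{E}_{z \sim q_\theta}\!\left[\log p(z, \mathcal{D}) - \log q_\theta(z)\right]
```

Particle-based methods sidestep the choice of the family entirely: the quantities being optimized are the particle positions themselves.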
https://github.com/user-attachments/assets/3dc29684-2642-4dd2-8be3-3e402de744d2
Getting Started
Installation
NonparametricVI.jl is under development; you can install the latest version from this repository using Pkg:
```julia
using Pkg
Pkg.add(url="https://github.com/BayesianRL/NonparametricVI.jl.git")
```
Or install the latest registered version from the Julia General registry:
```julia
Pkg.add("NonparametricVI")
```
Using with Turing.jl Probabilistic Programs
Example: Linear Regression
Let's craft a toy regression problem:

```julia
using DynamicPPL
using Distributions
using NonparametricVI
using LinearAlgebra
using KernelFunctions
using CairoMakie

n = 100
X = 2rand(n) .- 1.0
y = 3X .+ 1 + randn(n)
```
The generated problem looks like this:
```julia
@model function bayesian_regression(X, y)
    α ~ Normal(0.0, 1.0)
    β ~ Normal(0.0, 1.0)
    for i in eachindex(y)
        y[i] ~ Normal(α * X[i] + β, 0.5)
    end
end

model = bayesian_regression(X, y)
```
To define the dynamics of Stein Variational Gradient Descent (SVGD), we need a positive-definite kernel. You can use any kernel provided by KernelFunctions.jl; here we use a squared exponential kernel. For details on designing more complex kernels, check out the KernelFunctions.jl documentation:
```julia
using KernelFunctions

kernel = SqExponentialKernel()
```
Next we define the parameters of SVGD:
```julia
dynamics = SVGD(K=kernel, η=0.003, batchsize=32)
```
Nonparametric variational inference methods use a set of particles, instead of a parametric family of distributions, to approximate the posterior (or any target) distribution. The init method creates the particles pc, along with an internal context ctx that will be used by the inference procedure.
```julia
pc, ctx = init(model, dynamics; n_particles=128)
```
pc is a simple struct containing the positions of the particles. Using get_samples we can access the particles and plot them:
```julia
samples = get_samples(pc, ctx)
α_samples = [s[@varname(α)] for s in samples]
β_samples = [s[@varname(β)] for s in samples];
```
Note that some Turing models contain constrained parameters (e.g. positive or bounded), while most inference methods operate in an unconstrained space obtained by transforming the original density of the parameters. The get_samples method transforms the particle positions back to the constrained space. Before running SVGD we can visualize the current state of the particles:
By default, the initial particles are sampled from the prior. One can use other approaches; for example, we can initialize the particles with 10 steps of Langevin dynamics with a step size of 0.002:
```julia
pc, ctx = NonparametricVI.init(model, dynamics; n_particles=128, particle_initializer=LangevinInitializer(0.002, 10))
```
With Langevin initialization, particles will look like this:
Intuitively, Langevin dynamics is very local: it only uses attraction forces to adjust the particles, and it relies on additive Gaussian noise to prevent the particles from collapsing onto a mode. SVGD uses not only attraction but also repulsion forces to transport the particles, which improves the quality of the samples. While SVGD can be used standalone (see the next example), it is computationally more expensive, so it is sometimes a good idea to initialize the particles with a simpler dynamics such as Langevin.
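To make the attraction/repulsion intuition concrete, here is a minimal, self-contained sketch of a single SVGD update with an RBF kernel, written in plain Julia. This illustrates the algorithm from the Liu & Wang paper, not the package's internal implementation; svgd_step!, gradlogp, h, and η are hypothetical names chosen for this example:

```julia
using LinearAlgebra

# One SVGD step on an N×d particle matrix.
# `gradlogp(x)` returns ∇ log p(x) for the target density;
# `h` is the RBF kernel bandwidth and `η` the step size.
function svgd_step!(particles, gradlogp; h=1.0, η=0.01)
    N = size(particles, 1)
    ϕ = zeros(size(particles))
    for i in 1:N, j in 1:N
        diff = particles[i, :] .- particles[j, :]
        kij = exp(-sum(abs2, diff) / (2h^2))
        # attraction: kernel-weighted gradient of the target log-density
        ϕ[i, :] .+= kij .* gradlogp(particles[j, :])
        # repulsion: gradient of the kernel pushes nearby particles apart
        ϕ[i, :] .+= kij .* diff ./ h^2
    end
    particles .+= η .* ϕ ./ N
    return particles
end
```

Each particle is pulled toward high-density regions by the first term and pushed away from its neighbors by the second; the balance of the two forces lets the particle set spread over the target distribution rather than collapsing onto its mode.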
Note that the infer! method modifies the particles in place.
```julia
infer!(pc, ctx; iters=50)
```
After collecting samples with get_samples we can visualize the final result:
Using with LogDensityProblems
Example: A Mixture Density
In addition to Turing programs, you can use NonparametricVI for custom Bayesian inference problems by implementing the LogDensityProblems interface. For example, here we define a toy unnormalized mixture density:
```julia
using LogDensityProblems
using LinearAlgebra

struct MixtureDensity end

function LogDensityProblems.capabilities(::Type{<:MixtureDensity})
    LogDensityProblems.LogDensityOrder{0}()
end

LogDensityProblems.dimension(::MixtureDensity) = 2

function LogDensityProblems.logdensity(::MixtureDensity, x)
    log(0.25 * exp(-1/0.5 * norm(x - [-1.5, -1.5])^2) +
        0.25 * exp(-1/0.5 * norm(x - [-1.5,  1.5])^2) +
        0.25 * exp(-1/0.5 * norm(x - [ 1.5, -1.5])^2) +
        0.25 * exp(-1/0.5 * norm(x - [ 1.5,  1.5])^2))
end

ρ = MixtureDensity()
```
Next we define the inference dynamics by choosing a custom kernel. It can be any kernel provided by KernelFunctions.jl. Here we use a scaled version of the squared exponential kernel:
```julia
kernel = SqExponentialKernel() ∘ ScaleTransform(2.0)
dynamics = SVGD(K=kernel, η=0.4, batchsize=16)
```
Now we create a set of particles that represent samples:
```julia
pc, ctx = init(ρ, dynamics; n_particles=512)
```
We can access the particle positions with get_samples and visualize their current positions:
```julia
S = get_samples(pc)
```
Obviously, the initial samples do not match the target density. Now we run the SVGD dynamics to adjust them:
```julia
report = infer!(pc, ctx; iters=150, track=Dict(
    "KSD" => KernelizedSteinDiscrepancy(kernel, 64)
));

S = get_samples(pc)
```
The above code also tracks the value of the Kernelized Stein Discrepancy (KSD) during inference. Since KSD can be expensive to compute, we use a Monte Carlo estimate with 64 particles sampled at each step. After inference we can access the tracked values via report.metrics["KSD"] and plot them:
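As a sketch, the tracked values could be plotted with CairoMakie like this (assuming report.metrics["KSD"] holds a numeric vector of per-iteration estimates; the figure layout and axis labels are illustrative, not prescribed by the package):

```julia
using CairoMakie

ksd = report.metrics["KSD"]  # per-iteration KSD estimates tracked by infer!
fig = Figure()
ax = Axis(fig[1, 1], xlabel="iteration", ylabel="KSD")
lines!(ax, ksd)              # KSD should trend downward as the particles adapt
fig
```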
Finally we can check the terminal position of particles:
Implemented Methods
| Method | 📝 Paper | Support | Notes |
|----------------------------|---------------------------------------------------------|---------------|---------------------|
| Stein Variational Gradient Descent | 📔 Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm ✏️ Qiang Liu, Dilin Wang | ✅ Basic functionality | Accuracy is sensitive to kernel choice; see SVGD |
| Stein Variational Newton method | 📔 A Stein variational Newton method ✏️ Gianluca Detommaso, Tiangang Cui, Alessio Spantini, Youssef Marzouk, Robert Scheichl | 🚧 todo | |
| Projected Stein Variational Newton | 📔 Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions ✏️ Peng Chen, Keyi Wu, Joshua Chen, Thomas O'Leary-Roseberry, Omar Ghattas | 🚧 todo | |
| Stein Self-Repulsive Dynamics | 📔 Stein Self-Repulsive Dynamics: Benefits From Past Samples ✏️ Mao Ye, Tongzheng Ren, Qiang Liu | 🚧 todo | |
| SPH-ParVI | 📔 Variational Inference via Smoothed Particle Hydrodynamics ✏️ Yongchao Huang | 🚧 todo | |
| MPM-ParVI | 📔 Variational Inference Using Material Point Method ✏️ Yongchao Huang | 🚧 todo | |
| EParVI | 📔 Electrostatics-based particle sampling and approximate inference ✏️ Yongchao Huang | 🚧 todo | |
Performance Tracking (v0.1.0)
Here you can find performance evaluations for the inference methods, as well as trackable metrics such as KSD, compared to previous versions:
Inference Methods
SVGD
Metrics
KSD
About
If you found this project useful in your research, please cite it as follows:
BibTeX:

```bibtex
@software{NonparametricVI,
  author  = {Asadi, Amirabbas},
  doi     = {10.5281/zenodo.15154383},
  title   = {{NonparametricVI, Particle-Based and Nonparametric Variational Methods for Bayesian Inference}},
  url     = {https://github.com/BayesianRL/NonparametricVI.jl},
  version = {0.1.0},
  year    = {2025}
}
```
Owner
- Name: BayesianRL
- Login: BayesianRL
- Kind: organization
- Repositories: 1
- Profile: https://github.com/BayesianRL
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
message: "If you found this software useful in your research, please cite it as below."
authors:
  - family-names: "Asadi"
    given-names: "Amirabbas"
    orcid: "https://orcid.org/0000-0002-7421-1420"
title: "NonparametricVI, Particle-Based and Nonparametric Variational Methods for Bayesian Inference"
version: 0.1.0
doi: 10.5281/zenodo.15154383
date-released: 2025-04-05
url: "https://github.com/BayesianRL/NonparametricVI.jl"
```
GitHub Events
Total
- Create event: 6
- Commit comment event: 10
- Issues event: 2
- Release event: 4
- Watch event: 14
- Issue comment event: 4
- Public event: 1
- Push event: 85
- Pull request event: 1
Last Year
- Create event: 6
- Commit comment event: 10
- Issues event: 2
- Release event: 4
- Watch event: 14
- Issue comment event: 4
- Public event: 1
- Push event: 85
- Pull request event: 1
Issues and Pull Requests
Last synced: 5 months ago
All Time
- Total issues: 1
- Total pull requests: 0
- Average time to close issues: less than a minute
- Average time to close pull requests: N/A
- Total issue authors: 1
- Total pull request authors: 0
- Average comments per issue: 4.0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 1
- Pull requests: 0
- Average time to close issues: less than a minute
- Average time to close pull requests: N/A
- Issue authors: 1
- Pull request authors: 0
- Average comments per issue: 4.0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- JuliaTagBot (1)
Pull Request Authors
- dependabot[bot] (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: unknown
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 4
juliahub.com: NonparametricVI
Particle-based and nonparametric variational methods for approximate Bayesian inference and Probabilistic Programming
- Homepage: https://bayesianrl.github.io/NonparametricVI.jl/
- Documentation: https://docs.juliahub.com/General/NonparametricVI/stable/
- License: MIT
- Latest release: 0.2.2 (published 8 months ago)
Rankings
Dependencies
- JuliaRegistries/TagBot v1 composite
- actions/checkout v4 composite
- julia-actions/cache v2 composite
- julia-actions/setup-julia v2 composite
- actions/checkout v4 composite
- julia-actions/cache v2 composite
- julia-actions/julia-buildpkg v1 composite
- julia-actions/julia-runtest v1 composite
- julia-actions/setup-julia v2 composite