LaplaceRedux
Effortless Bayesian Deep Learning through Laplace Approximation for Flux.jl neural networks.
Science Score: 36.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 15.0%, to scientific vocabulary)
Repository
Basic Info
- Host: GitHub
- Owner: JuliaTrustworthyAI
- License: MIT
- Language: Julia
- Default Branch: main
- Homepage: https://www.taija.org/LaplaceRedux.jl/
- Size: 119 MB
Statistics
- Stars: 47
- Watchers: 2
- Forks: 4
- Open Issues: 19
- Releases: 15
Metadata Files
README.md

LaplaceRedux
LaplaceRedux.jl is a library written in pure Julia for effortless Bayesian Deep Learning through Laplace Approximation (LA). In developing this package, I have drawn inspiration from the laplace Python library and its companion paper (Daxberger et al. 2021).
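In a nutshell, LA approximates the posterior over the network weights by a Gaussian centered at the maximum a posteriori (MAP) estimate, with covariance given by the inverse Hessian of the loss at that point:

```latex
p(\theta \mid \mathcal{D}) \;\approx\; \mathcal{N}\big(\hat{\theta},\, \hat{\Sigma}\big),
\qquad
\hat{\Sigma} = \Big( \nabla^2_{\theta}\, \mathcal{L}(\theta)\,\big|_{\theta = \hat{\theta}} \Big)^{-1},
```

where θ̂ denotes the trained (MAP) weights and 𝓛 the regularized training loss. Fitting the LA thus only requires second-order information around an already-trained network, which is what makes the approach "effortless".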
🚩 Installation
The stable version of this package can be installed as follows (note that the registered package name is `LaplaceRedux`, without the `.jl` suffix):

```julia
using Pkg
Pkg.add("LaplaceRedux")
```

The development version can be installed like so:

```julia
using Pkg
Pkg.add(url="https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl")
```
🏃 Getting Started
If you are new to Deep Learning in Julia, or simply prefer learning through videos, check out the awesome YouTube tutorial by doggo.jl 🐶. A recording of my presentation at JuliaCon 2022 is also available on YouTube.
🖥️ Basic Usage
LaplaceRedux.jl can be used for any neural network trained in Flux.jl. Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.
Regression
A complete worked example for a regression model can be found in the docs. Here we jump straight to Laplace Approximation and take the pre-trained model nn as given. Then LA can be implemented as follows, where we specify the model likelihood. The plot shows the fitted values overlaid with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.
```julia
la = Laplace(nn; likelihood=:regression)
fit!(la, data)
optimize_prior!(la)
plot(la, X, y; zoom=-5, size=(500,500))
```
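The snippet above takes the pre-trained network `nn` and the training `data` as given. For completeness, here is a minimal sketch of what such a setup might look like in plain Flux.jl; the dataset, architecture, and hyperparameters are illustrative assumptions, not part of LaplaceRedux:

```julia
using Flux

# Toy 1D regression data: one observation per column, noisy sine targets.
X = reshape(collect(range(-2, 2; length=100)), 1, :)
y = sin.(vec(X)) .+ 0.1 .* randn(100)
data = collect(zip(eachcol(X), y))   # iterable of (x, y) pairs

# A small MLP with a single hidden layer.
nn = Chain(Dense(1 => 16, tanh), Dense(16 => 1))

# Standard MAP training with Adam and mean squared error.
opt_state = Flux.setup(Adam(1e-2), nn)
for epoch in 1:200
    for (x, yi) in data
        grads = Flux.gradient(m -> Flux.mse(m(x), yi), nn)
        Flux.update!(opt_state, nn, grads[1])
    end
end
```

Any similarly trained Flux model can then be handed to `Laplace` as shown above.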
Binary Classification
Once again we jump straight to LA and refer to the docs for a complete worked example involving binary classification. In this case we need to specify likelihood=:classification. The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the Plugin Approximation on the left compares to the Laplace Approximation on the right.
```julia
la = Laplace(nn; likelihood=:classification)
fit!(la, data)
la_untuned = deepcopy(la)   # saved for plotting
optimize_prior!(la; n_steps=100)

# Plot the posterior predictive distribution:
zoom = 0
p_plugin = plot(la, X, ys; title="Plugin", link_approx=:plugin, clim=(0,1))
p_untuned = plot(la_untuned, X, ys; title="LA - raw (λ=$(unique(diag(la_untuned.prior.P₀))[1]))", clim=(0,1), zoom=zoom)
p_laplace = plot(la, X, ys; title="LA - tuned (λ=$(round(unique(diag(la.prior.P₀))[1], digits=2)))", clim=(0,1), zoom=zoom)
plot(p_plugin, p_untuned, p_laplace, layout=(1,3), size=(1700,400))
```
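Beyond plotting, the posterior predictive can also be queried directly. A minimal sketch, assuming a `predict` function exported by LaplaceRedux and the fitted `la` from above; the `link_approx` keyword mirrors the one used in the plotting call and is an assumption about the predict API:

```julia
# Hypothetical usage: query predictive class probabilities for inputs X
# (one observation per column). `predict` and its keyword argument are
# assumptions inferred from the plotting calls above.
probs = predict(la, X)                               # Laplace posterior predictive
# probs_plugin = predict(la, X; link_approx=:plugin) # plugin baseline for comparison
```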
📢 JuliaCon 2022
This project was presented at JuliaCon 2022 in July 2022. See here for details.
🛠️ Contribute
Contributions are very much welcome! Please follow the SciML ColPrac guide. You may want to start by having a look at any open issues.
🎓 References
Daxberger, Erik, Agustinus Kristiadi, Alexander Immer, Runa Eschenhagen, Matthias Bauer, and Philipp Hennig. 2021. “Laplace Redux – Effortless Bayesian Deep Learning.” Advances in Neural Information Processing Systems 34.
Owner
- Name: Taija
- Login: JuliaTrustworthyAI
- Kind: organization
- Location: Netherlands
- Repositories: 2
- Profile: https://github.com/JuliaTrustworthyAI
Home for repositories of the Taija (Trustworthy Artificial Intelligence in Julia) project.
GitHub Events
Total
- Create event: 7
- Commit comment event: 2
- Release event: 1
- Issues event: 9
- Watch event: 8
- Delete event: 1
- Issue comment event: 33
- Push event: 46
- Pull request event: 12
- Pull request review comment event: 17
- Pull request review event: 27
- Fork event: 2
Last Year
- Create event: 7
- Commit comment event: 2
- Release event: 1
- Issues event: 9
- Watch event: 8
- Delete event: 1
- Issue comment event: 33
- Push event: 46
- Pull request event: 12
- Pull request review comment event: 17
- Pull request review event: 27
- Fork event: 2
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 64
- Total pull requests: 76
- Average time to close issues: 3 months
- Average time to close pull requests: 14 days
- Total issue authors: 8
- Total pull request authors: 9
- Average comments per issue: 1.23
- Average comments per pull request: 1.49
- Merged pull requests: 45
- Bot issues: 0
- Bot pull requests: 34
Past Year
- Issues: 21
- Pull requests: 20
- Average time to close issues: 22 days
- Average time to close pull requests: 6 days
- Issue authors: 4
- Pull request authors: 4
- Average comments per issue: 1.76
- Average comments per pull request: 2.4
- Merged pull requests: 14
- Bot issues: 0
- Bot pull requests: 6
Top Authors
Issue Authors
- pat-alt (42)
- pasq-cat (7)
- Rockdeldiablo (6)
- ablaom (1)
- DoktorMike (1)
- patrickm663 (1)
- caxelrud (1)
- JuliaTagBot (1)
Pull Request Authors
- github-actions[bot] (38)
- pat-alt (37)
- pasq-cat (13)
- Rockdeldiablo (7)
- ablaom (4)
- severinbratus (3)
- MarkArdman (1)
- pitmonticone (1)
- checkdgt (1)
Packages
- Total packages: 1
- Total downloads: 7 (julia)
- Total dependent packages: 2
- Total dependent repositories: 0
- Total versions: 16
juliahub.com: LaplaceRedux
Effortless Bayesian Deep Learning through Laplace Approximation for Flux.jl neural networks.
- Homepage: https://www.taija.org/LaplaceRedux.jl/
- Documentation: https://docs.juliahub.com/General/LaplaceRedux/stable/
- License: MIT
- Latest release: 1.2.0 (published about 1 year ago)
Dependencies
- actions/checkout v2 composite
- codecov/codecov-action v2 composite
- julia-actions/cache v1 composite
- julia-actions/julia-buildpkg v1 composite
- julia-actions/julia-docdeploy v1 composite
- julia-actions/julia-processcoverage v1 composite
- julia-actions/julia-runtest v1 composite
- julia-actions/setup-julia v1 composite
- JuliaRegistries/TagBot v1 composite
- actions/checkout v1 composite
- julia-actions/setup-julia latest composite
- reviewdog/action-suggester v1 composite