https://github.com/baggepinnen/turing2montecarlomeasurements.jl

Interface between Turing.jl and MonteCarloMeasurements.jl

Science Score: 23.0%

This score indicates how likely this project is to be science-related, based on the following indicators:

  • CITATION.cff file
  • codemeta.json file — found
  • .zenodo.json file
  • DOI references
  • Academic publication links — links to arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity — low similarity (10.2%)

Keywords

bayesian-inference bayesian-statistics mcmc probabilistic-programming uncertainty-propagation visualization
Last synced: 5 months ago

Repository

Interface between Turing.jl and MonteCarloMeasurements.jl

Basic Info
  • Host: GitHub
  • Owner: baggepinnen
  • License: MIT
  • Language: Julia
  • Default Branch: master
  • Size: 35.2 KB
Statistics
  • Stars: 5
  • Watchers: 3
  • Forks: 0
  • Open Issues: 1
  • Releases: 0
Topics
bayesian-inference bayesian-statistics mcmc probabilistic-programming uncertainty-propagation visualization
Created about 6 years ago · Last pushed over 1 year ago
Metadata Files
  • Readme
  • License

README.md

Turing2MonteCarloMeasurements

Build Status Codecov arXiv article

This package serves as an interface between Turing.jl and MonteCarloMeasurements.jl. Turing, a probabilistic programming language and MCMC inference engine, produces results in the form of a Chain, a type that internally contains all the samples produced during inference. This chain is a bit awkward to work with in its natural form, which is why this package exists: it allows the conversion of a chain to a named tuple of Particles from MonteCarloMeasurements.jl.
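The idea behind the conversion can be sketched in plain Julia, without any packages (a conceptual sketch only; the toy `samples` vector and the `summarize` helper are made up here, and the `Part…(μ ± σ)` string merely mimics how MonteCarloMeasurements displays Particles):

```julia
using Statistics

# Stand-in for the samples of one parameter in a chain; in practice these
# would come from Turing's MCMC output.
samples = [0.1, 0.3, 0.2, 0.4, 0.0, 0.2]

# A Particles-like summary: keep all samples, display them as mean ± std.
summarize(s) = "Part$(length(s))($(round(mean(s), digits=4)) ± $(round(std(s), digits=2)))"

println(summarize(samples))  # → Part6(0.2 ± 0.14)
```

The real Particles type additionally propagates all samples through arithmetic, so downstream computations carry the full posterior rather than a point estimate.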

Visualization

In this example, we simulate a review process where a number of reviewers are assigning scores to a number of articles. The generation of the data and the model specification are hidden under the collapsed section below.

Generate fake data and specify a model

```julia
using Turing, Distributions, Plots, Turing2MonteCarloMeasurements

nr = 5  # Number of reviewers
na = 10 # Number of articles
reviewerbias = rand(Normal(0, 1), nr)
articlescore = rand(Normal(0, 2), na)
R = clamp.([rand(Normal(r + a, 0.1)) for r in reviewerbias, a in articlescore], -5, 5)

Rmask = rand(Bool, size(R))
R = Rmask .* R
R = replace(Rmask, 0 => missing) .* R

m = @model reviewscore(R, nr, na) = begin
    reviewerbias     = Array{Real}(undef, nr)
    reviewergain     = Array{Real}(undef, nr)
    truearticlescore = Array{Real}(undef, na)
    reviewerpopbias ~ Normal(0, 1)
    reviewerpopgain ~ Normal(1, 1)
    for i = 1:nr
        reviewerbias[i] ~ Normal(reviewerpopbias, 1)
        reviewergain[i] ~ Normal(reviewerpopgain, 1)
    end
    for j = 1:na
        truearticlescore[j] ~ Normal(0, 2.5)
    end
    rσ ~ TruncatedNormal(1, 10, 0, 100)
    for j = 1:na
        for i = 1:nr
            R[i, j] ~ Normal(reviewerbias[i] + truearticlescore[j] + reviewergain[i] * truearticlescore[j], rσ)
        end
    end
end
```

We now focus on how to analyze the inference result. The chain is easily converted using the function `Particles`:

```julia
julia> chain = sample(reviewscore(R, nr, na), HMC(0.05, 10), 1500);

julia> cp = Particles(chain, crop=500); # crop discards the first 500 samples

julia> cp.reviewerpopbias
Part1000(0.2605 ± 0.72)

julia> cp.reviewerpopgain
Part1000(0.1831 ± 0.62)
```

Particles can be plotted:

```julia
plot(cp.reviewerpopbias, title="Reviewer population bias")
```

![window](figs/rev_bias.svg)

```julia
f1 = bar(articlescore, lab="Data", xlabel="Article number", ylabel="Article score", xticks=1:na)
errorbarplot!(1:na, cp.truearticlescore, 0.8, seriestype=:scatter)
f2 = bar(reviewerbias, lab="Data", xlabel="Reviewer number", ylabel="Reviewer bias")
errorbarplot!(1:nr, cp.reviewerbias, seriestype=:scatter, xticks=1:nr)
plot(f1, f2)
```

window
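The effect of `crop` can be illustrated with a plain-Julia sketch (toy numbers, no packages; the exaggerated burn-in values are made up for illustration): early samples taken before the chain converges can bias summary statistics, so they are discarded before summarizing.

```julia
using Statistics

# Toy chain: 500 burn-in samples far from the target, then 1000 converged ones.
chain = vcat(fill(10.0, 500), fill(0.0, 1000))

crop = 500
kept = chain[crop+1:end]  # analogous to what crop=500 would discard

mean(chain)  # ≈ 3.33, biased by the burn-in samples
mean(kept)   # 0.0, the converged part only
```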

Prediction

The linear-regression tutorial for Turing contains instructions on how to do prediction using the inference result. In the tutorial, the posterior mean of the parameters is used to form the prediction. Using Particles, we can instead form the prediction using the entire posterior distribution.
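The difference can be sketched in plain Julia (hypothetical toy numbers, no packages): each posterior draw of the coefficients yields one prediction, so predicting per draw keeps a distribution over ŷ, whereas collapsing to the posterior mean first yields only a point estimate.

```julia
using Statistics

x = [1.0 2.0; 3.0 1.0]                  # two inputs, two features
β_draws = [0.9 2.1; 1.1 1.9; 1.0 2.0]   # three posterior draws of the coefficients

# Full-posterior prediction: one ŷ per draw, i.e. a distribution per input.
ŷ_draws = [x * β_draws[k, :] for k in 1:size(β_draws, 1)]

# Posterior-mean prediction: collapse to a single point estimate first.
β̄ = vec(mean(β_draws, dims=1))
ŷ_point = x * β̄

mean(ŷ_draws)               # ≈ ŷ_point here, since the prediction is linear in β
std(getindex.(ŷ_draws, 1))  # ≈ 0.1: spread of ŷ₁, lost by the point estimate
```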

Like above, we hide the data generation under a collapsible section.

Generate fake data

```julia
using Turing, Turing2MonteCarloMeasurements, Distributions, MonteCarloMeasurements
coefficients = randn(5)
x = randn(30, 5)
y = x * coefficients .+ 1 .+ 0.4 .* randn.()
sI = sortperm(y)
y = y[sI]
x = x[sI, :]
```

```julia
@model linearregression(x, y, n_obs, n_vars) = begin
    # Set variance prior.
    σ₂ ~ TruncatedNormal(0, 100, 0, Inf)

    # Set intercept prior.
    intercept ~ Normal(0, 3)

    # Set the priors on our coefficients.
    coefficients = Array{Real}(undef, n_vars)
    for i in 1:n_vars
        coefficients[i] ~ Normal(0, 10)
    end

    # Calculate all the mu terms.
    mu = intercept .+ x * coefficients
    for i = 1:n_obs
        y[i] ~ Normal(mu[i], σ₂)
    end
end;

n_obs, n_vars = size(x)
model = linearregression(x, y, n_obs, n_vars)
chain = sample(model, NUTS(0.65), 2500);
```

In order to form the prediction, the original tutorial did

```julia
function prediction(chain, x)
    p = get_params(chain[200:end, :, :])
    α = mean(p.intercept)
    β = collect(mean.(p.coefficients))
    return α .+ x * β
end
```

We will instead do

```julia
cp = Particles(chain, crop=500)
ŷ = x * cp.coefficients .+ cp.intercept
plot(y, lab="data"); plot!(ŷ)
```

window

```julia
bar(coefficients, lab="True coeffs", title="Coefficients")
errorbarplot!(1:n_vars, cp.coefficients, seriestype=:bar, alpha=0.5)
```

window

```julia
plot(plot.(cp.coefficients)..., legend=false)
vline!(coefficients', l=(3,), lab="True value")
```

window

Further documentation

MonteCarloMeasurements

stable latest arXiv article

Turing

Documentation

Owner

  • Name: Fredrik Bagge Carlson
  • Login: baggepinnen
  • Kind: user
  • Location: Lund, Sweden

Control systems, system identification, signal processing and machine learning

GitHub Events

Total
  • Push event: 1
Last Year
  • Push event: 1

Issues and Pull Requests

Last synced: 11 months ago

All Time
  • Total issues: 1
  • Total pull requests: 2
  • Average time to close issues: 2 days
  • Average time to close pull requests: about 8 hours
  • Total issue authors: 1
  • Total pull request authors: 2
  • Average comments per issue: 5.0
  • Average comments per pull request: 0.5
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • acwatt (1)
Pull Request Authors
  • baggepinnen (1)
  • JuliaTagBot (1)
Top Labels
Issue Labels
Pull Request Labels

Dependencies

.github/workflows/TagBot.yml actions
  • JuliaRegistries/TagBot v1 composite