ExpectationMaximization
A simple but generic implementation of Expectation Maximization algorithms to fit mixture models.
Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (10.9%) to scientific vocabulary
Keywords
Keywords from Contributors
Repository
A simple but generic implementation of Expectation Maximization algorithms to fit mixture models.
Basic Info
- Host: GitHub
- Owner: dmetivie
- License: mit
- Language: Julia
- Default Branch: master
- Homepage: https://dmetivie.github.io/ExpectationMaximization.jl/
- Size: 2.02 MB
Statistics
- Stars: 33
- Watchers: 2
- Forks: 1
- Open Issues: 5
- Releases: 8
Topics
Metadata Files
README.md
ExpectationMaximization
This package provides a simple implementation of the Expectation Maximization (EM) algorithm used to fit mixture models. Thanks to Julia's multiple dispatch, its spirit of generic and reusable code, and the Distributions.jl package, the code is very generic while remaining expressive and fast! Take a look at the Benchmark section.
What type of mixtures?
In particular, it works on a lot of mixtures:
- Mixture of Univariate continuous distributions
- Mixture of Univariate discrete distributions
- Mixture of Multivariate distributions (continuous or discrete); see the sketch after this list
- Mixture of mixtures (univariate or multivariate and continuous or discrete)
- User defined mixtures (e.g. custom distributions)
- More?
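For instance, the multivariate case uses exactly the same interface. The sketch below is not taken from the package documentation: the `MvNormal` components, parameter values, and sample size are illustrative, and it assumes the same `fit_mle` call as in the univariate examples further down.

```julia
using Distributions
using ExpectationMaximization

# Illustrative two-component bivariate Gaussian mixture (parameters are made up)
Σ₁ = [1.0 0.0; 0.0 1.0]
Σ₂ = [2.0 0.5; 0.5 2.0]
mix_true = MixtureModel([MvNormal([0.0, 0.0], Σ₁), MvNormal([3.0, 3.0], Σ₂)], [0.4, 0.6])

# For multivariate mixtures, rand returns a d × N observation matrix
y = rand(mix_true, 10_000)

# Initial guess, then the same fit_mle call as in the univariate examples
mix_guess = MixtureModel([MvNormal([-1.0, 1.0], Σ₁), MvNormal([2.0, 2.0], Σ₂)], [0.5, 0.5])
mix_mle = fit_mle(mix_guess, y)
```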
What EM algorithm?
So far, the classic EM algorithm and the Stochastic EM are implemented. Look at the Bibliography section for references.
How?
Just define a `mix::MixtureModel` and do `fit_mle(mix, y)` where `y` is your observation array (vector or matrix). That's it! For the Stochastic EM, just do `fit_mle(mix, y, method = StochasticEM())`.
Take a look at the Examples section.
To work, the only requirements are that the components of the mixture `dist ∈ dists = components(mix)` considered (custom or coming from an existing package)
- are a subtype of `Distribution`, i.e. `dist <: Distribution`;
- have `logpdf(dist, y)` defined (it is used in the E-step);
- have `fit_mle(dist, y, weights)` returning the distribution with the updated parameters maximizing the likelihood. This is used in the M-step of the `ClassicalEM` algorithm. For the `StochasticEM` version, only `fit_mle(dist, y)` is needed. Type or instance versions of `fit_mle` for your `dist` are accepted thanks to this conversion line.
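For illustration, here is a minimal sketch of a user-defined component satisfying these three requirements; the `MyExponential` type and its weighted MLE formula are hypothetical and not part of the package.

```julia
using Distributions
import Distributions: logpdf, fit_mle

# Hypothetical custom component: an exponential distribution with mean θ
struct MyExponential <: ContinuousUnivariateDistribution
    θ::Float64
end

# E-step requirement: log-density of a single observation
logpdf(d::MyExponential, x::Real) = x < 0 ? -Inf : -log(d.θ) - x / d.θ

# M-step requirement: weighted MLE returning an updated distribution
# (for an exponential with mean θ, the weighted MLE is Σᵢ wᵢ xᵢ / Σᵢ wᵢ)
function fit_mle(::Type{<:MyExponential}, x::AbstractVector{<:Real}, w::AbstractVector{<:Real})
    return MyExponential(sum(w .* x) / sum(w))
end
```

Such a type can then be used as a component of a `MixtureModel` and fitted with `fit_mle(mix, y)`; sampling from it would additionally require a `rand` method, which is not shown here.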
TODO (feel free to contribute)
- Add more variants of the EM algorithm (so far there are the classic and stochastic versions).
- Better benchmarks against other EM implementations.
- Improve robustness against edge cases (e.g. empty data, empty mixture, etc.). See PR #12 (if you have suggestions, I am still undecided about the best way to handle this).
- Add advice and better defaults for the `atol` and `rtol` choice (it is not obvious how to select them).
- Speed up the code (always!). So far, I focused on readable code.
- Connect `ExpectationMaximization.jl` to `MLJ.jl` and `MLJModels.jl` in the clustering algorithm section.
- Write a proper software paper.
Citation
If you use this package, please cite it with the following biblatex code:
@software{EM.jl-HAL,
Author = {David Métivier},
Title = {ExpectationMaximization.jl: A simple but generic implementation of Expectation Maximization algorithms to fit mixture models},
Doi = {hal-04784091},
Url = {https://hal.inrae.fr/hal-04784091},
Copyright = {MIT License}
}
For now, it is only on the HAL open archive (which my institute asks me to use) and is linked to a Software Heritage identifier (SWHID).
Basic usage
Also take a look at the examples section.
```julia
using Distributions
using ExpectationMaximization
```
Model
```julia
N = 50_000
θ₁ = 10
θ₂ = 5
α = 0.2
β = 0.3

# Mixture model: here one can put any classical distributions
mix_true = MixtureModel([Exponential(θ₁), Gamma(α, θ₂)], [β, 1 - β])

# Generate N samples from the mixture
y = rand(mix_true, N)
```
Inference
```julia
# Initial guess
mix_guess = MixtureModel([Exponential(1), Gamma(0.5, 1)], [0.5, 1 - 0.5])

# Fit the MLE with the EM algorithm
mix_mle = fit_mle(mix_guess, y; display = :iter, atol = 1e-3, robust = false, infos = false)
```
Verify results
```julia
rtol = 5e-2
p = params(mix_mle)[1] # (θ₁, (α, θ₂))
isapprox(β, probs(mix_mle)[1]; rtol = rtol)
isapprox(θ₁, p[1]...; rtol = rtol)
isapprox(α, p[2][1]; rtol = rtol)
isapprox(θ₂, p[2][2]; rtol = rtol)
```
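As described in the How? section, the Stochastic EM variant is selected through the `method` keyword. A quick sketch on the same data (the extra keyword arguments are the optional ones already used above):

```julia
# Stochastic EM on the same observations, starting from the same initial guess
mix_mle_sem = fit_mle(mix_guess, y; method = StochasticEM(), display = :iter, atol = 1e-3)
```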
Owner
- Name: David Métivier
- Login: dmetivie
- Kind: user
- Location: Montpellier, France
- Company: INRAe, MISTEA
- Website: http://www.cmap.polytechnique.fr/~david.metivier/
- Repositories: 5
- Profile: https://github.com/dmetivie
I am a research scientist with a physics background. Now I do statistics to tackle environmental and climate change problems. Julia enthusiast!
Citation (CITATION.bib)
@software{EM.jl-HAL,
Author = {David Métivier},
Title = {ExpectationMaximization.jl: A simple but generic implementation of Expectation Maximization algorithms to fit mixture models},
Doi = {hal-04784091},
Url = {https://hal.inrae.fr/hal-04784091},
Copyright = {MIT License}
}
GitHub Events
Total
- Create event: 3
- Commit comment event: 4
- Release event: 1
- Watch event: 1
- Delete event: 1
- Issue comment event: 4
- Push event: 26
- Pull request event: 4
- Fork event: 1
Last Year
- Create event: 3
- Commit comment event: 4
- Release event: 1
- Watch event: 1
- Delete event: 1
- Issue comment event: 4
- Push event: 26
- Pull request event: 4
- Fork event: 1
Committers
Last synced: over 1 year ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| David Métivier | 4****e | 87 |
| CompatHelper Julia | c****y@j****g | 5 |
| Tim Holy | t****y@g****m | 2 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 10 months ago
All Time
- Total issues: 6
- Total pull requests: 9
- Average time to close issues: 6 months
- Average time to close pull requests: 3 days
- Total issue authors: 6
- Total pull request authors: 4
- Average comments per issue: 7.17
- Average comments per pull request: 0.89
- Merged pull requests: 8
- Bot issues: 0
- Bot pull requests: 5
Past Year
- Issues: 0
- Pull requests: 2
- Average time to close issues: N/A
- Average time to close pull requests: 6 days
- Issue authors: 0
- Pull request authors: 2
- Average comments per issue: 0
- Average comments per pull request: 1.0
- Merged pull requests: 2
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- dmetivie (1)
- lrnv (1)
- jonasmac16 (1)
- danielinteractive (1)
- JuliaTagBot (1)
- timholy (1)
Pull Request Authors
- github-actions[bot] (5)
- dmetivie (2)
- timholy (2)
- abhro (2)
- dependabot[bot] (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 9 (julia)
- Total dependent packages: 1
- Total dependent repositories: 0
- Total versions: 12
juliahub.com: ExpectationMaximization
A simple but generic implementation of Expectation Maximization algorithms to fit mixture models.
- Homepage: https://dmetivie.github.io/ExpectationMaximization.jl/
- Documentation: https://docs.juliahub.com/General/ExpectationMaximization/stable/
- License: MIT
- Latest release: 0.2.4 (published 8 months ago)
Rankings
Dependencies
- actions/cache v1 composite
- actions/checkout v2 composite
- codecov/codecov-action v1 composite
- julia-actions/julia-buildpkg v1 composite
- julia-actions/julia-processcoverage v1 composite
- julia-actions/julia-runtest v1 composite
- julia-actions/setup-julia v1 composite
- JuliaRegistries/TagBot v1 composite
- actions/checkout v2 composite
- codecov/codecov-action v1 composite
- julia-actions/julia-processcoverage v1 composite
- julia-actions/setup-julia latest composite