Science Score: 46.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ○ DOI references
- ✓ Academic publication links: links to ieee.org
- ✓ Committers with academic emails: 3 of 6 committers (50.0%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (13.0%) to scientific vocabulary
Keywords
Repository
Input Output Hidden Markov Model (IOHMM) in Python
Basic Info
Statistics
- Stars: 168
- Watchers: 14
- Forks: 36
- Open Issues: 11
- Releases: 1
Topics
Metadata Files
README.md
IOHMM
A Python package of Input-Output Hidden Markov Model (IOHMM).
IOHMM extends standard HMM by allowing (a) initial, (b) transition and (c) emission probabilities to depend on various covariates. A graphical representation of standard HMM and IOHMM:
| Standard HMM | IOHMM |
| --- | --- |
| (figure: standard HMM graphical model) | (figure: IOHMM graphical model) |
The solid (shaded) nodes represent observed information, while the transparent (white) nodes represent latent random variables. The top layer contains the observed input variables u_t; the middle layer contains the latent categorical variable z_t; and the bottom layer contains the observed output variables x_t. The inputs for the (a) initial, (b) transition and (c) emission probabilities do not have to be the same.
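The dependence structure described above can be written as a factorization of the joint distribution. A sketch, using u_t for inputs, z_t for hidden states, and x_t for outputs as above:

```latex
p(z_{1:T}, x_{1:T} \mid u_{1:T})
  = \underbrace{p(z_1 \mid u_1)}_{\text{(a) initial}}
    \;\prod_{t=2}^{T} \underbrace{p(z_t \mid z_{t-1}, u_t)}_{\text{(b) transition}}
    \;\prod_{t=1}^{T} \underbrace{p(x_t \mid z_t, u_t)}_{\text{(c) emission}}
```

Dropping the u_t conditioning from all three factors recovers the standard HMM.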
For more theoretical details:
- An Input Output HMM Architecture
- Input-output HMMs for sequence processing

Applications of IOHMM:
- A Generative Model of Urban Activities from Cellular Data
Installing
```shell
pip install IOHMM
```
Examples
The example directory contains a set of Jupyter Notebooks with examples and demonstrations of the features below.
Features
3-in-1 IOHMM. IOHMM package supports:
- UnSupervised IOHMM, when you have no ground-truth hidden states at any timestamp. The Expectation-Maximization (EM) algorithm is used to estimate parameters (maximization step) and posteriors (expectation step).
- SemiSupervised IOHMM, when you have a certain amount of ground-truth hidden states and would like to enforce these labeled hidden states during learning, using the labels to help direct the learning process.
- Supervised IOHMM, when you want to use only labeled ground-truth hidden states during learning. There is no expectation step and only a single maximization step, since all the posteriors come from the labeled ground truth.
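The difference between the three modes shows up in the E-step: unlabeled timesteps get soft posteriors from forward-backward, while labeled timesteps are pinned to the known state. A minimal numpy sketch of that pinning (the function name and array layout are illustrative, not the package's API):

```python
import numpy as np

def clamp_posteriors(posteriors, labels):
    """Overwrite soft state posteriors with one-hot rows at labeled timesteps.

    posteriors : (T, K) array of soft posteriors from the E-step
    labels     : dict mapping timestep index -> known hidden state
    """
    clamped = posteriors.copy()
    for t, state in labels.items():
        clamped[t] = 0.0
        clamped[t, state] = 1.0  # labeled timestep: full mass on the known state
    return clamped
```

In the fully supervised case every row is one-hot, so the "posteriors" are fixed and a single maximization step suffices, as described above.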
Crystal-clear structure. Know each step as you go:
- All sequences are represented by pandas DataFrames, which have a great interface for loading CSV, JSON, etc. files or pulling data from SQL databases, and are easy to visualize.
- Input and output covariates are specified by column names (strings) in the DataFrames.
- You can pass a list of sequences (DataFrames) as data -- there is no need to tag the start of each sequence in a single stacked sequence.
- You can specify a different set of inputs for the (a) initial, (b) transition and (c) each emission model.
Forward Backward algorithm. Faster and more robust:
- Fully vectorized. There is only one 'for loop' (over time, due to dynamic programming) in the forward/backward pass, where most current implementations have more than one.
- All calculations are performed in log space, which is more robust for long sequences, where probabilities would otherwise easily underflow to 0.
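As an illustration of both points, a log-space forward pass can be vectorized over the states so that the only Python loop is over time. A sketch (function and variable names are illustrative, not the package's internals):

```python
import numpy as np
from scipy.special import logsumexp

def log_forward(log_pi, log_A, log_B):
    """Vectorized forward pass in log space.

    log_pi : (K,)   log initial state probabilities
    log_A  : (K, K) log transition matrix, rows indexed by from-state
    log_B  : (T, K) log emission likelihoods per timestep
    Returns log_alpha of shape (T, K).
    """
    T, K = log_B.shape
    log_alpha = np.empty((T, K))
    log_alpha[0] = log_pi + log_B[0]
    # single loop over time (dynamic programming); states fully vectorized
    for t in range(1, T):
        log_alpha[t] = logsumexp(log_alpha[t - 1][:, None] + log_A, axis=0) + log_B[t]
    return log_alpha
```

The sequence log-likelihood is then `logsumexp(log_alpha[-1])`; because every quantity stays in log space, a sequence thousands of steps long never underflows to 0.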
Json-serialization. Models on the go:
- Save (`to_json`) and load (`from_json`) a trained model in JSON format. All the attributes are easily inspectable in the JSON dictionary/file. See the Jupyter Notebook examples for more details.
- Use a JSON configuration file to specify the structure of an IOHMM model (`from_config`). This is useful when you have an application that uses IOHMM models and would like to specify the model beforehand.
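The save/load pattern can be sketched with plain dictionaries; the class and attribute names below are illustrative, not the package's actual schema:

```python
import json
import numpy as np

class TinyModel:
    """Illustrative model whose parameters round-trip through JSON."""

    def __init__(self, num_states, coefficients):
        self.num_states = num_states
        self.coefficients = np.asarray(coefficients, dtype=float)

    def to_json(self):
        # numpy arrays become lists so the standard json module can encode them
        return json.dumps({
            "num_states": self.num_states,
            "coefficients": self.coefficients.tolist(),
        })

    @classmethod
    def from_json(cls, payload):
        d = json.loads(payload)
        return cls(d["num_states"], d["coefficients"])
```

A `from_config`-style constructor follows the same idea, except the dictionary describes the model structure to build rather than fitted parameters.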
Statsmodels and scikit-learn as the backbone. Take the best of both:
- Unified interface/wrapper to statsmodels and scikit-learn linear models/generalized linear models.
- Supports fitting the model with sample frequency weights.
- Supports regularizations in these models.
- Supports estimation of standard error of the coefficients in certain models.
- Json-serialization to save (`to_json`) and load (`from_json`) trained linear models.
Credits
- The structure of this implementation is inspired by depmixS4: depmixS4: An R Package for Hidden Markov Models.
- This IOHMM package uses/wraps statsmodels and scikit-learn APIs for linear supervised models.
Licensing
Modified BSD (3-clause)
Owner
- Login: Mogeng
- Kind: user
- Repositories: 4
- Profile: https://github.com/Mogeng
GitHub Events
Total
- Watch event: 8
- Fork event: 1
Last Year
- Watch event: 8
- Fork event: 1
Committers
Last synced: 9 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Mogeng | m****n@b****u | 47 |
| Mogeng Yin | m****n@M****l | 21 |
| Mogeng Yin | m****n@u****u | 3 |
| Thuener Silva | t****r@g****m | 1 |
| Eric Denovellis | e****o | 1 |
| Mogeng Yin | m****n@u****u | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 8 months ago
All Time
- Total issues: 18
- Total pull requests: 24
- Average time to close issues: 6 months
- Average time to close pull requests: 12 days
- Total issue authors: 15
- Total pull request authors: 4
- Average comments per issue: 1.28
- Average comments per pull request: 0.38
- Merged pull requests: 23
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- antonionieto (2)
- aminzadenoori (2)
- Thuener (2)
- ucabqll (1)
- WillSuen (1)
- chenliangpeng (1)
- NicholasLee76 (1)
- mackancurtaincheeks (1)
- Jasmine1004 (1)
- ammar-n-abbas (1)
- sonam0125 (1)
- SlvLdR (1)
- JoepvanderPlas (1)
- Singh-Sonam (1)
Pull Request Authors
- Mogeng (21)
- edeno (1)
- cruyffturn (1)
- Thuener (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 110 last-month (pypi)
- Total dependent packages: 0
- Total dependent repositories: 1
- Total versions: 5
- Total maintainers: 1
pypi.org: iohmm
A python library for Input Output Hidden Markov Models
- Homepage: https://github.com/Mogeng/IOHMM
- Documentation: https://iohmm.readthedocs.io/
- License: BSD License
- Latest release: 0.0.7 (published almost 3 years ago)
Rankings
Maintainers (1)
Dependencies
- future >=0.18.2
- numpy >=1.20.0
- pandas >=1.2.1
- scikit-learn >=0.24.1
- scipy >=1.6.0
- statsmodels >=0.12.2
- future *
- numpy *
- pandas *
- scikit-learn *
- scipy *
- statsmodels *