iohmm

Input Output Hidden Markov Model (IOHMM) in Python

https://github.com/mogeng/iohmm

Science Score: 46.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: ieee.org
  • Committers with academic emails
    3 of 6 committers (50.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.0%) to scientific vocabulary

Keywords

graphical-models hidden-markov-model linear-models machine-learning python scikit-learn semi-supervised-learning sequence-labeling sequence-to-sequence statsmodels supervised-learning time-series unsupervised-learning
Last synced: 6 months ago

Repository

Input Output Hidden Markov Model (IOHMM) in Python

Basic Info
  • Host: GitHub
  • Owner: Mogeng
  • License: bsd-3-clause
  • Language: Python
  • Default Branch: master
  • Homepage:
  • Size: 1.03 MB
Statistics
  • Stars: 168
  • Watchers: 14
  • Forks: 36
  • Open Issues: 11
  • Releases: 1
Topics
graphical-models hidden-markov-model linear-models machine-learning python scikit-learn semi-supervised-learning sequence-labeling sequence-to-sequence statsmodels supervised-learning time-series unsupervised-learning
Created about 10 years ago · Last pushed over 1 year ago
Metadata Files
Readme

README.md

IOHMM

A Python package for Input-Output Hidden Markov Models (IOHMM).


IOHMM extends the standard HMM by allowing the (a) initial, (b) transition, and (c) emission probabilities to depend on various covariates. A graphical representation of the standard HMM and IOHMM:

[Figure: graphical representations of the standard HMM (left) and IOHMM (right)]

The solid nodes represent observed information, while the transparent (white) nodes represent latent random variables. The top layer contains the observed input variables u_t; the middle layer contains the latent categorical variable z_t; and the bottom layer contains the observed output variables x_t. The inputs for the (a) initial, (b) transition, and (c) emission probabilities do not have to be the same.

For more theoretical details:

  • An Input Output HMM Architecture
  • Input-output HMMs for sequence processing

Applications of IOHMM:

  • A Generative Model of Urban Activities from Cellular Data

Installing

```shell
pip install IOHMM
```

Examples

The example directory contains a set of Jupyter Notebooks with examples and demonstrations of the features below.

Features

  • 3-in-1 IOHMM. The IOHMM package supports:

    • UnSupervised IOHMM, when you have no ground-truth hidden states at any timestamp. The Expectation-Maximization (EM) algorithm is used to estimate parameters (maximization step) and posteriors (expectation step).
    • SemiSupervised IOHMM, when you have some ground-truth hidden states and would like to enforce these labeled hidden states during learning and use them to help direct the learning process.
    • Supervised IOHMM, when you want to use only labeled ground-truth hidden states during learning. There is no expectation step and only a single maximization step, since all the posteriors come from the labeled ground truth.
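The three modes differ mainly in where the posteriors over hidden states come from. Ignoring transition dynamics for brevity, the contrast between the supervised case (one-hot posteriors from labels, a single maximization step) and the unsupervised case (posteriors re-estimated by EM) can be sketched on a toy two-state Gaussian emission model; all names and data below are illustrative, not the package's API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D emissions from two hidden states with means -2 and +2.
true_z = rng.integers(0, 2, size=500)
x = rng.normal(np.where(true_z == 0, -2.0, 2.0), 1.0)

def m_step(x, post):
    # Weighted MLE of each state's emission mean; weights are the posteriors.
    return (post * x[:, None]).sum(axis=0) / post.sum(axis=0)

# Supervised: posteriors are one-hot from the labels -> a single M-step.
post_sup = np.eye(2)[true_z]
means_sup = m_step(x, post_sup)

# Unsupervised: alternate E-step (posteriors from current parameters)
# and M-step until convergence.
means = np.array([-0.5, 0.5])  # rough initialization
for _ in range(50):
    logp = -0.5 * (x[:, None] - means) ** 2           # Gaussian log-likelihoods
    post = np.exp(logp - logp.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)           # E-step posteriors
    means = m_step(x, post)                           # M-step

print(means_sup, means)  # both end up near [-2, 2]
```

The semi-supervised case sits in between: posteriors are fixed to one-hot at labeled timestamps and re-estimated by the E-step everywhere else.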
  • Crystal-clear structure. Know each step as you go:

    • All sequences are represented by pandas DataFrames, which provide a great interface for loading CSV, JSON, etc. files or pulling data from SQL databases, and are easy to visualize.
    • Input and output covariates are specified by column names (strings) in the DataFrames.
    • You can pass a list of sequences (DataFrames) as data -- there is no need to tag the start of each sequence in a single stacked sequence.
    • You can specify different sets of inputs for the (a) initial, (b) transition, and (c) emission models.
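The list-of-DataFrames convention above can be illustrated as follows; the column names (speed, hour, distance) are made up for the example and carry no special meaning to the package:

```python
import pandas as pd

# Two independent sequences, each in its own DataFrame. Columns hold the
# input covariates and output variables; names here are illustrative.
seq1 = pd.DataFrame({
    "speed":    [1.2, 3.4, 2.2],   # input covariate
    "hour":     [8, 9, 10],        # input covariate
    "distance": [0.5, 2.1, 1.0],   # output variable
})
seq2 = pd.DataFrame({
    "speed":    [0.4, 0.9],
    "hour":     [14, 15],
    "distance": [0.1, 0.3],
})

# Pass a list of DataFrames as the data -- no start-of-sequence
# tags are needed in a single stacked sequence.
data = [seq1, seq2]
print(len(data), list(data[0].columns))
```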
  • Forward-backward algorithm. Faster and more robust:

    • Fully vectorized. Only one 'for' loop (required by dynamic programming) in the forward/backward pass, where most current implementations have more than one.
    • All calculations are performed in log space, which is more robust for long sequences, whose probabilities easily underflow to 0.
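A log-space forward pass with a single time loop might look like the numpy sketch below; this illustrates the technique, not the package's internal code:

```python
import numpy as np
from scipy.special import logsumexp

def forward_log(log_pi, log_A, log_B):
    """Vectorized forward pass in log space.

    log_pi: (K,)   initial log-probabilities
    log_A:  (K, K) transition log-probabilities, A[i, j] = p(z_t=j | z_{t-1}=i)
    log_B:  (T, K) emission log-likelihoods log p(x_t | z_t=k)
    Returns log p(x_1..x_T).
    """
    T, K = log_B.shape
    alpha = log_pi + log_B[0]
    # Single loop over time (dynamic programming); the K x K transition
    # sum inside each step is fully vectorized via logsumexp.
    for t in range(1, T):
        alpha = logsumexp(alpha[:, None] + log_A, axis=0) + log_B[t]
    return logsumexp(alpha)

# Sanity check on a tiny 2-state, 3-step model.
pi = np.log([0.6, 0.4])
A = np.log([[0.7, 0.3], [0.2, 0.8]])
B = np.log([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]])
loglik = forward_log(pi, A, B)
print(np.exp(loglik))
```

Because every update adds log-quantities instead of multiplying probabilities, the recursion stays finite even for sequences thousands of steps long.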
  • JSON serialization. Models on the go:

    • Save (to_json) and load (from_json) a trained model in JSON format. All attributes are easily visualizable in the JSON dictionary/file. See the Jupyter Notebook examples for more details.
    • Use a JSON configuration file to specify the structure of an IOHMM model (from_config). This is useful when you have an application that uses IOHMM models and would like to specify the model beforehand.
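The save/load round trip follows the usual JSON pattern, sketched below with the stdlib json module; the dictionary fields are hypothetical placeholders, not the package's actual schema:

```python
import json

# Hypothetical model attributes serialized to a JSON-friendly dict
# (field names are illustrative, not the package's actual schema).
model_state = {
    "num_states": 2,
    "covariates_transition": ["speed", "hour"],
    "transition_coefficients": [[0.1, -0.4], [0.3, 0.2]],
}

serialized = json.dumps(model_state, indent=2)   # save: every attribute readable
restored = json.loads(serialized)                # load: round-trips exactly
print(restored == model_state)
```

Because everything is a plain JSON dictionary, a saved model can be inspected or hand-edited in any text editor before reloading.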
  • Statsmodels and scikit-learn as the backbone. Take the best of both and build on them:

    • Unified interface/wrapper to statsmodels and scikit-learn linear models and generalized linear models.
    • Supports fitting the models with sample frequency weights.
    • Supports regularization in these models.
    • Supports estimation of the standard errors of the coefficients in certain models.
    • JSON serialization to save (to_json) and load (from_json) trained linear models.
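The sample-weight support maps directly onto what the backbone libraries already provide; for instance, scikit-learn's LinearRegression accepts a sample_weight argument in fit, which is the natural channel for posterior or frequency weights in an M-step. The data and weights below are synthetic, for illustration only:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.5, -0.7]) + rng.normal(scale=0.1, size=200)

# Per-sample weights (e.g. posteriors from an E-step, or frequency
# weights); here they are random placeholders.
weights = rng.uniform(0.1, 1.0, size=200)

model = LinearRegression().fit(X, y, sample_weight=weights)
print(model.coef_)  # close to the true coefficients [1.5, -0.7]
```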

Credits

Licensing

Modified BSD (3-clause)

Owner

  • Login: Mogeng
  • Kind: user

GitHub Events

Total
  • Watch event: 8
  • Fork event: 1
Last Year
  • Watch event: 8
  • Fork event: 1

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 74
  • Total Committers: 6
  • Avg Commits per committer: 12.333
  • Development Distribution Score (DDS): 0.365
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Mogeng m****n@b****u 47
Mogeng Yin m****n@M****l 21
Mogeng Yin m****n@u****u 3
Thuener Silva t****r@g****m 1
Eric Denovellis e****o 1
Mogeng Yin m****n@u****u 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 8 months ago

All Time
  • Total issues: 18
  • Total pull requests: 24
  • Average time to close issues: 6 months
  • Average time to close pull requests: 12 days
  • Total issue authors: 15
  • Total pull request authors: 4
  • Average comments per issue: 1.28
  • Average comments per pull request: 0.38
  • Merged pull requests: 23
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • antonionieto (2)
  • aminzadenoori (2)
  • Thuener (2)
  • ucabqll (1)
  • WillSuen (1)
  • chenliangpeng (1)
  • NicholasLee76 (1)
  • mackancurtaincheeks (1)
  • Jasmine1004 (1)
  • ammar-n-abbas (1)
  • sonam0125 (1)
  • SlvLdR (1)
  • JoepvanderPlas (1)
  • Singh-Sonam (1)
Pull Request Authors
  • Mogeng (21)
  • edeno (1)
  • cruyffturn (1)
  • Thuener (1)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 110 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 1
  • Total versions: 5
  • Total maintainers: 1
pypi.org: iohmm

A python library for Input Output Hidden Markov Models

  • Versions: 5
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 110 Last month
Rankings
Stargazers count: 5.9%
Forks count: 6.7%
Dependent packages count: 10.0%
Average: 12.8%
Downloads: 19.9%
Dependent repos count: 21.7%
Maintainers (1)
Last synced: 6 months ago

Dependencies

requirements.txt pypi
  • future >=0.18.2
  • numpy >=1.20.0
  • pandas >=1.2.1
  • scikit-learn >=0.24.1
  • scipy >=1.6.0
  • statsmodels >=0.12.2
setup.py pypi
  • future *
  • numpy *
  • pandas *
  • scikit-learn *
  • scipy *
  • statsmodels *