hmm_cpp
Complete C++ implementation of Hidden Markov Models with modern C++17 and Eigen
Science Score: 54.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references: none found
- ○ Academic publication links: none found
- ✓ Committers with academic emails: 1 of 1 committers (100.0%) from academic institutions
- ○ Institutional organization owner: none
- ○ JOSS paper metadata: none found
- ○ Scientific vocabulary similarity: low similarity (7.5%) to scientific vocabulary
Repository
Complete C++ implementation of Hidden Markov Models with modern C++17 and Eigen
Statistics
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Releases: 1
Metadata Files
README.md
hmm_cpp - Complete Hidden Markov Model Library
A complete C++ reimplementation of the Python hmmlearn library's functionality, featuring modern C++17, Eigen for linear algebra, and a comprehensive set of HMM algorithms.
Overview
This library provides a complete implementation of Hidden Markov Models (HMMs) with various emission models:
- GaussianHMM: HMM with Gaussian emission distributions
- MultinomialHMM: HMM with discrete/multinomial emissions
- GMMHMM: HMM with Gaussian Mixture Model emissions
- GaussianMixture: Standalone Gaussian Mixture Models
Architecture
hmm_c++/
├── include/                          # Header files
│   ├── types.hpp                     # Common type definitions
│   ├── hmm/                          # HMM class headers
│   │   ├── base_hmm.hpp              # Base HMM class
│   │   ├── gaussian_hmm.hpp          # Gaussian HMM
│   │   ├── multinomial_hmm.hpp       # Multinomial HMM
│   │   └── gmm_hmm.hpp               # GMM HMM
│   ├── models/                       # Model headers
│   │   └── gaussian_mixture.hpp      # Gaussian Mixture Model
│   └── algorithms/                   # Algorithm headers
│       ├── baum_welch.hpp            # EM training algorithm
│       ├── viterbi.hpp               # Viterbi algorithm
│       └── forward_backward.hpp      # Forward-Backward algorithm
├── src/                              # Source files
│   ├── hmm/                          # HMM implementations
│   │   ├── base_hmm.cpp              # Base HMM implementation
│   │   ├── gaussian_hmm.cpp          # Gaussian HMM implementation
│   │   ├── multinomial_hmm.cpp       # Multinomial HMM implementation
│   │   └── gmm_hmm.cpp               # GMM HMM implementation
│   ├── models/                       # Model implementations
│   │   └── gaussian_mixture.cpp      # Gaussian Mixture implementation
│   └── algorithms/                   # Algorithm implementations
│       ├── baum_welch.cpp            # Baum-Welch implementation
│       ├── viterbi.cpp               # Viterbi implementation
│       └── forward_backward.cpp      # Forward-Backward implementation
├── examples/                         # Usage examples
├── tests/                            # Test files
└── CMakeLists.txt                    # CMake build configuration
Features
Core HMM Classes
- BaseHMM: Abstract base class with common HMM functionality
- GaussianHMM: HMM with multivariate Gaussian emissions
- MultinomialHMM: HMM with discrete emissions
- GMMHMM: HMM with Gaussian Mixture Model emissions
Algorithms
- Baum-Welch: Expectation-Maximization training algorithm
- Viterbi: Most likely hidden state sequence
- Forward-Backward: State probabilities and likelihood computation
Models
- GaussianMixture: Standalone Gaussian Mixture Models
- Support for different covariance types (Full, Diagonal, Spherical)
Modern C++ Features
- C++17 standard compliance
- Smart pointers for memory management
- RAII principles
- Exception safety
- Template metaprogramming
- STL containers and algorithms
Quick Start
Prerequisites
- C++17 compatible compiler (GCC 7+, Clang 5+, MSVC 2017+)
- Eigen3 library
- CMake 3.10+ (optional, for build system)
Installation
- Install Eigen3:
```bash
# macOS
brew install eigen

# Ubuntu/Debian
sudo apt-get install libeigen3-dev

# Windows (vcpkg)
vcpkg install eigen3
```
- Clone the repository:
```bash
git clone <repository-url>
cd hmm_c++
```
- Build the library:
```bash
# Using CMake (recommended)
mkdir build && cd build
cmake ..
make -j4

# Or compile manually (Eigen is header-only, so there is no library to link)
g++ -std=c++17 -I/usr/include/eigen3 -O2 src/*.cpp -o hmm_test
```
Basic Usage
```cpp
#include "include/hmm/gaussian_hmm.hpp"
#include <iostream>

using namespace hmmlearn_cpp;

int main() {
    // Create a Gaussian HMM with 3 states and 2 features
    GaussianHMM hmm(3, 2, CovarianceType::FULL);

    // Generate or load your data
    Matrix X(1000, 2);  // Your data here

    // Train the HMM
    TrainingParams params;
    params.max_iter = 100;
    params.tol = 1e-4;
    TrainingResult result = hmm.fit(X, {}, params);

    // Make predictions
    IntVector states = hmm.predict(X);
    Matrix state_probs = hmm.predict_proba(X);

    // Generate samples
    Matrix samples = hmm.sample(100);

    std::cout << "Training converged: " << result.converged << std::endl;
    std::cout << "Final log-likelihood: " << result.final_log_likelihood << std::endl;
    return 0;
}
```
API Reference
GaussianHMM
```cpp
class GaussianHMM : public BaseHMM {
public:
    // Constructor
    GaussianHMM(int n_components, int n_features,
                CovarianceType covariance_type = CovarianceType::FULL,
                unsigned int random_state = 42);

    // Training
    TrainingResult fit(const Matrix& X, const std::vector<int>& lengths,
                       const TrainingParams& params);

    // Prediction
    IntVector predict(const Matrix& X) const;
    Matrix predict_proba(const Matrix& X) const;

    // Sampling
    Matrix sample(int n_samples) const;

    // Scoring
    Scalar score(const Matrix& X) const;

    // Parameter access
    Vector get_startprob() const;
    Matrix get_transmat() const;
    Matrix get_means() const;
    std::vector<Matrix> get_covariances() const;
};
```
MultinomialHMM
```cpp
class MultinomialHMM : public BaseHMM {
public:
    // Constructor
    MultinomialHMM(int n_components, int n_features, unsigned int random_state = 42);

    // Same interface as GaussianHMM;
    // emission parameters are discrete probability distributions.
};
```
GMMHMM
```cpp
class GMMHMM : public BaseHMM {
public:
    // Constructor
    GMMHMM(int n_components, int n_features, int n_mix,
           CovarianceType covariance_type = CovarianceType::FULL,
           unsigned int random_state = 42);

    // Additional method to extract the per-state GMMs
    std::vector<GaussianMixture> get_gmms() const;
};
```
GaussianMixture
```cpp
class GaussianMixture {
public:
    // Constructor
    GaussianMixture(int n_components, int n_features,
                    CovarianceType covariance_type = CovarianceType::FULL);

    // Training
    void fit(const Matrix& X);

    // Prediction
    Matrix predict_proba(const Matrix& X) const;
    IntVector predict(const Matrix& X) const;

    // Sampling
    Matrix sample(int n_samples) const;

    // Scoring
    Scalar score(const Matrix& X) const;
    Matrix score_samples(const Matrix& X) const;
};
```
Configuration
Training Parameters
```cpp
struct TrainingParams {
    int max_iter = 100;              // Maximum iterations
    Scalar tol = 1e-4;               // Convergence tolerance
    bool verbose = false;            // Verbose output
    unsigned int random_state = 42;  // Random seed
};
```
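To make the roles of `max_iter` and `tol` concrete, here is a toy convergence loop. The log-likelihood sequence is faked with a geometric series purely for illustration; a real `fit()` recomputes it from the data at every EM iteration:

```cpp
#include <cmath>
#include <limits>

// Local stand-in mirroring the library's TrainingParams fields.
struct TrainingParams {
    int max_iter = 100;  // Maximum iterations
    double tol = 1e-4;   // Convergence tolerance on log-likelihood gain
};

// Returns the iteration at which the loop stopped; sets `converged`.
int run_em_loop(const TrainingParams& params, bool& converged) {
    double prev_ll = -std::numeric_limits<double>::infinity();
    converged = false;
    int it = 0;
    for (; it < params.max_iter; ++it) {
        // Fake E+M step: log-likelihood approaches -50 geometrically.
        const double ll = -50.0 - 50.0 * std::pow(0.5, it);
        // EM guarantees ll is non-decreasing; stop once the gain drops
        // below tol, or give up after max_iter iterations.
        if (ll - prev_ll < params.tol) { converged = true; break; }
        prev_ll = ll;
    }
    return it;
}
```

Tightening `tol` or raising `max_iter` trades training time for a closer fit; the `converged` flag in `TrainingResult` reports which of the two stopping conditions fired.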
Covariance Types
```cpp
enum class CovarianceType {
    FULL,       // Full covariance matrices
    DIAGONAL,   // Diagonal covariance matrices
    SPHERICAL   // Spherical covariance matrices
};
```
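For intuition, the DIAGONAL case reduces each state's emission density to a product of independent 1-D Gaussians, which is cheap to evaluate in log space. A minimal sketch (a hypothetical helper, not part of the library's API):

```cpp
#include <cmath>
#include <vector>

// Log-density of point x under a Gaussian with diagonal covariance:
// a sum of independent 1-D Gaussian log-densities, one per feature.
double diag_gaussian_logpdf(const std::vector<double>& x,
                            const std::vector<double>& mean,
                            const std::vector<double>& var) {
    const double log_2pi = std::log(2.0 * 3.14159265358979323846);
    double lp = 0.0;
    for (std::size_t d = 0; d < x.size(); ++d) {
        const double diff = x[d] - mean[d];
        lp -= 0.5 * (log_2pi + std::log(var[d]) + diff * diff / var[d]);
    }
    return lp;
}
```

FULL instead requires a Cholesky factorization of each covariance matrix, and SPHERICAL is the diagonal case with a single shared variance per state, which is the usual speed/flexibility trade-off between the three.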
Testing
Run the comprehensive test suite:
```bash
# Basic structure test (no Eigen required)
g++ -std=c++17 test_basic_structure.cpp -o test_basic_structure
./test_basic_structure

# Complete library test (requires Eigen; header-only, nothing to link)
g++ -std=c++17 -I/usr/include/eigen3 test_complete_library.cpp src/*.cpp -o test_complete_library
./test_complete_library
```
Performance
The library is optimized for performance:
- Eigen Integration: Leverages Eigen's highly optimized linear algebra
- Memory Efficiency: Smart pointers and RAII for automatic memory management
- Algorithmic Optimizations: Efficient implementations of Baum-Welch, Viterbi, and Forward-Backward
- SIMD Support: Eigen provides SIMD optimizations where available
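The Forward-Backward bullet above hides a practical detail: naive forward probabilities underflow on long sequences, so implementations typically rescale at each step and recover the log-likelihood from the scaling constants. A hedged sketch of that scaled forward pass, with a toy interface rather than the library's:

```cpp
#include <cmath>
#include <vector>

// Scaled forward pass: returns log P(observations) for an HMM with
// initial probs pi, transition matrix A, and per-step emission probs B
// (B[t][i] = probability of observation t under state i).
double forward_log_likelihood(const std::vector<double>& pi,
                              const std::vector<std::vector<double>>& A,
                              const std::vector<std::vector<double>>& B) {
    const int T = static_cast<int>(B.size());
    const int N = static_cast<int>(pi.size());
    std::vector<double> alpha(N), next(N);
    double log_likelihood = 0.0;

    for (int t = 0; t < T; ++t) {
        double scale = 0.0;
        for (int j = 0; j < N; ++j) {
            double a;
            if (t == 0) {
                a = pi[j];
            } else {
                a = 0.0;
                for (int i = 0; i < N; ++i) a += alpha[i] * A[i][j];
            }
            next[j] = a * B[t][j];
            scale += next[j];
        }
        // Normalize so alpha sums to 1; the scale factors carry the likelihood.
        for (int j = 0; j < N; ++j) alpha[j] = next[j] / scale;
        log_likelihood += std::log(scale);
    }
    return log_likelihood;
}
```

The inner sums over states are exactly the matrix-vector products that Eigen accelerates, which is where the "Eigen Integration" bullet pays off in practice.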
Safety Features
- Exception Safety: All operations are exception-safe
- Input Validation: Comprehensive parameter validation
- Memory Safety: Smart pointers prevent memory leaks
- Thread Safety: Const methods are thread-safe
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Inspired by the Python hmmlearn library
- Built with Eigen for efficient linear algebra
- Uses modern C++ best practices
Support
For questions, issues, or contributions, please open an issue on GitHub, or email duster.amigos05@gmail.com.
Owner
- Login: duster-amigos
- Kind: user
- Repositories: 1
- Profile: https://github.com/duster-amigos
Citation (CITATION.cff)
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
- family-names: "hmm_cpp"
given-names: "Duster Amigos"
orcid: ""
title: "hmm_cpp: A Complete C++ Implementation of Hidden Markov Models"
version: 1.1.0
doi:
date-released: 2025-07-12
url: "https://github.com/duster-amigos/hmm_cpp"
repository-code: "https://github.com/duster-amigos/hmm_cpp"
license: MIT
keywords:
- "hidden markov models"
- "machine learning"
- "c++"
- "eigen"
- "gaussian mixture models"
- "baum-welch"
- "viterbi"
- "forward-backward"
abstract: "hmm_cpp is a complete C++ implementation of Hidden Markov Models (HMMs) inspired by the Python hmmlearn library. The library provides efficient implementations of Gaussian HMMs, Multinomial HMMs, and GMM HMMs with support for training, inference, and sampling. Built with modern C++17 and Eigen for linear algebra, it offers high performance and cross-platform compatibility."
GitHub Events
Total
- Release event: 1
- Create event: 2
Last Year
- Release event: 1
- Create event: 2
Committers
Last synced: 7 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Shivansh | c****1@s****n | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 7 months ago