hmm_cpp v1.1.0 - Complete Hidden Markov Model Library

A complete C++ implementation of the Python hmmlearn library, featuring modern C++17, Eigen for linear algebra, and comprehensive HMM algorithms.

Overview

This library provides a complete implementation of Hidden Markov Models (HMMs) with various emission models:

  • GaussianHMM: HMM with Gaussian emission distributions
  • MultinomialHMM: HMM with discrete/multinomial emissions
  • GMMHMM: HMM with Gaussian Mixture Model emissions
  • GaussianMixture: Standalone Gaussian Mixture Models

Architecture

```
hmm_c++/
├── include/                      # Header files
│   ├── types.hpp                 # Common type definitions
│   ├── hmm/                      # HMM class headers
│   │   ├── base_hmm.hpp          # Base HMM class
│   │   ├── gaussian_hmm.hpp      # Gaussian HMM
│   │   ├── multinomial_hmm.hpp   # Multinomial HMM
│   │   └── gmm_hmm.hpp           # GMM HMM
│   ├── models/                   # Model headers
│   │   └── gaussian_mixture.hpp  # Gaussian Mixture Model
│   └── algorithms/               # Algorithm headers
│       ├── baum_welch.hpp        # EM training algorithm
│       ├── viterbi.hpp           # Viterbi algorithm
│       └── forward_backward.hpp  # Forward-Backward algorithm
├── src/                          # Source files
│   ├── hmm/                      # HMM implementations
│   │   ├── base_hmm.cpp          # Base HMM implementation
│   │   ├── gaussian_hmm.cpp      # Gaussian HMM implementation
│   │   ├── multinomial_hmm.cpp   # Multinomial HMM implementation
│   │   └── gmm_hmm.cpp           # GMM HMM implementation
│   ├── models/                   # Model implementations
│   │   └── gaussian_mixture.cpp  # Gaussian Mixture implementation
│   └── algorithms/               # Algorithm implementations
│       ├── baum_welch.cpp        # Baum-Welch implementation
│       ├── viterbi.cpp           # Viterbi implementation
│       └── forward_backward.cpp  # Forward-Backward implementation
├── examples/                     # Usage examples
├── tests/                        # Test files
└── CMakeLists.txt                # CMake build configuration
```

Features

Core HMM Classes

  • BaseHMM: Abstract base class with common HMM functionality
  • GaussianHMM: HMM with multivariate Gaussian emissions
  • MultinomialHMM: HMM with discrete emissions
  • GMMHMM: HMM with Gaussian Mixture Model emissions

Algorithms

  • Baum-Welch: Expectation-Maximization training algorithm
  • Viterbi: Most likely hidden state sequence
  • Forward-Backward: State probabilities and likelihood computation
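
As a rough illustration of what the Viterbi step computes (not the library's actual implementation, which is Eigen-based), here is a minimal self-contained log-space Viterbi sketch over a discrete-emission model:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Log-space Viterbi: most likely hidden state path for an observation
// sequence. start[i], trans[i][j], and emit[i][o] are probabilities.
std::vector<int> viterbi(const std::vector<double>& start,
                         const std::vector<std::vector<double>>& trans,
                         const std::vector<std::vector<double>>& emit,
                         const std::vector<int>& obs) {
    const int N = static_cast<int>(start.size());
    const int T = static_cast<int>(obs.size());
    std::vector<std::vector<double>> delta(T, std::vector<double>(N));
    std::vector<std::vector<int>> psi(T, std::vector<int>(N, 0));

    for (int i = 0; i < N; ++i)
        delta[0][i] = std::log(start[i]) + std::log(emit[i][obs[0]]);

    for (int t = 1; t < T; ++t)
        for (int j = 0; j < N; ++j) {
            double best = -1e300;
            int arg = 0;
            for (int i = 0; i < N; ++i) {
                double v = delta[t - 1][i] + std::log(trans[i][j]);
                if (v > best) { best = v; arg = i; }
            }
            delta[t][j] = best + std::log(emit[j][obs[t]]);
            psi[t][j] = arg;  // remember the best predecessor state
        }

    // Backtrack from the best final state.
    std::vector<int> path(T);
    path[T - 1] = static_cast<int>(
        std::max_element(delta[T - 1].begin(), delta[T - 1].end()) -
        delta[T - 1].begin());
    for (int t = T - 2; t >= 0; --t) path[t] = psi[t + 1][path[t + 1]];
    return path;
}
```

Working in log space avoids the numerical underflow that products of many small probabilities would otherwise cause.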

Models

  • GaussianMixture: Standalone Gaussian Mixture Models
  • Support for different covariance types (Full, Diagonal, Spherical)

Modern C++ Features

  • C++17 standard compliance
  • Smart pointers for memory management
  • RAII principles
  • Exception safety
  • Template metaprogramming
  • STL containers and algorithms
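
The smart-pointer/RAII style the list refers to can be sketched as follows; the types here (`Model`, `ToyHMM`, `make_model`) are hypothetical illustrations, not part of the library's API:

```cpp
#include <memory>

// A polymorphic model base, owned through std::unique_ptr so its
// destructor runs automatically when the pointer goes out of scope.
struct Model {
    virtual ~Model() = default;
    virtual int n_states() const = 0;
};

struct ToyHMM : Model {
    int n;
    explicit ToyHMM(int n_states) : n(n_states) {}
    int n_states() const override { return n; }
};

// Factory returning an owning pointer: no manual new/delete needed.
std::unique_ptr<Model> make_model(int n_states) {
    return std::make_unique<ToyHMM>(n_states);
}
```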

Quick Start

Prerequisites

  • C++17 compatible compiler (GCC 7+, Clang 5+, MSVC 2017+)
  • Eigen3 library
  • CMake 3.10+ (optional, for build system)

Installation

  1. Install Eigen3:

     ```bash
     # macOS
     brew install eigen

     # Ubuntu/Debian
     sudo apt-get install libeigen3-dev

     # Windows (vcpkg)
     vcpkg install eigen3
     ```

  2. Clone the repository:

     ```bash
     git clone <repository-url>
     cd hmm_c++
     ```

  3. Build the library:

     ```bash
     # Using CMake (recommended)
     mkdir build && cd build
     cmake ..
     make -j4

     # Or compile manually (Eigen is header-only, so there is nothing to link)
     g++ -std=c++17 -I/usr/local/include -O2 src/*.cpp -o hmm_test
     ```

Basic Usage

```cpp
#include "include/hmm/gaussian_hmm.hpp"
#include <iostream>

using namespace hmmlearn_cpp;

int main() {
    // Create a Gaussian HMM with 3 states and 2 features
    GaussianHMM hmm(3, 2, CovarianceType::FULL);

    // Generate or load your data
    Matrix X(1000, 2);  // Your data here

    // Train the HMM
    TrainingParams params;
    params.max_iter = 100;
    params.tol = 1e-4;

    TrainingResult result = hmm.fit(X, {}, params);

    // Make predictions
    IntVector states = hmm.predict(X);
    Matrix state_probs = hmm.predict_proba(X);

    // Generate samples
    Matrix samples = hmm.sample(100);

    std::cout << "Training converged: " << result.converged << std::endl;
    std::cout << "Final log-likelihood: " << result.final_log_likelihood << std::endl;

    return 0;
}
```

API Reference

GaussianHMM

```cpp
class GaussianHMM : public BaseHMM {
public:
    // Constructor
    GaussianHMM(int n_components, int n_features,
                CovarianceType covariance_type = CovarianceType::FULL,
                unsigned int random_state = 42);

// Training
TrainingResult fit(const Matrix& X, const std::vector<int>& lengths, 
                  const TrainingParams& params);

// Prediction
IntVector predict(const Matrix& X) const;
Matrix predict_proba(const Matrix& X) const;

// Sampling
Matrix sample(int n_samples) const;

// Scoring
Scalar score(const Matrix& X) const;

// Parameter access
Vector get_startprob() const;
Matrix get_transmat() const;
Matrix get_means() const;
std::vector<Matrix> get_covariances() const;

};
```

MultinomialHMM

```cpp
class MultinomialHMM : public BaseHMM {
public:
    // Constructor
    MultinomialHMM(int n_components, int n_features,
                   unsigned int random_state = 42);

// Same interface as GaussianHMM
// Emission parameters are discrete probability distributions

};
```

GMMHMM

```cpp
class GMMHMM : public BaseHMM {
public:
    // Constructor
    GMMHMM(int n_components, int n_features, int n_mix,
           CovarianceType covariance_type = CovarianceType::FULL,
           unsigned int random_state = 42);

// Additional method to extract GMMs
std::vector<GaussianMixture> get_gmms() const;

};
```

GaussianMixture

```cpp
class GaussianMixture {
public:
    // Constructor
    GaussianMixture(int n_components, int n_features,
                    CovarianceType covariance_type = CovarianceType::FULL);

// Training
void fit(const Matrix& X);

// Prediction
Matrix predict_proba(const Matrix& X) const;
IntVector predict(const Matrix& X) const;

// Sampling
Matrix sample(int n_samples) const;

// Scoring
Scalar score(const Matrix& X) const;
Matrix score_samples(const Matrix& X) const;

};
```
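
`predict_proba` for a mixture returns per-sample component responsibilities, which are conventionally computed in log space for numerical stability. The following is an illustrative sketch of that idea, not the library's actual implementation:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Numerically stable log(sum(exp(x_i))): shift by the maximum first.
double log_sum_exp(const std::vector<double>& x) {
    double m = x[0];
    for (double v : x) m = std::max(m, v);
    double s = 0.0;
    for (double v : x) s += std::exp(v - m);
    return m + std::log(s);
}

// Responsibilities for one sample, given log(weight_k * density_k(x))
// per component: normalize in log space, then exponentiate.
std::vector<double> responsibilities(const std::vector<double>& log_prob) {
    double z = log_sum_exp(log_prob);
    std::vector<double> r(log_prob.size());
    for (std::size_t k = 0; k < log_prob.size(); ++k)
        r[k] = std::exp(log_prob[k] - z);
    return r;  // entries sum to 1
}
```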

Configuration

Training Parameters

```cpp
struct TrainingParams {
    int max_iter = 100;              // Maximum iterations
    Scalar tol = 1e-4;               // Convergence tolerance
    bool verbose = false;            // Verbose output
    unsigned int random_state = 42;  // Random seed
};
```

Covariance Types

```cpp
enum class CovarianceType {
    FULL,       // Full covariance matrices
    DIAGONAL,   // Diagonal covariance matrices
    SPHERICAL   // Spherical covariance matrices
};
```
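
The three types trade expressiveness for parameter count: for d features, a full covariance has d(d+1)/2 free parameters per state, a diagonal one has d, and a spherical one has a single shared variance. A small self-contained helper (re-declaring the enum for illustration only) makes this concrete:

```cpp
enum class CovarianceType { FULL, DIAGONAL, SPHERICAL };

// Free covariance parameters per state/component for d features.
int cov_params(CovarianceType t, int d) {
    switch (t) {
        case CovarianceType::FULL:      return d * (d + 1) / 2;  // symmetric matrix
        case CovarianceType::DIAGONAL:  return d;                // one variance per feature
        case CovarianceType::SPHERICAL: return 1;                // single shared variance
    }
    return 0;
}
```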

Testing

Run the comprehensive test suite:

```bash
# Basic structure test (no Eigen required)
g++ -std=c++17 test_basic_structure.cpp -o test_basic_structure
./test_basic_structure

# Complete library test (requires Eigen; header-only, nothing to link)
g++ -std=c++17 -I/usr/local/include test_complete_library.cpp src/*.cpp -o test_complete_library
./test_complete_library
```

Performance

The library is optimized for performance:

  • Eigen Integration: Leverages Eigen's highly optimized linear algebra
  • Memory Efficiency: Smart pointers and RAII for automatic memory management
  • Algorithmic Optimizations: Efficient implementations of Baum-Welch, Viterbi, and Forward-Backward
  • SIMD Support: Eigen provides SIMD optimizations where available

Safety Features

  • Exception Safety: All operations are exception-safe
  • Input Validation: Comprehensive parameter validation
  • Memory Safety: Smart pointers prevent memory leaks
  • Thread Safety: Const methods are thread-safe

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass
  6. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Inspired by the Python hmmlearn library
  • Built with Eigen for efficient linear algebra
  • Uses modern C++ best practices

Support

For questions, issues, or contributions, please open an issue on GitHub, or email duster.amigos05@gmail.com.


Published by duster-amigos 8 months ago