full-cov-mdn

Mixture Density Network in PyTorch with full covariance support.

https://github.com/haimengzhao/full-cov-mdn

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: zenodo.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (10.2%) to scientific vocabulary
Last synced: 6 months ago

Repository

Mixture Density Network in PyTorch with full covariance support.

Basic Info
  • Host: GitHub
  • Owner: haimengzhao
  • License: MIT
  • Language: Jupyter Notebook
  • Default Branch: main
  • Size: 29.3 KB
Statistics
  • Stars: 4
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 1
Created almost 4 years ago · Last pushed almost 4 years ago
Metadata Files
Readme · License · Citation

README.md

Mixture Density Network in PyTorch with Full Covariance


Implementation of Mixture Density Network in PyTorch with full covariance matrix support.

The full covariance matrix is implemented via Cholesky decomposition with torch.distributions.MultivariateNormal. See this document for details.
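A minimal sketch of the technique the README names (not the repo's actual code): unconstrained network outputs are packed into a lower-triangular Cholesky factor `L` with a positive diagonal, so that `Sigma = L @ L.T` is symmetric positive definite by construction, and `L` is passed to `torch.distributions.MultivariateNormal` via `scale_tril`. The shapes and the `softplus` diagonal transform here are illustrative assumptions.

```python
import torch
from torch.distributions import MultivariateNormal

# Hypothetical sketch: build a valid full covariance from unconstrained outputs.
batch, dim = 5, 2
raw = torch.randn(batch, dim * (dim + 1) // 2)  # one output per lower-triangular entry

tril_idx = torch.tril_indices(dim, dim)
L = torch.zeros(batch, dim, dim)
L[:, tril_idx[0], tril_idx[1]] = raw

# softplus keeps the diagonal strictly positive, making L a valid Cholesky factor
diag = torch.arange(dim)
L[:, diag, diag] = torch.nn.functional.softplus(L[:, diag, diag]) + 1e-6

mu = torch.randn(batch, dim)
dist = MultivariateNormal(loc=mu, scale_tril=L)  # accepts the Cholesky factor directly

log_p = dist.log_prob(torch.randn(batch, dim))  # one log density per batch element
```

Passing `scale_tril` instead of `covariance_matrix` avoids a redundant Cholesky decomposition inside `MultivariateNormal`.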

Citation

If you find this repository useful, please cite it using the citation button GitHub provides in the right column.

Usage

```python
import torch
from mdn import MixtureDensityNetwork

x = torch.randn(5, 1)
data = torch.randn(5, 2)

# 1D input, 2D output, 2 mixture components
model = MixtureDensityNetwork(
    dim_in=1,
    dim_out=2,
    n_components=2,
    full_cov=True,  # whether to use a full covariance; default full_cov=True
)

# returns predicted pi and normal distributions
pi, normal = model(x)

# compute negative log likelihood
# as loss function for backprop
loss = model.loss(x, data).mean()

# use this to sample a trained model
samples = model.sample(x)
```
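For intuition, here is a hedged, self-contained sketch of how mixture weights `pi` and component Gaussians combine into a negative log likelihood; `model.loss` presumably computes something equivalent internally, but the shapes and distribution choices below are illustrative assumptions, not the repo's code.

```python
import torch
from torch.distributions import Categorical, MixtureSameFamily, MultivariateNormal

# Illustrative shapes: 5 samples, 2 mixture components, 2D output.
batch, n_comp, dim = 5, 2, 2
pi = torch.softmax(torch.randn(batch, n_comp), dim=-1)       # mixture weights
mu = torch.randn(batch, n_comp, dim)                         # component means
scale_tril = torch.eye(dim).expand(batch, n_comp, dim, dim)  # identity Cholesky factors

# MixtureSameFamily marginalizes over the component index in log space.
mixture = MixtureSameFamily(
    Categorical(probs=pi),
    MultivariateNormal(loc=mu, scale_tril=scale_tril),
)

y = torch.randn(batch, dim)
nll = -mixture.log_prob(y)  # per-sample negative log likelihood
loss = nll.mean()
```

Marginalizing inside `log_prob` (a log-sum-exp over components) is numerically safer than summing component densities by hand.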

Example

See example.ipynb for training a 2-component, full-covariance MDN on the following data:

$$x\sim\text{Uniform}(0, 1)$$

and

$$\mathbb{R}^2\ni\text{data}\sim\text{the following figure}$$

[Figure: the training data]

Note that an MDN with two diagonal-covariance components can never recover such data.
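The point above can be checked directly: on strongly correlated 2D data, a full-covariance Gaussian fit achieves a strictly higher likelihood than a diagonal one, because a diagonal covariance cannot represent the off-diagonal term. This toy comparison is not from the repo; it just fits single-Gaussian maximum-likelihood estimates with PyTorch.

```python
import torch
from torch.distributions import MultivariateNormal

torch.manual_seed(0)
n = 2000
z = torch.randn(n, 1)
data = torch.cat([z, z + 0.1 * torch.randn(n, 1)], dim=1)  # nearly perfectly correlated

mu = data.mean(0)
centered = data - mu
full_cov = centered.T @ centered / n                     # MLE full covariance
diag_cov = torch.diag(centered.var(0, unbiased=False))   # MLE diagonal covariance

ll_full = MultivariateNormal(mu, covariance_matrix=full_cov).log_prob(data).mean()
ll_diag = MultivariateNormal(mu, covariance_matrix=diag_cov).log_prob(data).mean()
# ll_full exceeds ll_diag: the diagonal model ignores the correlation entirely
```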

Reference

The code structure follows this repo, which only supports diagonal covariances.

Owner

  • Name: Haimeng Zhao
  • Login: haimengzhao
  • Kind: user
  • Location: Beijing & Shanghai, China
  • Company: Tsinghua University

Undergraduate interested in Quantum Info and AI+Physics

Citation (CITATION.cff)

# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: >-
  Mixture Density Network in PyTorch with Full
  Covariance
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Haimeng
    family-names: Zhao
    email: haimengzhao@icloud.com
    affiliation: Tsinghua University
    orcid: 'https://orcid.org/0000-0001-6675-1489'
identifiers:
  - type: doi
    value: 10.5281/zenodo.6472171

GitHub Events

Total
  • Watch event: 1
Last Year
  • Watch event: 1