https://github.com/dptam/pytorchdiscreteflows

Discrete Normalizing Flows implemented in PyTorch




Basic Info
  • Host: GitHub
  • Owner: dptam
  • Language: Jupyter Notebook
  • Default Branch: master
  • Size: 2.89 MB
Statistics
  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Fork of TrentBrick/PyTorchDiscreteFlows
Created about 5 years ago · Last pushed about 5 years ago


## Acknowledgements

The discrete normalizing flow code was originally taken and modified from:
https://github.com/google/edward2/blob/master/edward2/tensorflow/layers/discrete_flows.py
and https://github.com/google/edward2/blob/master/edward2/tensorflow/layers/utils.py,
which were introduced in the paper "Discrete Flows: Invertible Generative Models of Discrete Data": https://arxiv.org/abs/1905.10347

The demo file, MADE, and MLP were taken and modified from: https://github.com/karpathy/pytorch-normalizing-flows

## State of Library

To my knowledge, as of July 3rd 2020 this is the only functional demo of discrete normalizing flows in PyTorch. The code in edward2 (implemented in TF2 and Keras) lacks any tutorials. Since the release of this repo, and thanks to correspondence with the authors of the original paper, demo code for reproducing Figure 2 using Edward2 has been shared [here](https://github.com/google/edward2/blob/a0f683ffc549add74d82405bc81073b7162cd408/examples/quantized_ring_of_gaussians.py).

With collaboration from [Yashas Annadani](https://github.com/yannadani) and Jan Francu, I have been able to reproduce the paper's Figure 2 discretized mixture of Gaussians with this library.

## Use Library

To use this package, clone the repo, satisfy the package requirements below, then run Figure2Replication.ipynb.

Requirements:
* Python 3.0+
* PyTorch 1.2.0+
* NumPy 1.17.2+

## Implementation details
N.B. Following Andrej Karpathy's notation, flow.reverse() maps from the latent space to the data and flow.forward() maps from the data to the latent space. This is the inverse of some other implementations, including the original TensorFlow one.
Implements bipartite and autoregressive discrete normalizing flows. Also includes an implementation of MADE and a simple MLP.
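The key idea behind these flows is that a bipartite coupling layer shifts half of a discrete input modulo the vocabulary size K by a function of the other half, which is exactly invertible. The following is a minimal plain-Python sketch of that transform using this repo's convention (forward: data to latent; reverse: latent to data); the function names and the toy `shift_fn` are illustrative, not the library's actual API, and in the real model the shift comes from a neural network's output:

```python
K = 8  # vocabulary size (illustrative)

def shift_fn(x1):
    # Stand-in for the coupling network: any deterministic
    # function of the untouched half works for invertibility.
    return [(v * 3 + 1) % K for v in x1]

def forward(x):
    # Data -> latent (this repo's convention): shift the second
    # half modulo K by a function of the first half.
    half = len(x) // 2
    x1, x2 = x[:half], x[half:]
    t = shift_fn(x1)
    z2 = [(a + b) % K for a, b in zip(x2, t)]
    return x1 + z2

def reverse(z):
    # Latent -> data: subtract the same shift modulo K.
    half = len(z) // 2
    z1, z2 = z[:half], z[half:]
    t = shift_fn(z1)
    x2 = [(a - b) % K for a, b in zip(z2, t)]
    return z1 + x2

x = [0, 5, 3, 7]
assert reverse(forward(x)) == x  # exactly invertible
```

Because the first half passes through unchanged, the reverse pass can recompute the identical shift and undo it, with no continuous relaxation needed at inference time.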

## TODOs - Pull requests very welcome!
* Write a testing script to ensure all of the models are indeed invertible.
* Reproduce the remaining figures/results from the original paper, starting with the Potts models.
* Implement the Sinkhorn autoregressive flow: https://github.com/google/edward2/blob/master/edward2/tensorflow/layers/discrete_flows.py#L373
* Add non-block splitting for bipartite flows.
* Ensure that the scaling functionality works (this should not matter for reproducing the first few figures).
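A minimal version of the first TODO, a round-trip invertibility check over random discrete inputs, could look like the sketch below. The `forward`/`reverse` callables are placeholders for whichever flow model is under test; the identity "flow" used in the example is only a stand-in:

```python
import random

def check_invertible(forward, reverse, K=8, dim=6, trials=100, seed=0):
    # Round-trip random discrete vectors through the flow and
    # report whether every one is recovered exactly.
    rng = random.Random(seed)
    for _ in range(trials):
        x = [rng.randrange(K) for _ in range(dim)]
        if reverse(forward(x)) != x:
            return False
    return True

# Identity "flow" as a placeholder for a real model:
assert check_invertible(lambda x: list(x), lambda z: list(z))
```

A real test script would instantiate each flow class in the library and run this check on integer tensors of the appropriate shape.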

## Replication of Figure 2 Mixture of Gaussians

Figure 2 in the [paper](https://arxiv.org/abs/1905.10347) looks like this:

![PaperFigure2](figures/Figure2FromPaper.png)

This library's replication is:

![Fig2Reproduction](figures/Fig2Reproduce.png)

Owner

  • Name: Derek Tam
  • Login: dptam
  • Kind: user
