https://github.com/cloneofsimo/consistency_models

Unofficial Implementation of Consistency Models in pytorch

Science Score: 36.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
  • DOI references
    Found 6 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (10.3%) to scientific vocabulary

Keywords

generative-model pytorch
Last synced: 5 months ago

Repository

Unofficial Implementation of Consistency Models in pytorch

Basic Info
Statistics
  • Stars: 254
  • Watchers: 14
  • Forks: 11
  • Open Issues: 3
  • Releases: 0
Topics
generative-model pytorch
Created almost 3 years ago · Last pushed almost 3 years ago
Metadata Files
Readme

README.md

Consistency Models

30 epochs, consistency model with 2-step sampling, using $t_1 = 2, t_2 = 80$.

30 epochs, consistency model with 5-step sampling, using $t_i \in \{5, 10, 20, 40, 80\}$.

Unofficial implementation of Consistency Models (paper) in PyTorch.

Three days ago, the legendary Yang Song released an entirely new class of generative models, called consistency models. There aren't any open implementations yet, so here is my attempt at one.

What are they?

Diffusion models are amazing because they let you sample high-fidelity + high-diversity images. The downside is that you need many sampling steps, typically at least 20.

Progressive Distillation (Salimans & Ho, 2022) addresses this by distilling 2 steps of the diffusion model down to a single step. Doing this $N$ times boosts sampling speed by $2^N$. But is this the only way? Do we need to train a diffusion model and distill it $N$ times? Yang didn't think so. Consistency models solve this by training a model to produce consistent denoised outputs across different timesteps. (OK, I'm obviously simplifying.)

Usage

Install the package with

```bash
pip install git+https://github.com/cloneofsimo/consistency_models.git
```

This repo mainly implements consistency training:

$$ L(\theta) = \mathbb{E}\left[ d\left( f_\theta(x + t_{n+1} z,\, t_{n+1}),\; f_{\theta^-}(x + t_n z,\, t_n) \right) \right] $$
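The loss above translates almost line-for-line into PyTorch. Below is a minimal, self-contained sketch of a consistency-training step, not this repo's actual API: `model` and `ema_model` are assumed to be any denoiser taking `(x, t)`, `timesteps` is the discretized schedule $t_1 < \dots < t_N$ as a 1-D tensor, and the distance $d$ is taken to be plain MSE.

```python
import torch
import torch.nn.functional as F

def consistency_training_step(model, ema_model, x, timesteps, ema_decay=0.999):
    """One consistency-training step (sketch; names are assumptions).

    The online model denoises x perturbed to the noisier level t_{n+1};
    the EMA target model (theta^-) denoises the same x perturbed to t_n,
    using the *same* noise z, and the two outputs are pulled together.
    """
    # Pick a random adjacent timestep pair (t_n, t_{n+1}) per sample.
    n = torch.randint(0, len(timesteps) - 1, (x.shape[0],), device=x.device)
    t_n = timesteps[n].view(-1, 1, 1, 1)
    t_np1 = timesteps[n + 1].view(-1, 1, 1, 1)

    z = torch.randn_like(x)  # shared Gaussian noise for both levels
    pred = model(x + t_np1 * z, t_np1.flatten())
    with torch.no_grad():
        target = ema_model(x + t_n * z, t_n.flatten())

    loss = F.mse_loss(pred, target)  # d(., .) chosen as MSE here
    loss.backward()  # optimizer step / zero_grad left to the caller

    # Update theta^- as an exponential moving average of theta:
    with torch.no_grad():
        for p, p_ema in zip(model.parameters(), ema_model.parameters()):
            p_ema.mul_(ema_decay).add_(p, alpha=1 - ema_decay)
    return loss
```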

And sampling:

$$
\begin{align}
z &\sim \mathcal{N}(0, I) \\
x &\leftarrow x + \sqrt{t_n^2 - \epsilon^2}\, z \\
x &\leftarrow f_\theta(x, t_n)
\end{align}
$$
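And a matching sketch of the multistep sampler described by the pseudocode above; again the names and signature are assumptions, not this repo's API. `timesteps` is a sequence of floats from largest to smallest (e.g. the $[80, 40, 20, 10, 5]$ schedule from the figure captions), and `eps` is the small minimum noise level (the paper uses $\epsilon = 0.002$).

```python
import torch

@torch.no_grad()
def multistep_consistency_sample(model, shape, timesteps, eps=0.002, device="cpu"):
    """Multistep consistency sampling (sketch; names are assumptions)."""
    batch = shape[0]
    # Start from pure noise at the largest noise level, denoise once.
    x = torch.randn(shape, device=device) * timesteps[0]
    x = model(x, torch.full((batch,), timesteps[0], device=device))

    for t_n in timesteps[1:]:
        z = torch.randn(shape, device=device)
        x = x + (t_n ** 2 - eps ** 2) ** 0.5 * z                 # re-noise to level t_n
        x = model(x, torch.full((batch,), t_n, device=device))   # denoise in one shot
    return x
```

Each iteration perturbs the current estimate back up to noise level $t_n$ and denoises it in a single network evaluation, so the total step count equals the length of the schedule.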

There is a self-contained MNIST training example in main.py at the repository root.

```bash
python main.py
```

Todo

  • [x] EMA
  • [x] CIFAR10 Example
  • [x] Samples are sooo fuzzy... try to get a crisp result.
  • [ ] Consistency Distillation

Reference

```bibtex
@misc{https://doi.org/10.48550/arxiv.2303.01469,
  doi       = {10.48550/ARXIV.2303.01469},
  url       = {https://arxiv.org/abs/2303.01469},
  author    = {Song, Yang and Dhariwal, Prafulla and Chen, Mark and Sutskever, Ilya},
  keywords  = {Machine Learning (cs.LG), Computer Vision and Pattern Recognition (cs.CV), Machine Learning (stat.ML), FOS: Computer and information sciences},
  title     = {Consistency Models},
  publisher = {arXiv},
  year      = {2023},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```

```bibtex
@misc{https://doi.org/10.48550/arxiv.2202.00512,
  doi       = {10.48550/ARXIV.2202.00512},
  url       = {https://arxiv.org/abs/2202.00512},
  author    = {Salimans, Tim and Ho, Jonathan},
  keywords  = {Machine Learning (cs.LG), Artificial Intelligence (cs.AI), Machine Learning (stat.ML), FOS: Computer and information sciences},
  title     = {Progressive Distillation for Fast Sampling of Diffusion Models},
  publisher = {arXiv},
  year      = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```

Owner

  • Name: Simo Ryu
  • Login: cloneofsimo
  • Kind: user
  • Company: Corca AI

  • Bio: Cats are Turing machines
  • Email: cloneofsimo@gmail.com

GitHub Events

Total
  • Watch event: 14
Last Year
  • Watch event: 14

Issues and Pull Requests

Last synced: 10 months ago

All Time
  • Total issues: 5
  • Total pull requests: 1
  • Average time to close issues: 5 months
  • Average time to close pull requests: 1 day
  • Total issue authors: 5
  • Total pull request authors: 1
  • Average comments per issue: 1.0
  • Average comments per pull request: 4.0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • JunyaoHu (1)
  • kimihailv (1)
  • cantabile-kwok (1)
  • Njasa2k (1)
  • discordance (1)
Pull Request Authors
  • tcapelle (1)