evidential-deep-learning
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!
Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file (found)
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ○ Academic publication links
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (15.0%) to scientific vocabulary
Keywords
Repository
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!
Basic Info
- Host: GitHub
- Owner: aamini
- License: apache-2.0
- Language: Python
- Default Branch: main
- Homepage: https://proceedings.neurips.cc/paper/2020/file/aab085461de182608ee9f607f3f7d18f-Paper.pdf
- Size: 9.39 MB
Statistics
- Stars: 491
- Watchers: 15
- Forks: 98
- Open Issues: 16
- Releases: 4
Topics
Metadata Files
README.md
Evidential Deep Learning
"All models are wrong, but some — that know when they can be trusted — are useful!"
- George Box (Adapted)

This repository contains the code to reproduce Deep Evidential Regression, as published in NeurIPS 2020, as well as more general code to leverage evidential learning to train neural networks to learn their own measures of uncertainty directly from data!
Setup
To use this package, you must install the following dependencies first:
- python (>= 3.7)
- tensorflow (>= 2.0)
- pytorch (support coming soon)
Now you can install the package and start adding evidential layers and losses to your models!
pip install evidential-deep-learning
Now you're ready to start using this package directly as part of your existing tf.keras model pipelines (Sequential, Functional, or model-subclassing):
```
import evidential_deep_learning as edl
```
Example
To use evidential deep learning, you must edit the last layer of your model to be evidential and use a supported loss function to train the system end-to-end. This repository supports evidential layers for both fully connected and convolutional (2D) layers. The evidential prior distribution presented in the paper follows a Normal Inverse-Gamma and can be added to your model:
```
import evidential_deep_learning as edl
import tensorflow as tf

model = tf.keras.Sequential(
    [
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        edl.layers.DenseNormalGamma(1),  # Evidential distribution!
    ]
)
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss=edl.losses.EvidentialRegression,  # Evidential loss!
)
```
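At inference time, the evidential output layer produces four Normal Inverse-Gamma parameters (γ, ν, α, β) per target. As a minimal pure-Python sketch (independent of the library, using the decomposition given in the paper), these can be turned into a point prediction plus aleatoric and epistemic uncertainty:

```python
import math


def nig_uncertainty(gamma, nu, alpha, beta):
    """Decompose Normal Inverse-Gamma parameters into prediction and uncertainties.

    Per the Deep Evidential Regression paper:
      prediction          E[mu]      = gamma
      aleatoric (data)    E[sigma^2] = beta / (alpha - 1)
      epistemic (model)   Var[mu]    = beta / (nu * (alpha - 1))
    """
    assert alpha > 1, "uncertainty is only finite for alpha > 1"
    aleatoric = beta / (alpha - 1)
    epistemic = beta / (nu * (alpha - 1))
    return gamma, aleatoric, epistemic


pred, aleatoric, epistemic = nig_uncertainty(gamma=0.0, nu=2.0, alpha=3.0, beta=1.0)
# aleatoric = 1 / 2 = 0.5, epistemic = 1 / (2 * 2) = 0.25
```

In the Keras model above, these four parameters are what `DenseNormalGamma(1)` emits for each output dimension.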
Check out hello_world.py for an end-to-end toy example that walks through this step by step. For more complex examples that scale up to computer vision problems (where we learn to predict tens of thousands of evidential distributions simultaneously!), please refer to the NeurIPS 2020 paper and the reproducibility section of this repo to run those examples.
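For intuition, the evidential regression loss from the paper combines the negative log-likelihood of the Student-t predictive implied by the NIG prior with an evidence regularizer on errors. A minimal pure-Python sketch of those two terms (illustrative only, not the library's exact implementation):

```python
import math


def nig_nll(y, gamma, nu, alpha, beta):
    """Negative log-likelihood of the Student-t predictive of a NIG prior (paper Eq. 8)."""
    omega = 2.0 * beta * (1.0 + nu)
    return (0.5 * math.log(math.pi / nu)
            - alpha * math.log(omega)
            + (alpha + 0.5) * math.log((y - gamma) ** 2 * nu + omega)
            + math.lgamma(alpha) - math.lgamma(alpha + 0.5))


def nig_regularizer(y, gamma, nu, alpha):
    """Penalize predicted evidence (2*nu + alpha) in proportion to the error |y - gamma|."""
    return abs(y - gamma) * (2.0 * nu + alpha)


# Total per-sample loss = nig_nll(...) + lam * nig_regularizer(...),
# where lam is a tunable regularization coefficient.
```

The regularizer vanishes when the prediction is exact, so confident (high-evidence) outputs are only penalized where the model is wrong.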
Reproducibility
All of the results published as part of our NeurIPS paper can be reproduced as part of this repository. Please refer to the reproducibility section for details and instructions to obtain each result.
Citation
If you use this code for evidential learning as part of your project or paper, please cite the following work:
@article{amini2020deep,
title={Deep evidential regression},
author={Amini, Alexander and Schwarting, Wilko and Soleimany, Ava and Rus, Daniela},
journal={Advances in Neural Information Processing Systems},
volume={33},
year={2020}
}
Owner
- Name: Alexander Amini
- Login: aamini
- Kind: user
- Website: http://www.mit.edu/~amini
- Repositories: 4
- Profile: https://github.com/aamini
Citation (CITATION.cff)
cff-version: "1.1.0"
message: "If you use this software, please cite it using these metadata."
title: "Deep Evidential Regression"
authors:
  - family-names: Amini
    given-names: Alexander
    orcid: "https://orcid.org/0000-0002-9673-1267"
  - family-names: Schwarting
    given-names: Wilko
  - family-names: Soleimany
    given-names: Ava
    orcid: "https://orcid.org/0000-0002-8601-6040"
  - family-names: Rus
    given-names: Daniela
conference: "Advances in Neural Information Processing Systems (NeurIPS)"
year: 2020
volume: 33
repository-code: "https://github.com/aamini/evidential-deep-learning"
GitHub Events
Total
- Watch event: 54
- Issue comment event: 1
- Fork event: 2
Last Year
- Watch event: 54
- Issue comment event: 1
- Fork event: 2
Committers
Last synced: 9 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Alexander Amini | x****i@g****m | 21 |
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 16
- Total pull requests: 3
- Average time to close issues: 11 days
- Average time to close pull requests: over 1 year
- Total issue authors: 16
- Total pull request authors: 3
- Average comments per issue: 1.13
- Average comments per pull request: 2.67
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- koo-ec (1)
- XxidroxX (1)
- PrinceDouble (1)
- jw3126 (1)
- TGISer (1)
- ml-mountainman (1)
- sabertwirl (1)
- Emmanuel-Messulam (1)
- PeterPirog (1)
- HareshKarnan (1)
- nilsleh (1)
- benjiachong (1)
- AdithyaVenkateshMohan (1)
- kpqn (1)
- muammar (1)
Pull Request Authors
- Kelvinthedrugger (1)
- Dariusrussellkish (1)
- wanzysky (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads (pypi): 339 last month
- Total dependent packages: 0
- Total dependent repositories: 2
- Total versions: 3
- Total maintainers: 1
pypi.org: evidential-deep-learning
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!
- Homepage: https://github.com/aamini/evidential-deep-learning
- Documentation: https://evidential-deep-learning.readthedocs.io/
- License: Apache License 2.0
- Latest release: 0.4.0 (published about 5 years ago)