ndd
Bayesian entropy estimation in Python - via the Nemenman-Shafee-Bialek algorithm
Science Score: 10.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 14.1%, to scientific vocabulary)
Repository
Bayesian entropy estimation in Python - via the Nemenman-Shafee-Bialek algorithm
Basic Info
Statistics
- Stars: 52
- Watchers: 1
- Forks: 4
- Open Issues: 3
- Releases: 0
Metadata Files
README.rst
====================================================
ndd - Bayesian entropy estimation from discrete data
====================================================
.. image:: https://badge.fury.io/py/ndd.svg
:target: https://badge.fury.io/py/ndd
.. image:: https://travis-ci.com/simomarsili/ndd.svg?branch=master
:target: https://travis-ci.com/simomarsili/ndd
``ndd`` is a Python package for Bayesian entropy estimation from discrete
data. ``ndd`` provides the ``ndd.entropy`` function, a Bayesian replacement
for the ``scipy.stats.entropy`` function from the SciPy library,
based on an efficient implementation of the Nemenman-Shafee-Bialek (NSB)
algorithm (Nemenman et al. 2002; see the References section).
Remarkably, the NSB algorithm allows entropy estimation when the number of
samples is much smaller than the number of classes with non-zero probability.
Basic usage
===========
The ``entropy`` function takes as input a vector of **frequency counts**
(the observed frequencies for a set of classes or states) and an **alphabet size**
(the number of classes with non-zero probability, including unobserved classes)
and returns an entropy estimate (in nats)::
>>> import ndd
>>> counts = [4, 12, 4, 5, 3, 1, 5, 1, 2, 2, 2, 2, 11, 3, 4, 12, 12, 1, 2]
>>> ndd.entropy(counts, k=100)
2.8060922529931225
The uncertainty in the entropy estimate can be quantified using the
posterior standard deviation (see Eq. 13 in Archer et al. 2013, cited in
the References section)::
>>> ndd.entropy(counts, k=100, return_std=True)
(2.8060922529931225, 0.11945501149743358)
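A different estimator can be selected through the ``estimator`` keyword
argument (see the Changes section below for the full signature and the list
of available estimators). As a minimal sketch, continuing the session above:
the plugin (maximum-likelihood) estimate is expected to underestimate the
entropy in this undersampled regime::
>>> ndd.entropy(counts, k=100, estimator='Plugin')  # doctest: +SKIP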
If the alphabet size is unknown or countably infinite, the ``k`` argument can
be omitted: the ``entropy`` function will either use an upper bound estimate
for ``k``, or switch to the asymptotic NSB estimator for strongly undersampled
distributions (Equations 29 and 30 in Nemenman 2011, cited in the References
section)::
>>> import ndd
>>> counts = [4, 12, 4, 5, 3, 1, 5, 1, 2, 2, 2, 2, 11, 3, 4, 12, 12, 1, 2]
>>> ndd.entropy(counts) # k is omitted
2.8130746489179046
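In practice, frequency counts are often computed from raw categorical
samples, e.g. with ``collections.Counter`` from the standard library; a
minimal sketch (the ``samples`` data below are made up for illustration)::
>>> from collections import Counter
>>> import ndd
>>> samples = ['a', 'b', 'a', 'c', 'b', 'a', 'd', 'a', 'b', 'c']
>>> counts = list(Counter(samples).values())  # observed frequency counts
>>> counts
[4, 3, 2, 1]
>>> ndd.entropy(counts, k=4)  # doctest: +SKIP, k=4 assumes no unobserved classes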
Where to get it
===============
**conda**
---------
The easiest way to install ``ndd`` is via the ``conda`` package manager.
Packages are provided on the ``conda-forge`` Anaconda Cloud channel for the
Linux, macOS, and Windows platforms.
Install the latest stable release using ``conda`` with::
conda install --channel conda-forge ndd
**pip**
-------
Install using pip with::
pip3 install -U ndd
or directly from the sources on GitHub, for the latest version of the code::
pip3 install git+https://github.com/simomarsili/ndd.git
In order to build ``ndd`` with ``pip``, you will need ``numpy`` (>= 1.13) and a
**Fortran compiler** installed on your machine.
If you are using Debian or a Debian derivative such as Ubuntu,
you can install the gfortran compiler using the following command::
sudo apt-get install gfortran
On Windows, you can use the gfortran compiler from the
`MinGW-w64 <https://www.mingw-w64.org/>`_ project; an installer is
available from the project site.
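After installing, a quick smoke test can confirm that the package imports
and that its compiled extension works; this one-liner uses only the
``entropy`` function shown above::
python3 -c "import ndd; print(ndd.entropy([1, 2, 3, 4], k=10))"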
Changes
=======
**v1.10.5**
Added ``ndd`` to the ``conda-forge`` Anaconda channel.
**v1.10**
Changed:
the signature of the ``entropy`` function is now::
entropy(nk, k=None, estimator=None, return_std=False)
**v1.9**
Changed:
if argument ``k`` is omitted, the ``entropy`` function will guess a
reasonable alphabet size and select the best estimator for the sampling
regime.
**v1.8.3**
Fixed:
integration for huge cardinalities
**v1.8**
Added:
full Bayesian error estimate (from direct computation of the posterior
variance of the entropy)
**v1.7**
Changed:
- estimation is much faster (removed unnecessary checks on input counts)
- the ``entropy()`` function needs the cardinality ``k`` for the default
  (NSB) estimator
**v1.6.1**
Fixed:
numerical integration for large alphabet sizes.
**v1.6**
Changed:
The signature of the ``entropy`` function has been changed to allow
arbitrary entropy estimators. The new signature is::
entropy(pk, k=None, estimator='NSB', return_std=False)
The available estimators are::
>>> import ndd
>>> ndd.entropy_estimators
['Plugin', 'MillerMadow', 'NSB', 'AsymptoticNSB', 'Grassberger']
Check the function docstring for details; a usage sketch follows the list
of additions below.
Added:
- *MillerMadow* estimator class
- *AsymptoticNSB* estimator class
- *Grassberger* estimator class
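As a sketch, the estimators can be compared on the same vector of counts
(this loop assumes that each name in ``ndd.entropy_estimators`` is a valid
value for the ``estimator`` argument and that every estimator accepts the
``k`` argument; check the docstrings)::
>>> import ndd
>>> counts = [4, 12, 4, 5, 3, 1, 5, 1, 2, 2, 2, 2, 11, 3, 4, 12, 12, 1, 2]
>>> for name in ndd.entropy_estimators:  # doctest: +SKIP
...     print(name, ndd.entropy(counts, k=100, estimator=name))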
**v1.5**
For methods/functions working on data matrices:
the default input is a **n-by-p** 2D array (n samples from p discrete
variables, with different samples on different **rows**).
From release 1.3 up to this release, the default had been a transposed
(**p-by-n**) data matrix.
The behavior of functions taking frequency counts as input
(e.g. the *entropy* function) is unchanged.
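A minimal sketch of the n-by-p shape convention; the ``ndd.from_data`` call
below is hypothetical in its details (check the function docstring for the
actual signature)::
>>> import numpy as np
>>> import ndd
>>> n, p = 1000, 3  # n samples of p discrete variables
>>> rng = np.random.default_rng(0)
>>> data = rng.integers(0, 4, size=(n, p))  # different samples on different rows
>>> data.shape
(1000, 3)
>>> ndd.from_data(data)  # doctest: +SKIP  (hypothetical usage)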
**v1.4**
Added the *kullback_leibler_divergence* function.
**v1.1**
Added:
* *from_data*
* *mutual_information*
* *conditional_information*
* *interaction_information*
* *coinformation*
**v1.0**
Dropped support for Python < 3.4.
**v0.9**
Added the `jensen_shannon_divergence` function.
References
==========
Key references, in BibTeX format::
@article{wolpert1995estimating,
title={Estimating functions of probability distributions from a finite set of samples},
author={Wolpert, David H and Wolf, David R},
journal={Physical Review E},
volume={52},
number={6},
pages={6841},
year={1995},
publisher={APS}
}
@inproceedings{nemenman2002entropy,
title={Entropy and inference, revisited},
author={Nemenman, Ilya and Shafee, Fariel and Bialek, William},
booktitle={Advances in neural information processing systems},
pages={471--478},
year={2002}
}
@article{paninski2003estimation,
title={Estimation of entropy and mutual information},
author={Paninski, Liam},
journal={Neural computation},
volume={15},
number={6},
pages={1191--1253},
year={2003},
publisher={MIT Press}
}
@article{nemenman2004entropy,
title={Entropy and information in neural spike trains: Progress on the sampling problem},
author={Nemenman, Ilya and Bialek, William and van Steveninck, Rob de Ruyter},
journal={Physical Review E},
volume={69},
number={5},
pages={056111},
year={2004},
publisher={APS}
}
@article{nemenman2011coincidences,
title={Coincidences and estimation of entropies of random variables with large cardinalities},
author={Nemenman, Ilya},
journal={Entropy},
volume={13},
number={12},
pages={2013--2023},
year={2011},
publisher={Molecular Diversity Preservation International}
}
@article{archer2013bayesian,
title={Bayesian and quasi-Bayesian estimators for mutual information from discrete data},
author={Archer, Evan and Park, Il Memming and Pillow, Jonathan W},
journal={Entropy},
volume={15},
number={5},
pages={1738--1755},
year={2013},
publisher={Multidisciplinary Digital Publishing Institute}
}
@article{archer2014bayesian,
title={Bayesian entropy estimation for countable discrete distributions},
author={Archer, Evan and Park, Il Memming and Pillow, Jonathan W},
journal={The Journal of Machine Learning Research},
volume={15},
number={1},
pages={2833--2868},
year={2014},
publisher={JMLR.org}
}
and some interesting links:
- Sebastian Nowozin on Bayesian estimators
- Il Memming Park on discrete entropy estimators
Contributing
============
**ndd** is an open-source project: please help out by `reporting bugs <https://github.com/simomarsili/ndd/issues>`_ or by forking the repository and opening pull requests.
License
=======
Copyright (c) 2016-2019, Simone Marsili.
All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Owner
- Name: Simone Marsili
- Login: simomarsili
- Kind: user
- Repositories: 4
- Profile: https://github.com/simomarsili
GitHub Events
Total
- Issues event: 1
- Watch event: 7
- Issue comment event: 1
Last Year
- Issues event: 1
- Watch event: 7
- Issue comment event: 1
Committers
Last synced: over 2 years ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| simomarsili | s****i@g****m | 740 |
| simomarsili | s****i@g****m | 406 |
| Ciro Cattuto | c****o@g****m | 2 |
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 6
- Total pull requests: 2
- Average time to close issues: 5 days
- Average time to close pull requests: about 24 hours
- Total issue authors: 6
- Total pull request authors: 1
- Average comments per issue: 2.5
- Average comments per pull request: 0.0
- Merged pull requests: 1
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 1
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 1
- Pull request authors: 0
- Average comments per issue: 0.0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- epiasini (1)
- sehoffmann (1)
- sjfleming (1)
- firmai (1)
- ccattuto (1)
- YutaoChen0512 (1)
Pull Request Authors
- ccattuto (2)
Packages
- Total packages: 2
- Total downloads: pypi, 147 last month
- Total dependent packages: 0 (may contain duplicates)
- Total dependent repositories: 1 (may contain duplicates)
- Total versions: 43
- Total maintainers: 1
pypi.org: ndd
Bayesian entropy estimation from discrete data
- Homepage: https://github.com/simomarsili/ndd
- Documentation: https://ndd.readthedocs.io/
- License: BSD 3-Clause
- Latest release: 1.10.6 (published about 5 years ago)
conda-forge.org: ndd
ndd is a Python package for Bayesian entropy estimation from discrete data. ndd provides the ndd.entropy function, based on an efficient implementation of the Nemenman-Shafee-Bialek (NSB) algorithm. Remarkably, the NSB algorithm allows entropy estimation when the number of samples is much smaller than the number of classes with non-zero probability.
- Homepage: https://github.com/simomarsili/ndd
- License: BSD-3-Clause
- Latest release: 1.10.6 (published about 5 years ago)
Dependencies
- numpy >=1.13 (development)
- pytest * (development)
- numpy >=1.17