pycalib

Python library for classifier calibration

https://github.com/classifier-calibration/pycalib

Science Score: 64.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: zenodo.org
  • Committers with academic emails
    1 of 2 committers (50.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (16.1%) to scientific vocabulary

Keywords

calibration classifier classifier-training machine-learning optimal-decision-making probabilistic-models python
Last synced: 6 months ago

Repository

Python library for classifier calibration

Basic Info
Statistics
  • Stars: 17
  • Watchers: 1
  • Forks: 2
  • Open Issues: 15
  • Releases: 1
Topics
calibration classifier classifier-training machine-learning optimal-decision-making probabilistic-models python
Created over 5 years ago · Last pushed almost 2 years ago
Metadata Files
Readme License Citation

README.md

Badges: CI · Documentation · License BSD-3 · Python 3.8 · PyPI · codecov · DOI

PyCalib

Python library for classifier calibration

User installation

The PyCalib package can be installed from PyPI with the command

pip install pycalib
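After installation, PyCalib can be used to calibrate the probability outputs of a scikit-learn classifier. The snippet below is a minimal sketch of that workflow; the names CalibratedModel and IsotonicCalibration in pycalib.models are assumptions about the package's API, so check the documentation linked in the next section for the actual interface.

# Minimal sketch: calibrating a scikit-learn classifier with PyCalib.
# NOTE: the pycalib class names below are assumed, not verified against the API.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

from pycalib.models import CalibratedModel, IsotonicCalibration  # assumed API

# Toy binary classification problem
X, y = make_classification(n_samples=1000, n_classes=2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Wrap an uncalibrated base classifier with an isotonic calibrator
clf = CalibratedModel(base_estimator=GaussianNB(),
                      calibrator=IsotonicCalibration())
clf.fit(X_train, y_train)

# Calibrated class probabilities for the test set
print(clf.predict_proba(X_test)[:5])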

Documentation

The documentation can be found at https://classifier-calibration.github.io/PyCalib/

Development

There is a Makefile to automate some of the common tasks during development. After downloading the repository, create the virtual environment with the command

make venv

This will create a venv folder in your current directory. The environment then needs to be activated outside of the Makefile with

source venv/bin/activate

After the environment is activated, all dependencies can be installed with

make requirements-dev

Unittest

Unit tests are specified as doctest examples in simple functions (see example), with more complex tests in their own Python files starting with test_ (see example).
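As an illustration of the first kind, a doctest lives in a function's docstring and is executed as part of the test run. The function below is hypothetical, shown only to sketch the convention.

def to_probability(score):
    """Clip a raw score into the [0, 1] probability range.

    Hypothetical helper, included only to illustrate how doctest
    examples are embedded in a docstring.

    >>> to_probability(1.3)
    1.0
    >>> to_probability(-0.2)
    0.0
    """
    return min(max(score, 0.0), 1.0)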

Run the unit tests with the command

make test

The test run shows the unittest results, including code coverage. Ideally we want to increase coverage so that most of the library is covered.

Continuous Integration

Every time a commit is pushed to the master branch, the unit tests are run following the workflow .github/workflows/ci.yml. The CI badge in the README file shows whether the tests passed.

Analyse code

We try to follow the same code standards as NumPy and scikit-learn. It is possible to check for PEP 8 and other code conventions with

make code-analysis

Documentation

The documentation can be found at https://www.classifier-calibration.com/PyCalib/, and it is automatically updated after every push to the master branch.

All documentation is written using the Sphinx documentation generator, in reStructuredText (*.rst) files in the docs/source folder. We try to follow the conventions from NumPy and scikit-learn.
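As a reference for that convention, function docstrings follow the NumPy style, roughly as sketched below; the function and its parameters are illustrative only, not part of the library.

def expected_calibration_error(y_true, y_prob, n_bins=10):
    """Illustrative NumPy-style docstring (hypothetical function).

    Parameters
    ----------
    y_true : array-like of shape (n_samples,)
        True class labels.
    y_prob : array-like of shape (n_samples,)
        Predicted probabilities for the positive class.
    n_bins : int, default=10
        Number of bins used to group the predictions.

    Returns
    -------
    float
        The expected calibration error.
    """
    ...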

The examples with images in the docs/source/examples folder are generated automatically with Sphinx-gallery from the Python scripts in the examples/ folder whose names start with xmpl_{example_name}.py, as sketched below.
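A gallery script generally starts with a module-level docstring whose reStructuredText title becomes the page heading, followed by plain Python code whose output (including figures) is rendered on the page. The file below is a minimal, purely illustrative sketch; its name (xmpl_quickstart.py) and content are not taken from the repository.

"""
Quickstart: a toy reliability curve
===================================

Illustrative Sphinx-gallery example; the title above becomes the page heading.
"""
import matplotlib.pyplot as plt
import numpy as np

# Compare a toy set of miscalibrated scores against the diagonal
scores = np.linspace(0, 1, 20)
plt.plot(scores, scores ** 2, label="miscalibrated scores")
plt.plot(scores, scores, "--", label="perfect calibration")
plt.xlabel("predicted probability")
plt.ylabel("observed frequency")
plt.legend()
plt.show()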

The documentation can be built with the command

make doc

(Keep in mind that the documentation has its own Makefile inside folder docs).

After building the documentation, a new folder should appear in docs/build/ with an index.html that can be opened locally for further exploration.

The documentation is built and deployed every time a new commit is pushed to the master branch, via the workflow .github/workflows/documentation.yml.

After building, the docs/build/html folder is pushed to the branch gh-pages.

Check Readme

It is possible to check that the README file passes some tests for PyPI by running

make check-readme

Upload to PyPi

After checking that the code passes all unit tests and upgrading the version in the file pycalib/__init__.py, the code can be published to PyPI with the following command:

make pypi

It may require a username and password if these are not set in a .pypirc file in your home directory:

[pypi]
username = __token__
password = pypi-yourtoken

Contributors

This code has been adapted by Miquel from several previous codebases. The following is a list of people who have been involved in some parts of the code.

  • Miquel Perello Nieto
  • Hao Song
  • Telmo Silva Filho
  • Markus Kängsepp

Owner

  • Name: classifier-calibration
  • Login: classifier-calibration
  • Kind: organization

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
- family-names: "Perello-Nieto"
  given-names: "Miquel"
  orcid: "https://orcid.org/0000-0001-8925-424X"
- family-names: "Song"
  given-names: "Hao"
- family-names: "Silva-Filho"
  given-names: "Telmo"
- family-names: "Kängsepp"
  given-names: "Markus"
title: "PyCalib a library for classifier calibration"
version: 0.1.0.dev0
doi: 10.5281/zenodo.5518877
date-released: 2021-08-20
url: "https://github.com/perellonieto/PyCalib"

GitHub Events

Total
  • Watch event: 3
Last Year
  • Watch event: 3

Committers

Last synced: about 2 years ago

All Time
  • Total Commits: 140
  • Total Committers: 2
  • Avg Commits per committer: 70.0
  • Development Distribution Score (DDS): 0.021
Past Year
  • Commits: 23
  • Committers: 2
  • Avg Commits per committer: 11.5
  • Development Distribution Score (DDS): 0.13
Top Committers
  • Miquel Perelló Nieto (p****o@g****m): 137 commits
  • PGijsbers (p****s@t****l): 3 commits
Committer Domains (Top 20 + Academic)
  • tue.nl: 1

Issues and Pull Requests

Last synced: about 2 years ago

All Time
  • Total issues: 16
  • Total pull requests: 2
  • Average time to close issues: 1 day
  • Average time to close pull requests: about 1 month
  • Total issue authors: 2
  • Total pull request authors: 2
  • Average comments per issue: 0.06
  • Average comments per pull request: 1.0
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 1
Past Year
  • Issues: 1
  • Pull requests: 1
  • Average time to close issues: 1 day
  • Average time to close pull requests: 2 months
  • Issue authors: 1
  • Pull request authors: 1
  • Average comments per issue: 1.0
  • Average comments per pull request: 1.0
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • perellonieto (15)
  • Rahkovsky (1)
Pull Request Authors
  • dependabot[bot] (1)
  • PGijsbers (1)
Top Labels
Issue Labels
enhancement (11) documentation (7) question (5)
Pull Request Labels
dependencies (1)