PerMetrics

PerMetrics: A Framework of Performance Metrics for Machine Learning Models - Published in JOSS (2024)

https://github.com/thieu1995/permetrics

Science Score: 100.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 8 DOI reference(s) in README and JOSS metadata
  • Academic publication links
    Links to: joss.theoj.org, zenodo.org
  • Committers with academic emails
    1 of 5 committers (20.0%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
    Published in Journal of Open Source Software

Keywords

classification classification-report clustering-evaluation coefficient-of-determination deviation-of-runoff-volume evaluation-metrics kling-gupta-efficiency mae nash-sutcliffe-efficiency performance-metrics regression regression-methods relative-error rmse symmetric-mean-absolute-percentage willmott-index

Keywords from Contributors

mesh

Scientific Fields

Mathematics, Computer Science - 84% confidence
Last synced: 4 months ago

Repository

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

Basic Info
Statistics
  • Stars: 80
  • Watchers: 3
  • Forks: 19
  • Open Issues: 6
  • Releases: 24
Topics
classification classification-report clustering-evaluation coefficient-of-determination deviation-of-runoff-volume evaluation-metrics kling-gupta-efficiency mae nash-sutcliffe-efficiency performance-metrics regression regression-methods relative-error rmse symmetric-mean-absolute-percentage willmott-index
Created over 5 years ago · Last pushed over 1 year ago
Metadata Files
Readme Changelog Contributing License Code of conduct Citation

README.md

PERMETRICS



PerMetrics is a Python library for computing performance metrics of machine learning models. It aims to cover metrics for regression, classification, and clustering problems, helping users in every field access metrics as quickly as possible. The library currently provides 111 metrics: 47 for regression, 20 for classification, and 44 for clustering.

Citation Request

Please include these citations if you plan to use this library:

  • LaTeX:

```bibtex
@article{Thieu_PerMetrics_A_Framework_2024,
  author  = {Thieu, Nguyen Van},
  doi     = {10.21105/joss.06143},
  journal = {Journal of Open Source Software},
  month   = mar,
  number  = {95},
  pages   = {6143},
  title   = {{PerMetrics: A Framework of Performance Metrics for Machine Learning Models}},
  url     = {https://joss.theoj.org/papers/10.21105/joss.06143},
  volume  = {9},
  year    = {2024}
}
```

  • APA:

Thieu, N. V. (2024). PerMetrics: A Framework of Performance Metrics for Machine Learning Models. Journal of Open Source Software, 9(95), 6143. https://doi.org/10.21105/joss.06143

Installation

Install the current PyPI release:

```sh
$ pip install permetrics
```

After installation, you can import PerMetrics like any other Python module:

```sh
$ python
>>> import permetrics
>>> permetrics.__version__
```

Example

Below is the most convenient way to use this library. Compared to other libraries, it can compute many metrics in a single call; the example below returns the values of metrics such as the root mean squared error and the mean absolute error.

```python
from permetrics import RegressionMetric

y_true = [3, -0.5, 2, 7]
y_pred = [2.5, 0.0, 2, 8]

evaluator = RegressionMetric(y_true, y_pred)
results = evaluator.get_metrics_by_list_names(["RMSE", "MAE", "MAPE", "R2", "NSE", "KGE"])
print(results["RMSE"])
print(results["KGE"])
```
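
The call returns a plain Python dict keyed by the requested metric names, so individual values can be read out as shown with results["RMSE"] and results["KGE"].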

If your y_true and y_pred data have multiple columns and you want metrics returned per output, something that other libraries cannot do, you can do it in PerMetrics as follows:

```python
import numpy as np
from permetrics import RegressionMetric

y_true = np.array([[0.5, 1], [-1, 1], [7, -6]])
y_pred = np.array([[0, 2], [-1, 2], [8, -5]])

evaluator = RegressionMetric(y_true, y_pred)

## The 1st way
results = evaluator.get_metrics_by_dict({
    "RMSE": {"multioutput": "raw_values"},
    "MAE": {"multioutput": "raw_values"},
    "MAPE": {"multioutput": "raw_values"},
})

## The 2nd way
results = evaluator.get_metrics_by_list_names(
    list_metric_names=["RMSE", "MAE", "MAPE", "R2", "NSE", "KGE"],
    list_paras=[{"multioutput": "raw_values"}, ] * 6
)

## The 3rd way
result01 = evaluator.RMSE(multioutput="raw_values")
result02 = evaluator.MAE(multioutput="raw_values")
```

More complicated cases can be found in the examples folder. You can also read the documentation for more detailed installation instructions, explanations, and examples.
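
The README only demonstrates regression. Below is a hedged sketch of classification usage, assuming ClassificationMetric exposes the same get_metrics_by_list_names interface; the metric codes "AS", "PS", "RS", and "F1S" (accuracy, precision, recall, F1) are assumptions, so check the documentation for the exact abbreviations supported by your version.

```python
# A minimal sketch (not from the README), assuming ClassificationMetric
# mirrors the RegressionMetric interface shown above. The metric codes
# "AS", "PS", "RS", "F1S" are assumptions; verify them against the docs.
from permetrics import ClassificationMetric

y_true = [0, 1, 0, 0, 1, 0]
y_pred = [0, 1, 0, 0, 0, 1]

evaluator = ClassificationMetric(y_true, y_pred)
results = evaluator.get_metrics_by_list_names(["AS", "PS", "RS", "F1S"])
print(results)
```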

Contributing

There are many ways you can contribute to PerMetrics's development, and you are welcome to join in! For example, you can report problems or make feature requests on the issues page. To facilitate contributions, please check the guidelines in the CONTRIBUTING.md file.

Official channels

Note

  • Currently, there is widespread confusion among frameworks about the notation of R, R2, and R^2.
  • Please read the file R-R2-Rsquared.docx to understand the differences between them and why such confusion exists.
  • My recommendation is to denote the Coefficient of Determination as COD or R2, while the squared Pearson's Correlation Coefficient should be denoted as R^2 or RSQ (as in Excel); the sketch below illustrates the difference.
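
For illustration, here is a small NumPy sketch (not part of the library) showing that the Coefficient of Determination and the squared Pearson correlation are distinct quantities and generally give different values:

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# Coefficient of Determination (COD / R2): 1 - SS_res / SS_tot
ss_res = np.sum((y_true - y_pred) ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
cod = 1.0 - ss_res / ss_tot

# Squared Pearson correlation coefficient (R^2 / RSQ)
r = np.corrcoef(y_true, y_pred)[0, 1]
r_squared = r ** 2

print(cod, r_squared)  # ~0.949 vs ~0.970 here: the two values differ
```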


Developed by: Thieu @ 2023

Owner

  • Name: Nguyen Van Thieu
  • Login: thieu1995
  • Kind: user
  • Location: Earth
  • Company: AIIR Group

Knowledge is power; sharing it is the premise of progress in life. It may seem like a burden to some, but it is the only way to achieve immortality.

JOSS Publication

PerMetrics: A Framework of Performance Metrics for Machine Learning Models
Published
March 09, 2024
Volume 9, Issue 95, Page 6143
Authors
Nguyen Van Thieu ORCID
Faculty of Computer Science, Phenikaa University, Yen Nghia, Ha Dong, Hanoi, 12116, Vietnam.
Editor
Gabriela Alessio Robles ORCID
Tags
model assessment tools performance metrics classification validation metrics regression evaluation criteria clustering criterion indices machine learning metrics

Citation (CITATION.cff)

cff-version: "1.2.0"
authors:
- family-names: Thieu
  given-names: Nguyen Van
  orcid: "https://orcid.org/0000-0001-9994-8747"
doi: 10.5281/zenodo.3951205
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
  - family-names: Thieu
    given-names: Nguyen Van
    orcid: "https://orcid.org/0000-0001-9994-8747"
  date-published: 2024-03-09
  doi: 10.21105/joss.06143
  issn: 2475-9066
  issue: 95
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 6143
  title: "PerMetrics: A Framework of Performance Metrics for Machine
    Learning Models"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.06143"
  volume: 9
title: "PerMetrics: A Framework of Performance Metrics for Machine
  Learning Models"

GitHub Events

Total
  • Issues event: 2
  • Watch event: 9
  • Issue comment event: 1
  • Fork event: 2
Last Year
  • Issues event: 2
  • Watch event: 9
  • Issue comment event: 1
  • Fork event: 2

Committers

Last synced: 5 months ago

All Time
  • Total Commits: 463
  • Total Committers: 5
  • Avg Commits per committer: 92.6
  • Development Distribution Score (DDS): 0.063
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Thieu Nguyen n****2@g****m 434
Thieu t****n@p****n 15
Matthias Quinn m****n@y****m 12
dependabot[bot] 4****] 1
Gabby a****a@g****m 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 4 months ago

All Time
  • Total issues: 7
  • Total pull requests: 4
  • Average time to close issues: 5 months
  • Average time to close pull requests: 14 days
  • Total issue authors: 6
  • Total pull request authors: 2
  • Average comments per issue: 1.0
  • Average comments per pull request: 0.25
  • Merged pull requests: 2
  • Bot issues: 0
  • Bot pull requests: 3
Past Year
  • Issues: 2
  • Pull requests: 1
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 2
  • Pull request authors: 1
  • Average comments per issue: 0.0
  • Average comments per pull request: 0.0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 1
Top Authors
Issue Authors
  • N3uralN3twork (2)
  • 321HG (1)
  • kunzrp (1)
  • chris-fj (1)
  • NamitaBajpai (1)
  • wasf84 (1)
Pull Request Authors
  • dependabot[bot] (4)
  • galessiorob (2)
Top Labels
Issue Labels
bug (3) enhancement (1)
Pull Request Labels
dependencies (4) github_actions (2)

Packages

  • Total packages: 14
  • Total downloads:
    • pypi 28,159 last-month
  • Total dependent packages: 8
    (may contain duplicates)
  • Total dependent repositories: 1
    (may contain duplicates)
  • Total versions: 53
  • Total maintainers: 2
pypi.org: permetrics

PerMetrics: A Framework of Performance Metrics for Machine Learning Models

  • Versions: 20
  • Dependent Packages: 8
  • Dependent Repositories: 1
  • Downloads: 28,159 Last month
Rankings
Dependent packages count: 1.4%
Downloads: 6.5%
Average: 9.9%
Stargazers count: 10.1%
Forks count: 10.2%
Dependent repos count: 21.6%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.18: py3-permetrics

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 12.3%
Stargazers count: 24.3%
Forks count: 25.0%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.18: py3-permetrics-pyc

Precompiled Python bytecode for py3-permetrics

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 12.3%
Stargazers count: 24.3%
Forks count: 25.0%
Maintainers (1)
Last synced: 4 months ago
alpine-edge: py3-permetrics

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

  • Versions: 12
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 14.6%
Average: 16.9%
Forks count: 26.1%
Stargazers count: 26.8%
Maintainers (1)
Last synced: 4 months ago
alpine-edge: py3-permetrics-pyc

Precompiled Python bytecode for py3-permetrics

  • Versions: 10
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 13.4%
Average: 17.2%
Forks count: 27.7%
Stargazers count: 27.9%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.17: py3-permetrics

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Average: 18.1%
Forks count: 22.5%
Stargazers count: 22.8%
Dependent packages count: 27.3%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.21: py3-permetrics-pyc

Precompiled Python bytecode for py3-permetrics

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.21: py3-permetrics

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.22: py3-permetrics-pyc

Precompiled Python bytecode for py3-permetrics

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.20: py3-permetrics

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.19: py3-permetrics

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.22: py3-permetrics

Artificial intelligence (AI, ML, DL) performance metrics implemented in Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.19: py3-permetrics-pyc

Precompiled Python bytecode for py3-permetrics

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago
alpine-v3.20: py3-permetrics-pyc

Precompiled Python bytecode for py3-permetrics

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 4 months ago

Dependencies

docs/requirements.txt pypi
  • numpy ==1.15.1
  • readthedocs-sphinx-search ==0.1.1
  • sphinx ==4.4.0
  • sphinx_rtd_theme ==1.0.0
setup.py pypi
  • numpy >=1.15.1
.github/workflows/draft-pdf.yml actions
  • actions/checkout v3 composite
  • actions/upload-artifact v1 composite
  • openjournals/openjournals-draft-action master composite
  • stefanzweifel/git-auto-commit-action v4 composite
.github/workflows/publish-package.yaml actions
  • actions/cache v1 composite
  • actions/checkout v1 composite
  • actions/download-artifact v2 composite
  • actions/setup-python v1 composite
  • actions/upload-artifact master composite
  • pypa/gh-action-pypi-publish master composite
requirements.txt pypi
  • flake8 >=4.0.1
  • numpy >=1.17.1
  • pytest ==7.1.2
  • pytest-cov ==4.0.0
  • scipy >=1.7.1