rexmex

A general purpose recommender metrics library for fair evaluation.

https://github.com/astrazeneca/rexmex

Science Score: 77.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: sciencedirect.com, wiley.com
  • Committers with academic emails
    1 of 9 committers (11.1%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.5%) to scientific vocabulary

Keywords

coverage deep-learning evaluation machine-learning metric metrics mrr personalization precision rank ranking recall recommender recommender-system recsys rsquared
Last synced: 6 months ago

Repository

A general purpose recommender metrics library for fair evaluation.

Basic Info
Statistics
  • Stars: 279
  • Watchers: 7
  • Forks: 26
  • Open Issues: 4
  • Releases: 17
Topics
coverage deep-learning evaluation machine-learning metric metrics mrr personalization precision rank ranking recall recommender recommender-system recsys rsquared
Created over 4 years ago · Last pushed over 2 years ago
Metadata Files
Readme Contributing License Citation

README.md



reXmeX is a recommender system evaluation metric library.

Please look at the Documentation and External Resources.

reXmeX consists of utilities for recommender system evaluation. First, it provides a comprehensive collection of metrics for the evaluation of recommender systems. Second, it includes a variety of methods for reporting and plotting the performance results. The implemented metrics cover a range of well-known measures as well as newly proposed metrics from data mining conferences (ICDM, CIKM, KDD) and prominent journals.

Citing

If you find RexMex useful in your research, please consider adding the following citation:

```bibtex
@inproceedings{rexmex,
    title = {{rexmex: A General Purpose Recommender Metrics Library for Fair Evaluation}},
    author = {Benedek Rozemberczki and Sebastian Nilsson and Piotr Grabowski and Charles Tapley Hoyt and Gavin Edwards},
    year = {2021},
}
```

An introductory example

The following example loads a synthetic dataset which has the mandatory y_true and y_score keys. The dataset has binary labels and predicted probability scores. We read the dataset and define a default ClassificationMetricSet instance for the evaluation of the predictions. Using this metric set, we create a score card and get the predictive performance metrics.

```python
from rexmex import ClassificationMetricSet, DatasetReader, ScoreCard

reader = DatasetReader()
scores = reader.read_dataset()

metric_set = ClassificationMetricSet()

score_card = ScoreCard(metric_set)

report = score_card.get_performance_metrics(scores["y_true"], scores["y_score"])
```


An advanced example

The following more advanced example loads the same synthetic dataset, which has the source_id, target_id, source_group and target_group keys besides the mandatory y_true and y_score. Using the source_group key, we group the predictions and return a performance metric report.

```python
from rexmex import ClassificationMetricSet, DatasetReader, ScoreCard

reader = DatasetReader()
scores = reader.read_dataset()

metric_set = ClassificationMetricSet()

score_card = ScoreCard(metric_set)

report = score_card.generate_report(scores, grouping=["source_group"])
```
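Conceptually, a grouped report computes each metric separately per value of the grouping key. The sketch below approximates this with plain pandas (the column names and values are hypothetical, mirroring the example above; it is not the library's implementation):

```python
import pandas as pd

# Synthetic stand-in for the dataset in the example above (hypothetical values).
scores = pd.DataFrame({
    "source_group": ["A", "A", "B", "B"],
    "y_true":  [1, 0, 1, 1],
    "y_score": [0.9, 0.2, 0.4, 0.8],
})

# Binarize the probabilities at 0.5 and compute accuracy per group,
# mimicking what a grouped score card report does conceptually.
scores["y_pred"] = (scores["y_score"] >= 0.5).astype(int)
report = scores.groupby("source_group")[["y_true", "y_pred"]].apply(
    lambda g: (g["y_true"] == g["y_pred"]).mean()
)
print(report)  # group A: 1.0, group B: 0.5
```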


Scorecard

A rexmex score card allows reporting recommender system performance metrics, plotting them, and saving the results. Our framework provides 7 rating, 38 classification, 18 ranking, and 2 coverage metrics.

Metric Sets

Metric sets allow users to calculate a range of evaluation metrics for a label - predicted label vector pair. We provide a general MetricSet class, and specialized metric sets with pre-set metrics in the following general categories:

  • Ranking
  • Rating
  • Classification
  • Coverage

Ranking Metric Set

Expand to see all ranking metrics in the metric set.

* **[Mean Reciprocal Rank (MRR)](https://en.wikipedia.org/wiki/Mean_reciprocal_rank)**
* **[Spearman's Rho](https://en.wikipedia.org/wiki/Spearman%27s_rank_correlation_coefficient)**
* **[Kendall Tau](https://en.wikipedia.org/wiki/Kendall_rank_correlation_coefficient)**
* **[HITS@k](https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval))**
* **[Novelty](https://www.sciencedirect.com/science/article/pii/S163107051930043X)**
* **[Average Recall @ k](https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval))**
* **[Mean Average Recall @ k](https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval))**
* **[Average Precision @ k](https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval))**
* **[Mean Average Precision @ k](https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval))**
* **[Personalisation](http://www.mavir.net/docs/tfm-vargas-sandoval.pdf)**
* **[Intra List Similarity](http://www.mavir.net/docs/tfm-vargas-sandoval.pdf)**
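As a refresher on the first metric above, mean reciprocal rank averages the reciprocal of the rank at which the first relevant item appears for each query. A dependency-free sketch of the textbook definition (not the library's implementation):

```python
def mean_reciprocal_rank(ranked_relevance):
    """ranked_relevance: one list of 0/1 relevance flags per query,
    ordered by predicted rank. The reciprocal rank of a query is
    1 / position of the first relevant item (0 if none is relevant)."""
    total = 0.0
    for relevance in ranked_relevance:
        reciprocal_rank = 0.0
        for position, is_relevant in enumerate(relevance, start=1):
            if is_relevant:
                reciprocal_rank = 1.0 / position
                break
        total += reciprocal_rank
    return total / len(ranked_relevance)

# Query 1: first hit at rank 1; query 2: first hit at rank 3.
print(mean_reciprocal_rank([[1, 0, 0], [0, 0, 1]]))  # (1 + 1/3) / 2 ≈ 0.667
```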

Rating Metric Set

These metrics assume that items are scored explicitly and ratings are predicted by a regression model.

Expand to see all rating metrics in the metric set.

* **[Symmetric Mean Absolute Percentage Error (SMAPE)](https://en.wikipedia.org/wiki/Symmetric_mean_absolute_percentage_error)**
* **[Pearson Correlation](https://en.wikipedia.org/wiki/Pearson_correlation_coefficient)**
* **[Coefficient of Determination](https://en.wikipedia.org/wiki/Coefficient_of_determination)**
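SMAPE has several variants; one common definition divides the absolute error by the mean of the absolute actual and predicted values. A minimal sketch of that definition (the library's exact variant may differ):

```python
def smape(y_true, y_score):
    """Symmetric mean absolute percentage error, in percent (0-200 scale).
    Each term is |t - p| divided by the mean of |t| and |p|."""
    terms = [
        abs(t - p) / ((abs(t) + abs(p)) / 2)
        for t, p in zip(y_true, y_score)
    ]
    return 100 * sum(terms) / len(terms)

print(smape([3.0, 5.0], [3.0, 4.0]))  # ≈ 11.11
```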

Classification Metric Set

These metrics assume that the items are scored with raw probabilities (these can be binarized).

Expand to see all classification metrics in the metric set.

* **[F-1 Score](https://en.wikipedia.org/wiki/F-score)**
* **[Average Precision](https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html)**
* **[Specificity (Selectivity or True Negative Rate)](https://en.wikipedia.org/wiki/Precision_and_recall)**
* **[Matthews Correlation](https://en.wikipedia.org/wiki/Precision_and_recall)**
* **[Accuracy](https://en.wikipedia.org/wiki/Precision_and_recall)**
* **[Balanced Accuracy](https://en.wikipedia.org/wiki/Precision_and_recall)**
* **[Fowlkes-Mallows Index](https://en.wikipedia.org/wiki/Precision_and_recall)**
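For illustration, the Matthews correlation coefficient from the list above can be computed directly from the confusion-matrix counts. A self-contained sketch of the standard formula (not the library's implementation):

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """Matthews correlation coefficient for binary labels, computed from
    the four confusion-matrix counts. Returns 0 when the denominator
    vanishes (e.g. all predictions in one class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denominator if denominator else 0.0

print(matthews_corrcoef([1, 1, 0, 0], [1, 0, 0, 0]))  # ≈ 0.577
```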

Coverage Metric Set

These metrics measure how well the recommender system covers the available items in the catalog and the possible users. In other words, they measure the diversity of the predictions.
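A common notion of item-space coverage is the fraction of catalog items that appear in at least one recommendation list. A minimal sketch of that idea (an illustrative definition, not necessarily the one the library uses):

```python
def catalog_coverage(recommended_lists, catalog):
    """Fraction of catalog items that appear in at least one of the
    per-user recommendation lists."""
    recommended = set().union(*(set(items) for items in recommended_lists))
    return len(recommended & set(catalog)) / len(catalog)

catalog = ["a", "b", "c", "d"]
recommendations = [["a", "b"], ["b", "c"]]
print(catalog_coverage(recommendations, catalog))  # 3 of 4 items covered -> 0.75
```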


Documentation and Reporting Issues

Head over to our documentation to find out more about installation and data handling, a full list of implemented methods, and datasets.

If you notice anything unexpected, please open an issue and let us know. If you are missing a specific method, feel free to open a feature request. We are motivated to constantly make RexMex even better.


Installation via the command line

RexMex can be installed with the following command after the repo is cloned.

```sh
$ pip install .
```

Use the `-e`/`--editable` flag when developing.

Installation via pip

RexMex can be installed with the following pip command.

```sh
$ pip install rexmex
```

As we create new releases frequently, upgrading the package regularly is beneficial.

```sh
$ pip install rexmex --upgrade
```


Running tests

Tests can be run with tox as follows:

```sh
$ pip install tox
$ tox -e py
```


Citation

If you use RexMex in a scientific publication, we would appreciate citations. Please see GitHub's built-in citation tool.


License

Owner

  • Name: AstraZeneca
  • Login: AstraZeneca
  • Kind: organization
  • Location: Global

Data and AI: Unlocking new science insights

Citation (CITATION.cff)

cff-version: 0.1.0
message: "If you use this software, please cite it as below."
authors:
- family-names: "Rozemberczki"
  given-names: "Benedek"
  orcid: "https://orcid.org/0000-0000-0000-0000"
- family-names: "Nilsson"
  given-names: "Sebastian"
  orcid: "https://orcid.org/0000-0002-7449-466X"
- family-names: "Hoyt"
  given-names: "Charles Tapley"
  orcid: "https://orcid.org/0000-0003-4423-4370"
- family-names: "Edwards"
  given-names: "Gavin"
  orcid: "https://orcid.org/0000-0003-0630-5529"
title: "RexMex"
version: 0.1.0
# doi: 10.5281/zenodo.1234
# date-released: 2021-12-01
url: "https://github.com/AstraZeneca/rexmex"

GitHub Events

Total
  • Watch event: 6
  • Issue comment event: 1
  • Fork event: 1
Last Year
  • Watch event: 6
  • Issue comment event: 1
  • Fork event: 1

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 482
  • Total Committers: 9
  • Avg Commits per committer: 53.556
  • Development Distribution Score (DDS): 0.29
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
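The Development Distribution Score is commonly defined as one minus the share of commits made by the most active committer; assuming that definition, the all-time figure above checks out against the committer table:

```python
# DDS = 1 - (commits by the top committer / total commits).
# Numbers taken from the all-time committer statistics above.
top_committer_commits = 342   # Rozemberczki
total_commits = 482

dds = 1 - top_committer_commits / total_commits
print(round(dds, 2))  # 0.29, matching the reported all-time DDS
```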
Top Committers
Name Email Commits
Rozemberczki k****8@a****t 342
Charles Tapley Hoyt c****t@g****m 61
Piotr Grabowski p****i@a****m 42
Sebastian Nilsson s****n@a****m 27
Gavin Edwards 2****s 6
sbonner0 s****1@a****m 1
Michaël Ughetto m****o@g****m 1
Daniel Obraczka o****a@i****e 1
Ughetto, Michaël m****o@a****m 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 19
  • Total pull requests: 43
  • Average time to close issues: 26 days
  • Average time to close pull requests: about 1 month
  • Total issue authors: 9
  • Total pull request authors: 9
  • Average comments per issue: 0.84
  • Average comments per pull request: 1.3
  • Merged pull requests: 38
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 1
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 1
  • Average comments per issue: 0
  • Average comments per pull request: 1.0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • benedekrozemberczki (7)
  • cthoyt (5)
  • dobraczka (1)
  • AyushExel (1)
  • DimitrisAlivas (1)
  • michellekarunaratne (1)
  • Montana (1)
  • GavEdwards (1)
  • sebastiandro (1)
Pull Request Authors
  • cthoyt (15)
  • benedekrozemberczki (10)
  • kajocina (8)
  • GavEdwards (4)
  • sebastiandro (2)
  • sbonner0 (1)
  • dobraczka (1)
  • ivanmilevtues (1)
  • mughetto (1)
Top Labels
Issue Labels
bug (1) good first issue (1)
Pull Request Labels
enhancement (1)

Packages

  • Total packages: 14
  • Total downloads:
    • pypi 568 last-month
  • Total docker downloads: 10
  • Total dependent packages: 1
    (may contain duplicates)
  • Total dependent repositories: 9
    (may contain duplicates)
  • Total versions: 39
  • Total maintainers: 6
pypi.org: rexmex

A General Purpose Recommender Metrics Library for Fair Evaluation.

  • Versions: 19
  • Dependent Packages: 1
  • Dependent Repositories: 9
  • Downloads: 568 Last month
  • Docker Downloads: 10
Rankings
Stargazers count: 3.9%
Downloads: 4.0%
Docker downloads count: 4.0%
Dependent packages count: 4.7%
Average: 4.9%
Dependent repos count: 4.9%
Forks count: 7.7%
Last synced: 6 months ago
alpine-v3.18: py3-rexmex

A general purpose recommender metrics library for fair evaluation

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 9.0%
Stargazers count: 14.8%
Forks count: 21.0%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.18: py3-rexmex-pyc

Precompiled Python bytecode for py3-rexmex

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 9.0%
Stargazers count: 14.8%
Forks count: 21.0%
Maintainers (1)
Last synced: 6 months ago
alpine-edge: py3-rexmex

A general purpose recommender metrics library for fair evaluation

  • Versions: 5
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Average: 13.1%
Dependent packages count: 14.6%
Stargazers count: 16.0%
Forks count: 21.6%
Maintainers (1)
Last synced: 6 months ago
alpine-edge: py3-rexmex-pyc

Precompiled Python bytecode for py3-rexmex

  • Versions: 4
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Average: 13.3%
Dependent packages count: 14.1%
Stargazers count: 16.7%
Forks count: 22.5%
Maintainers (1)
Last synced: 6 months ago
conda-forge.org: rexmex

**reXmeX** is a recommender system evaluation metric library. Please look at the **[Documentation](https://rexmex.readthedocs.io/en/latest/)** and **[External Resources](https://rexmex.readthedocs.io/en/latest/notes/resources.html)**. **reXmeX** consists of utilities for recommender system evaluation. First, it provides a comprehensive collection of metrics for the evaluation of recommender systems. Second, it includes a variety of methods for reporting and plotting the performance results. Implemented metrics cover a range of well-known metrics and newly proposed metrics from data mining ([ICDM](http://icdm2019.bigke.org/), [CIKM](http://www.cikm2019.net/), [KDD](https://www.kdd.org/kdd2020/)) conferences and prominent journals. PyPI: [https://pypi.org/project/rexmex/](https://pypi.org/project/rexmex/)

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Stargazers count: 21.7%
Forks count: 32.6%
Dependent repos count: 34.0%
Average: 34.9%
Dependent packages count: 51.2%
Last synced: 6 months ago
alpine-v3.21: py3-rexmex-pyc

Precompiled Python bytecode for py3-rexmex

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.19: py3-rexmex-pyc

Precompiled Python bytecode for py3-rexmex

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.22: py3-rexmex

A general purpose recommender metrics library for fair evaluation

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.21: py3-rexmex

A general purpose recommender metrics library for fair evaluation

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.22: py3-rexmex-pyc

Precompiled Python bytecode for py3-rexmex

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.20: py3-rexmex

A general purpose recommender metrics library for fair evaluation

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.19: py3-rexmex

A general purpose recommender metrics library for fair evaluation

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.20: py3-rexmex-pyc

Precompiled Python bytecode for py3-rexmex

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago

Dependencies

docs/requirements_1.txt pypi
  • jupyter-sphinx *
  • nbsphinx *
  • nbsphinx_link *
  • numpy *
  • pandas *
  • scipy *
  • six *
  • sklearn *
  • sphinx ==4.0.2
  • sphinx_rtd_theme ==0.5.2
  • tqdm *
setup.py pypi
  • numpy *
.github/workflows/main.yaml actions
  • actions/checkout v2 composite
  • codecov/codecov-action v1 composite
  • conda-incubator/setup-miniconda v2 composite
pyproject.toml pypi