trustpy-tools
TrustPy is a production-ready Python package purpose-built for MLOps pipelines—enabling automated, interpretable analysis of model trustworthiness and predictive reliability before deployment. Available via Conda-Forge and PyPI, with full CI/CD integration and seamless compatibility across modern ML stacks.
Science Score: 44.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (15.4%) to scientific vocabulary
Keywords
Repository
Basic Info
Statistics
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 15
Topics
Metadata Files
README.md
TrustPy - Trustworthiness Python
TrustPy is a lightweight, framework-agnostic Python library for assessing the reliability, calibration, and uncertainty of predictive models across the AI/ML lifecycle. Designed with MLOps, model validation, and governance in mind, it enables teams to quantify trust before production rollout—ensuring models behave as expected under real-world conditions.
🔧 Works out-of-the-box with any ML framework 📦 Released on Conda-Forge and PyPI 🔁 Maintained with full CI/CD support and test coverage
The implementation is flexible and works out of the box with any AI/ML library.
Installation
Option 1 (recommended): Install via Conda-Forge
The easiest way to install trustpy-tools is via Conda-Forge, which handles all dependencies automatically. Run the following command:
```bash
conda install -c conda-forge trustpy-tools
```
Option 2: Install via PyPI (pip)
If you prefer using pip (PyPI), you can install directly:
```bash
pip install trustpy-tools
```
Alternative: Manual Installation
If you prefer to install the package manually or are not using Conda, you can install the required dependencies and clone the repository.
Install Dependencies - NumPy: For numerical calculations. - Matplotlib: For plotting the trust spectrum. - Scikit-learn: For Kernel Density Estimation (KDE) in trust density estimation.
Install them via conda:
```bash
conda install numpy matplotlib scikit-learn
```
or
Install them via pip:
```bash
pip install numpy matplotlib scikit-learn
```
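The dependency list above notes that scikit-learn is used for Kernel Density Estimation (KDE) in trust density estimation. As an illustration only, and not TrustPy's actual code, a Gaussian KDE over per-sample trust values can be sketched in plain NumPy:

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth=0.1):
    """Evaluate a Gaussian kernel density estimate of `samples` on `grid`.

    This mirrors what scikit-learn's KernelDensity(kernel="gaussian")
    computes, and is shown only to illustrate the trust-density idea.
    """
    samples = np.asarray(samples, dtype=float)
    grid = np.asarray(grid, dtype=float)
    # Scaled distances between every grid point and every sample.
    diff = (grid[:, None] - samples[None, :]) / bandwidth
    # Average the Gaussian kernel contributions of all samples per grid point.
    kernel = np.exp(-0.5 * diff**2) / (bandwidth * np.sqrt(2.0 * np.pi))
    return kernel.mean(axis=1)

# Per-sample "trust" values, e.g. model confidences on correct predictions.
trust_values = np.array([0.8, 1.0, 0.7, 0.7, 0.5])
grid = np.linspace(0.0, 1.0, 101)
density = gaussian_kde(trust_values, grid)
```

Plotting `density` against `grid` with Matplotlib yields a trust-spectrum-style curve.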
Clone the Repository
```bash
git clone https://github.com/yaniker/TrustPy.git
cd TrustPy
```
You can verify installation by running:
```bash
python -c "from trustpy import NTS, CNTS; print('TrustPy is ready.')"
```
Example Usage
```python
from trustpy import NTS, CNTS  # This is how the package is imported.
import numpy as np

# Example oracle and predictions
oracle = np.array([0, 0, 1, 2, 2, 0, 1])  # True labels
predictions = np.array([
    [0.8, 0.1, 0.1],  # Correct, high confidence
    [1.0, 0.0, 0.0],  # Correct, high confidence
    [0.2, 0.7, 0.1],  # Correct, high confidence
    [0.1, 0.2, 0.7],  # Correct, high confidence
    [0.1, 0.4, 0.5],  # Correct, lower confidence
    [0.1, 0.8, 0.1],  # Incorrect, high confidence
    [0.3, 0.3, 0.4],  # Incorrect, low confidence
])  # Replace this with your model's predictions (predictions = model.predict())

# For NetTrustScore:
# Initialize with default parameters
nts = NTS(oracle, predictions, show_summary=True, export_summary=True, trust_spectrum=True)
nts_scores_dict = nts.compute()  # Computes trustworthiness for each class and overall.

# For Conditional NetTrustScore:
# Initialize with default parameters
cnts = CNTS(oracle, predictions, show_summary=True, export_summary=True, trust_spectrum=True)
cnts_scores_dict = cnts.compute()  # Computes trustworthiness for each class and overall.
```

- `show_summary=True` prints the results table.
- `export_summary=True` saves the results.
- `trust_spectrum=True` generates plots.

By default, results are saved to:
- `trustpy/nts/` (for NTS)
- `trustpy/cnts/` (for CNTS)

You can override this using `output_dir=your_path`.
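For intuition about what the score measures, the NetTrustScore idea from the Wong et al. (2020) paper cited below can be approximated in a few lines of NumPy. This is an illustrative sketch of the question-answer trust formulation, not TrustPy's actual implementation, and the `alpha`/`beta` reward-penalty exponents are assumptions:

```python
import numpy as np

# Same toy data as the example above.
oracle = np.array([0, 0, 1, 2, 2, 0, 1])
predictions = np.array([
    [0.8, 0.1, 0.1],
    [1.0, 0.0, 0.0],
    [0.2, 0.7, 0.1],
    [0.1, 0.2, 0.7],
    [0.1, 0.4, 0.5],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
])

def net_trust_score(oracle, predictions, alpha=1.0, beta=1.0):
    """Illustrative question-answer trust, averaged into a single score.

    Correct answers earn trust equal to the model's confidence (**alpha);
    incorrect answers earn trust equal to 1 - confidence (**beta), so
    confidently wrong predictions are penalized hardest.
    """
    answers = predictions.argmax(axis=1)
    confidence = predictions[np.arange(len(oracle)), answers]
    correct = answers == oracle
    qa_trust = np.where(correct, confidence**alpha, (1.0 - confidence)**beta)
    return qa_trust.mean()

print(round(net_trust_score(oracle, predictions), 3))  # → 0.643
```

Note how the confidently wrong row `[0.1, 0.8, 0.1]` contributes only 0.2 trust, dragging the average down more than the hesitant wrong row `[0.3, 0.3, 0.4]`.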
Example Plot for Trust Spectrum (`trust_spectrum=True`)

Example Plot for Conditional Trust Spectrum (`trust_spectrum=True`)
The Python scripts used to generate these plots are included so users can modify them as needed.
Command Line Interface (CLI)
You can run TrustPy directly from the command line after installation. You can also optionally specify a custom output directory. Example:
```bash
python -m trustpy --oracle oracle.npy --pred preds.npy --mode cnts --trust_spectrum --output_dir ./my_results
```
This requires your true labels and predicted probabilities saved as `oracle.npy` and `preds.npy`. You can generate test samples via:
```python
import numpy as np

oracle = np.array([0, 2, 1, 0, 1])
np.save("oracle.npy", oracle)

predictions = np.array([
    [0.8, 0.1, 0.1],  # correct
    [0.0, 0.0, 1.0],  # correct
    [0.2, 0.7, 0.1],  # correct
    [0.1, 0.8, 0.1],  # wrong
    [0.3, 0.3, 0.4],  # wrong
])
np.save("preds.npy", predictions)
```
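Before invoking the CLI, it can help to sanity-check the arrays. The helper below is not part of TrustPy, just a hypothetical pre-flight check for the expected shapes and probability constraints:

```python
import numpy as np

def validate_inputs(oracle, predictions):
    """Basic shape and probability checks before handing arrays to the CLI."""
    oracle = np.asarray(oracle)
    predictions = np.asarray(predictions)
    assert oracle.ndim == 1, "oracle must be a 1-D array of class labels"
    assert predictions.ndim == 2, "predictions must be (n_samples, n_classes)"
    assert len(oracle) == len(predictions), "one prediction row per label"
    assert np.all((predictions >= 0) & (predictions <= 1)), "probabilities must lie in [0, 1]"
    assert np.allclose(predictions.sum(axis=1), 1.0), "each row must sum to 1"
    assert oracle.max() < predictions.shape[1], "labels must index a class column"
    return True

oracle = np.array([0, 2, 1, 0, 1])
predictions = np.array([
    [0.8, 0.1, 0.1],
    [0.0, 0.0, 1.0],
    [0.2, 0.7, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
])
validate_inputs(oracle, predictions)  # raises AssertionError on malformed input
```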
Post-Installation Testing
Run either of the following one-line commands to verify that TrustPy runs correctly and can generate trust spectrum plots:
For NTS:
```bash
python -c "from trustpy import NTS; import numpy as np; NTS(np.array([0,1,1,0]), np.array([[0.8,0.2],[0.2,0.8],[0.4,0.6],[0.9,0.1]]), trust_spectrum=True, show_summary=False).compute()"
```
For CNTS:
```bash
python -c "from trustpy import CNTS; import numpy as np; CNTS(np.array([0,1,1,0]), np.array([[0.8,0.2],[0.2,0.8],[0.4,0.6],[0.9,0.1]]), trust_spectrum=True, show_summary=False).compute()"
```
These will generate test plots and save them to the default output directories:
```
./trustpy/nts/trust_spectrum.png
./trustpy/cnts/conditional_trust_densities.png
```
Unit Testing
All unit tests were run using pytest with full coverage prior to release to ensure reliability and correctness.
After installation, you can run all tests to verify everything is working:
```bash
python -m pytest tests/
```
Make sure to install pytest first.
```bash
pip install pytest
```
License
This project is licensed under the MIT License. See LICENSE for details.
Citations
For scholarly references and the origins of the techniques used in this package, please refer to the CITATION file.
Owner
- Name: TrustPy
- Login: TrustPy
- Kind: organization
- Email: erimyanik@gmail.com
- Repositories: 1
- Profile: https://github.com/TrustPy
Python package containing code for analyzing the trustworthiness of predictive methods prior to deployment.
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
message: "If you use TrustPy in your work, please cite the following references."
title: TrustPy
version: 2.0.14
date-released: 2025-06-21
url: https://github.com/TrustPy/TrustPy
authors:
  - family-names: Yanik
    given-names: Erim
    affiliation: Florida State University
    email: erimyanik@gmail.com
references:
  - type: article
    authors:
      - family-names: Wong
        given-names: Alexander
      - family-names: Wang
        given-names: Xiao Yu
      - family-names: Hryniowski
        given-names: Andrew
    title: "How Much Can We Really Trust You? Towards Simple, Interpretable Trust Quantification Metrics for Deep Neural Networks"
    year: 2020
    url: https://arxiv.org/pdf/2009.05835
    journal: "arXiv preprint arXiv:2009.05835"
  - type: article
    authors:
      - family-names: Hryniowski
        given-names: Andrew
      - family-names: Wang
        given-names: Xiao Yu
      - family-names: Wong
        given-names: Alexander
    title: "Where Does Trust Break Down? A Quantitative Trust Analysis of Deep Neural Networks via Trust Matrix and Conditional Trust Densities"
    year: 2020
    url: https://arxiv.org/pdf/2009.14701
    journal: "arXiv preprint arXiv:2009.14701"
```
GitHub Events
Total
- Release event: 4
- Watch event: 2
- Delete event: 1
- Push event: 38
- Pull request event: 2
- Create event: 3
Last Year
- Release event: 4
- Watch event: 2
- Delete event: 1
- Push event: 38
- Pull request event: 2
- Create event: 3
Packages
- Total packages: 1
- Total downloads: 84 last month (PyPI)
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 13
- Total maintainers: 1
pypi.org: trustpy-tools
Trustworthiness metrics and calibration tools for predictive models
- Homepage: https://github.com/TrustPy/TrustPy
- Documentation: https://trustpy-tools.readthedocs.io/
- License: MIT License
- Latest release: 2.0.14 (published 8 months ago)
Rankings
Maintainers (1)
Dependencies
- matplotlib >=3.0
- numpy >=1.20
- scikit-learn >=1.0
- actions/checkout v4 composite
- actions/setup-python v5 composite
- actions/checkout v4 composite
- actions/setup-python v5 composite