tno.sdg.tabular.eval.utility-metrics
TNO PET Lab - Synthetic Data Generation (SDG) - Tabular - Evaluation - Utility Metrics
Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (11.9%) to scientific vocabulary
Repository
TNO PET Lab - Synthetic Data Generation (SDG) - Tabular - Evaluation - Utility Metrics
Basic Info
- Host: GitHub
- Owner: TNO-SDG
- License: apache-2.0
- Language: Python
- Default Branch: main
- Homepage: https://docs.pet.tno.nl/sdg/tabular/eval/utility_metrics
- Size: 280 KB
Statistics
- Stars: 3
- Watchers: 3
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
TNO PET Lab - Synthetic Data Generation (SDG) - Tabular - Evaluation - Utility Metrics
Extensive evaluation of the utility of synthetic data sets. The original and synthetic data are compared on distinguishability and on a univariate, bivariate and multivariate level. All four metrics are visualized together in a spider plot, where one equals 'complete overlap' and zero equals 'no overlap' between the original and synthetic data. The plot can depict multiple synthetic data sets, so it can be used to compare different levels of privacy protection in synthetic data sets, varying parameter settings in synthetic data generators, or entirely different synthetic data generators.
Each individual metric depicted in the spider plot can be visualized separately as well. The example_script.py script shows step by step how to generate all visualizations. The main functionalities are:
- Univariate distributions: shows the distribution of each variable for the original and synthetic data.
- Bivariate correlations: visualizes a Pearson-r correlation matrix for all variables.
- Multivariate predictions: shows the prediction accuracy of an SVM classifier for each variable, trained on either the original or the synthetic data and tested on the original data.
- Distinguishability: shows the AUC of a logistic classifier that classifies samples as either original or synthetic (see the sketch after this list).
- Spiderplot: generates the spider plot combining these four metrics.
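As a minimal illustration of the distinguishability idea described above (not this package's API; the function name below is introduced only for the example), the AUC of a logistic classifier separating original from synthetic rows can be computed with scikit-learn directly:

```python
# Illustrative sketch of distinguishability, using scikit-learn directly rather
# than this package's own functions. An AUC near 0.5 means the classifier cannot
# tell original from synthetic rows; an AUC near 1.0 means the two data sets are
# easily separated.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def distinguishability_auc(original: pd.DataFrame, synthetic: pd.DataFrame) -> float:
    """AUC of a logistic classifier labelling rows as original (0) or synthetic (1)."""
    X = pd.concat([original, synthetic], ignore_index=True)
    y = np.concatenate([np.zeros(len(original)), np.ones(len(synthetic))])
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=42
    )
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
```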
Note that any required pre-processing of the (synthetic) data sets should be done beforehand: address NaNs, missing values and outliers, and scale the data.
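A minimal sketch of the kind of pre-processing meant here, assuming purely numeric columns and using pandas and scikit-learn (the package itself does not prescribe a specific approach):

```python
# Illustrative pre-processing: impute missing numeric values, drop remaining
# incomplete rows, and scale numeric columns to [0, 1]. Apply the same steps to
# both the original and the synthetic data set.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler


def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    numeric = df.select_dtypes(include="number").columns
    df[numeric] = df[numeric].fillna(df[numeric].median())   # median imputation for NaNs
    df = df.dropna()                                          # drop rows still missing values
    df[numeric] = MinMaxScaler().fit_transform(df[numeric])  # scale to [0, 1]
    return df
```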
For more information on the selected metrics, please refer to the paper (link will be added upon publication) or contact madelon.molhoek@tno.nl. As we aim to keep developing our code, feedback and tips are welcome.

PET Lab
The TNO PET Lab consists of generic software components, procedures, and functionalities developed and maintained on a regular basis to facilitate and aid in the development of PET solutions. The lab is a cross-project initiative allowing us to integrate and reuse previously developed PET functionalities to boost the development of new protocols and solutions.
The package tno.sdg.tabular.eval.utility_metrics is part of the TNO Python Toolbox.
Limitations in (end-)use: the content of this software package may solely be used for applications that comply with international export control laws.
This implementation of cryptographic software has not been audited. Use at your own risk.
Documentation
Documentation of the tno.sdg.tabular.eval.utility_metrics package can be found at https://docs.pet.tno.nl/sdg/tabular/eval/utility_metrics.
Install
Easily install the tno.sdg.tabular.eval.utility_metrics package using pip:
```console
$ python -m pip install tno.sdg.tabular.eval.utility_metrics
```
Note: If you are cloning the repository and wish to edit the source code, be sure to install the package in editable mode:
```console
$ python -m pip install -e 'tno.sdg.tabular.eval.utility_metrics'
```
If you wish to run the tests you can use:
```console
$ python -m pip install 'tno.sdg.tabular.eval.utility_metrics[tests]'
```
Usage
See the example script example_script.py in the scripts directory.
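For a rough idea of what the combined visualization looks like, the sketch below draws a spider plot of four utility scores in [0, 1] with plain matplotlib; the scores are hypothetical and the package's own plotting helpers may differ:

```python
# Generic spider-plot sketch (illustrative scores, plain matplotlib).
import numpy as np
import matplotlib.pyplot as plt

labels = ["Univariate", "Bivariate", "Multivariate", "Distinguishability"]
scores = [0.9, 0.8, 0.7, 0.85]  # hypothetical utility scores for one synthetic data set

angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
angles += angles[:1]   # repeat the first angle to close the polygon
scores += scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, scores, marker="o")
ax.fill(angles, scores, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels)
ax.set_ylim(0, 1)  # 1 = complete overlap between original and synthetic data
plt.show()
```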
Owner
- Name: TNO - PET Lab - Synthetic Data Generation (SDG)
- Login: TNO-SDG
- Kind: organization
- Email: petlab@tno.nl
- Repositories: 1
- Profile: https://github.com/TNO-SDG
Part of TNO PET Lab
Citation (CITATION.cff)
cff-version: 1.2.0
license: Apache-2.0
message: If you use this software, please cite it using these metadata.
authors:
  - name: TNO PET Lab
    city: The Hague
    country: NL
    email: petlab@tno.nl
    website: https://pet.tno.nl
type: software
url: https://pet.tno.nl
contact:
  - name: TNO PET Lab
    city: The Hague
    country: NL
    email: petlab@tno.nl
    website: https://pet.tno.nl
repository-code: https://github.com/TNO-SDG/tabular.eval.utility_metrics
repository-artifact: https://pypi.org/project/tno.sdg.tabular.eval.utility_metrics
title: TNO PET Lab - Synthetic Data Generation (SDG) - Tabular - Evaluation - Utility Metrics
version: 0.4.1
date-released: 2024-12-10
GitHub Events
Total
- Watch event: 1
- Push event: 2
- Create event: 1
Last Year
- Watch event: 1
- Push event: 2
- Create event: 1
Packages
- Total packages: 1
- Total downloads (PyPI, last month): 42
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 3
- Total maintainers: 1
pypi.org: tno.sdg.tabular.eval.utility-metrics
Utility metrics for tabular data
- Homepage: https://pet.tno.nl/
- Documentation: https://docs.pet.tno.nl/sdg/tabular/eval/utility_metrics/0.4.1
- License: Apache License, Version 2.0
- Latest release: 0.4.1 (published about 1 year ago)
Maintainers (1)
Dependencies
- matplotlib *
- numpy *
- pandas *
- pandas-stubs *
- plotly *
- scikit-learn *
- seaborn *
- statsmodels *