treeoflife-toolbox

Source-specific tools for processing data (images) downloaded with distributed-downloader; relies on MPI.

https://github.com/imageomics/treeoflife-toolbox

Science Score: 75.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 6 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, nature.com
  • Academic email domains
  • Institutional organization owner
    Organization imageomics has institutional domain (imageomics.osu.edu)
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.3%) to scientific vocabulary

Keywords

animal-detection citizen-science dataset-processing face-detection image-filtering imageomics specimen-records tree-of-life
Last synced: 4 months ago

Repository

Source-specific tools for processing data (images) downloaded with distributed-downloader; relies on MPI.

Basic Info
Statistics
  • Stars: 1
  • Watchers: 7
  • Forks: 0
  • Open Issues: 28
  • Releases: 0
Topics
animal-detection citizen-science dataset-processing face-detection image-filtering imageomics specimen-records tree-of-life
Created 11 months ago · Last pushed 6 months ago
Metadata Files
Readme License Citation

README.md

Tools for TreeOfLife dataset

This repository contains tools used in creating the TreeOfLife-200M dataset. They were built on top of distributed-downloader, which was used to download all the images. Step-by-step instructions for downloading all of the images are provided in docs/README.

Installation Instructions

Currently, only the portion of this package that is required for downloading all the images in the TreeOfLife-200M dataset is installable. Our other processing code, which is not required to download a copy of the dataset, is provided and described further in the processing/ directory.[^1]

[^1]: This processing code will be reworked into installable modules as appropriate over the coming months.

Pip installation

  1. Install Python 3.10 or 3.11.
  2. Install an MPI implementation. Any MPI should work; the package has been tested with OpenMPI and Intel MPI. Installation instructions can be found on the official websites.
  3. Install the package:

    • For development:

```
pip install -e .[dev]
```

Script creation

After installation, you need to create scripts for the tools by following the instructions here.

Currently, the following tools are available:

  • columnnamechange - renames columns in the dataset
  • columnnamechangelilafix - renames columns in the dataset (created to fix a bug in the Lila BC dataset)
  • data_merging - filters out data in freshly downloaded datasets that duplicates existing ones (deduplication based on hashsum)
  • data_transfer - transfers data from a downloaded dataset to the TreeOfLife dataset
  • eol_rename - used to change source_id from the EOL content ID to "EOL content ID_ EOL page ID" (the change was later discarded)
  • fathomnetcrop - crops FathomNet images to their bounding boxes
  • fathomnetcrop_fix - crops FathomNet images to their bounding boxes (created to fix a bug in the FathomNet dataset)
  • filteroutby_uuid - filters data using a table of Tree of Life UUIDs
  • lilabcfiltering - filters the Lila BC dataset (based on a processed CSV table)
  • lilaextranoaa_processing - processes the Lila Extra NOAA dataset into the TreeOfLife format
  • lilaseparationmultilable_filtering - extracts multilabel data from the Lila BC dataset and duplicates images for each label
  • lilaseparationsinglelabelfiltering - extracts single-label data from the Lila BC dataset
  • mamanspfix - fixes a bug in the man ansp server (GBIF source)
  • research_filtering - filters data out of TreeOfLife datasets
  • transferandtype_change - transfers data (and changes types) from one place to another on research storage (it transfers only 10 TB per day so as not to overload the backup system)
  • tol200mbioscandata_tranfer - transfers data from the Bioscan dataset to the TreeOfLife dataset
  • tol200mfathomnet_crop - crops FathomNet images to their bounding boxes for the TreeOfLife dataset
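The hashsum-based deduplication that data_merging performs can be sketched as follows. This is a self-contained illustration, not the tool's actual implementation; the function names and the choice of MD5 are assumptions:

```python
import hashlib
from pathlib import Path


def file_hashsum(path: Path) -> str:
    """Compute a content hash for a file, reading in chunks to bound memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def filter_new_files(new_files, existing_hashes):
    """Keep only files whose hashsum is not already present in the existing dataset."""
    return [p for p in new_files if file_hashsum(p) not in existing_hashes]
```

In practice the existing hashes would be loaded from a table built over the already-downloaded dataset, so only the fresh files need to be hashed.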

How to use the tools

To use the tools, you will need to create a config.yaml file; its schema can be found in example.yaml.

To run the tool, use the following command:

```
tree_of_life_toolbox <config_path> <tool_name> [--OPTIONS]
```

  • <config_path> - path to the config.yaml file (either absolute or relative)
  • <tool_name> - name of the tool to run
  • [--OPTIONS] - optional arguments for the tool:
    • --reset_filtering - effectively a full reset: it resets the first step of the tool (filtering), and since all following steps depend on the filtering step, they are reset as well
    • --reset_scheduling - resets the scheduling step (useful when you want to change the number of runners/nodes per runner)
    • --reset_runners - resets the runners, meaning they will start from scratch
    • --tool_name_override - used to disable the tool name check
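For example, a run of one of the tools above might look like this (the config filename is an assumption; see example.yaml for the actual schema):

```shell
# Run the hashsum-based deduplication tool with a local config,
# restarting its runners from scratch:
tree_of_life_toolbox ./config.yaml data_merging --reset_runners
```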

How to create a new tool

To create a new tool, you will need to create a new folder in src/TreeOfLife_toolbox/ and add the following files:

  • __init__.py - empty file
  • classes.py - file with the classes for the tool

In classes.py you will need to:

  1. Create a class for each step of the tool (filtering, scheduling, runner). Make sure that the class inherits from the base class for the step and that class names are unique.
  2. Register the classes with their respective registry (FilterRegistery, SchedulerRegistry, RunnerRegistry) using the register decorator.
  3. Add tool folder to __init__.py file in src/TreeOfLife_toolbox/ folder.

The following base classes are available:

  • filtering step:
    • FilterToolBase - bare minimum class for the filtering step
    • SparkFilterToolBase - base class for the filtering step using Spark; it automatically creates a Spark session and provides additional methods for working with Spark
    • PythonFilterToolBase - base class for the filtering step using Python; it can automatically traverse the *downloaded* dataset
  • scheduling step:
    • SchedulerToolBase - bare minimum class for the scheduling step
    • DefaultScheduler - base class for the scheduling step; it can perform "standard" scheduling for the runners, but you will need to specify its schema
  • runner step:
    • RunnerToolBase - bare minimum class for the runner step
    • MPIRunnerTool - base class for the MPI-based runner step; it automatically initializes the MPI environment, reads the schedule, and calls the apply_filter method sequentially on the separate chunks of the schedule. You will need to implement the apply_filter method in your class.
    • FilterRunnerTool - inherits from MPIRunnerTool and can perform "standard" filtering based on UUIDs. Works only with the downloaded dataset schema.
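The registration flow in steps 1-2 follows a common decorator-registry pattern. The sketch below is a self-contained illustration of that pattern; the registry and base-class names echo the toolbox's, but this is not the actual TreeOfLife_toolbox API:

```python
class ToolRegistry:
    """Minimal decorator-based registry mapping tool names to classes."""

    def __init__(self):
        self._classes = {}

    def register(self, name):
        """Decorator: register a class under the given tool name."""
        def decorator(cls):
            if name in self._classes:
                raise ValueError(f"duplicate registration: {name}")
            self._classes[name] = cls
            return cls
        return decorator

    def get(self, name):
        return self._classes[name]


FilterRegistry = ToolRegistry()


class FilterToolBase:
    """Bare-minimum base class for a filtering step."""

    def run(self, dataset):
        raise NotImplementedError


@FilterRegistry.register("my_new_tool")
class MyNewToolFilter(FilterToolBase):
    # Hypothetical filter: keep only rows that carry a non-empty uuid.
    def run(self, dataset):
        return [row for row in dataset if row.get("uuid")]
```

A driver can then look up the class by tool name and instantiate it, which is how a single CLI entry point dispatches to many tools.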

Recommended Citation

If using the TreeOfLife-200M dataset, please cite this repo, the dataset, and our paper.

```
@software{Kopanev_TreeOfLife-toolbox_2025,
  author = {Kopanev, Andrei and Zhang, Net and Gu, Jianyang and Stevens, Samuel and Thompson, Matthew J and Campolongo, Elizabeth G},
  license = {MIT},
  month = may,
  title = {{TreeOfLife-toolbox}},
  url = {https://github.com/Imageomics/TreeOfLife-toolbox},
  version = {0.2.0-beta},
  year = {2025}
}
```

```
@dataset{treeoflife_200m,
  title = {{T}ree{O}f{L}ife-200{M}},
  author = {Jianyang Gu and Samuel Stevens and Elizabeth G Campolongo and Matthew J Thompson and Net Zhang and Jiaman Wu and Andrei Kopanev and Zheda Mai and Alexander E. White and James Balhoff and Wasila M Dahdul and Daniel Rubenstein and Hilmar Lapp and Tanya Berger-Wolf and Wei-Lun Chao and Yu Su},
  year = {2025},
  url = {https://huggingface.co/datasets/imageomics/TreeOfLife-200M},
  doi = {},
  publisher = {Hugging Face}
}
```

```
@article{gu2025bioclip,
  title = {{B}io{CLIP} 2: Emergent Properties from Scaling Hierarchical Contrastive Learning},
  author = {Jianyang Gu and Samuel Stevens and Elizabeth G Campolongo and Matthew J Thompson and Net Zhang and Jiaman Wu and Andrei Kopanev and Zheda Mai and Alexander E. White and James Balhoff and Wasila M Dahdul and Daniel Rubenstein and Hilmar Lapp and Tanya Berger-Wolf and Wei-Lun Chao and Yu Su},
  year = {2025},
  eprint = {2505.23883},
  archivePrefix = {arXiv},
  primaryClass = {cs.CV},
  url = {https://arxiv.org/abs/2505.23883}
}
```

Also consider citing GBIF, BIOSCAN-5M, EOL, and FathomNet:

```
@misc{GBIF,
  title = {{GBIF} Occurrence Download},
  author = {GBIF.org},
  doi = {10.15468/DL.BFV433},
  url = {https://doi.org/10.15468/dl.bfv433},
  keywords = {GBIF, biodiversity, species occurrences},
  publisher = {The Global Biodiversity Information Facility},
  month = {May},
  year = {2024},
  copyright = {Creative Commons Attribution Non Commercial 4.0 International}
}
```

```
@inproceedings{gharaee2024bioscan5m,
  title = {{BIOSCAN-5M}: A Multimodal Dataset for Insect Biodiversity},
  author = {Zahra Gharaee and Scott C. Lowe and ZeMing Gong and Pablo Millan Arias and Nicholas Pellegrino and Austin T. Wang and Joakim Bruslund Haurum and Iuliia Zarubiieva and Lila Kari and Dirk Steinke and Graham W. Taylor and Paul Fieguth and Angel X. Chang},
  booktitle = {NeurIPS},
  editor = {A. Globerson and L. Mackey and D. Belgrave and A. Fan and U. Paquet and J. Tomczak and C. Zhang},
  pages = {36285--36313},
  publisher = {Curran Associates, Inc.},
  year = {2024},
  volume = {37}
}
```

```
@misc{eol,
  author = {{Encyclopedia of Life (EOL)}},
  url = {https://eol.org},
  note = {Accessed August 2024}
}
```

```
@article{katijafathomnet2022,
  title = {{FathomNet}: {A} global image database for enabling artificial intelligence in the ocean},
  author = {Katija, Kakani and Orenstein, Eric and Schlining, Brian and Lundsten, Lonny and Barnard, Kevin and Sainz, Giovanna and Boulais, Oceane and Cromwell, Megan and Butler, Erin and Woodward, Benjamin and Bell, Katherine L. C.},
  journal = {Scientific Reports},
  volume = {12},
  number = {1},
  pages = {15914},
  issn = {2045-2322},
  shorttitle = {{FathomNet}},
  url = {https://www.nature.com/articles/s41598-022-19939-2},
  doi = {10.1038/s41598-022-19939-2},
  month = sep,
  year = {2022}
}
```

Owner

  • Name: Imageomics Institute
  • Login: Imageomics
  • Kind: organization

Citation (CITATION.cff)

cff-version: 1.2.0
title: TreeOfLife-toolbox
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Andrei
    family-names: Kopanev
  - given-names: Net
    family-names: Zhang
    orcid: 'https://orcid.org/0000-0003-2664-451X'
  - given-names: Jianyang
    family-names: Gu
    orcid: 'https://orcid.org/0000-0002-4060-7427'
  - given-names: Samuel
    family-names: Stevens
    orcid: 'https://orcid.org/0009-0000-9493-7766'
  - given-names: 'Matthew J'
    family-names: Thompson
    orcid: 'https://orcid.org/0000-0003-0583-8585'
  - given-names: 'Elizabeth G'
    family-names: Campolongo
    orcid: 'https://orcid.org/0000-0003-0846-2413'
repository-code: 'https://github.com/Imageomics/TreeOfLife-toolbox'
identifiers:
  - description: "The GitHub release URL of tag v0.2.0-beta."
    type: url
    value: "https://github.com/Imageomics/TreeOfLife-toolbox/releases/tag/v0.2.0-beta"
  - description: "The GitHub URL of the commit tagged with v0.2.0-beta."
    type: url
    value: "https://github.com/Imageomics/TreeOfLife-toolbox/tree/<commit-hash>" # update post release
abstract: >-
  A tool for processing datasets that were downloaded using the distributed-downloader package, 
  specifically used for constructing the TreeOfLife-200M dataset.
keywords:
  - parallel
  - distributed
  - processing
  - url
  - imageomics
  - "dataset generation"
  - "MPI application"
license: MIT
version: 0.2.0-beta
date-released: '2025' # update on release

GitHub Events

Total
  • Issues event: 1
  • Issue comment event: 3
  • Push event: 2
  • Public event: 1
  • Pull request review event: 3
  • Pull request event: 5
  • Create event: 4
Last Year
  • Issues event: 1
  • Issue comment event: 3
  • Push event: 2
  • Public event: 1
  • Pull request review event: 3
  • Pull request event: 5
  • Create event: 4

Issues and Pull Requests

Last synced: 4 months ago

All Time
  • Total issues: 11
  • Total pull requests: 28
  • Average time to close issues: 11 days
  • Average time to close pull requests: 17 days
  • Total issue authors: 3
  • Total pull request authors: 3
  • Average comments per issue: 0.27
  • Average comments per pull request: 0.25
  • Merged pull requests: 7
  • Bot issues: 0
  • Bot pull requests: 7
Past Year
  • Issues: 11
  • Pull requests: 28
  • Average time to close issues: 11 days
  • Average time to close pull requests: 17 days
  • Issue authors: 3
  • Pull request authors: 3
  • Average comments per issue: 0.27
  • Average comments per pull request: 0.25
  • Merged pull requests: 7
  • Bot issues: 0
  • Bot pull requests: 7
Top Authors
Issue Authors
  • egrace479 (9)
  • tms2003 (1)
  • johnbradley (1)
Pull Request Authors
  • Andrey170170 (17)
  • dependabot[bot] (7)
  • egrace479 (4)
Top Labels
Issue Labels
documentation (7) structure (3) design (1)
Pull Request Labels
dependencies (7) python (7) documentation (1) enhancement (1)

Dependencies

processing/content_dedup/pyproject.toml pypi
  • beartype >=0.20.0
  • bioscan-dataset >=1.2.1
  • filprofiler >=2024.11.2
  • jaxtyping >=0.2.38
  • marimo >=0.12.8
  • matplotlib >=3.10.1
  • numpy >=2.2.4
  • opencv-python >=4.11.0.86
  • pdqhash >=0.2.7
  • polars >=1.27.1
  • pyarrow >=19.0.1
  • pyinstrument >=5.0.1
  • pytest-benchmark >=5.1.0
  • setuptools >=76.0.0
  • submitit >=1.5.2
  • torch >=2.6.0
  • tyro >=0.9.17
  • wids >=0.1.11
  • wilds >=2.0.0
processing/docs/requirements_batch_camera_trap.txt pypi
  • absl-py ==2.2.2
  • aiofiles ==24.1.0
  • annotated-types ==0.7.0
  • anyio ==4.9.0
  • boto3 ==1.38.12
  • botocore ==1.38.12
  • certifi ==2025.4.26
  • chardet ==5.2.0
  • charset-normalizer ==3.4.2
  • click ==8.1.8
  • contourpy ==1.3.2
  • cycler ==0.12.1
  • defusedxml ==0.7.1
  • fastapi ==0.115.12
  • ffmpy ==0.5.0
  • filelock ==3.18.0
  • filetype ==1.2.0
  • fire ==0.7.0
  • fonttools ==4.57.0
  • fsspec ==2025.3.2
  • gitdb ==4.0.12
  • gitpython ==3.1.44
  • gradio ==5.29.0
  • gradio-client ==1.10.0
  • groovy ==0.1.2
  • grpcio ==1.71.0
  • h11 ==0.16.0
  • hf-xet ==1.1.0
  • httpcore ==1.0.9
  • httpx ==0.28.1
  • huggingface-hub ==0.31.1
  • idna ==3.7
  • jinja2 ==3.1.6
  • jmespath ==1.0.1
  • joblib ==1.5.0
  • kiwisolver ==1.4.8
  • markdown ==3.8
  • markdown-it-py ==3.0.0
  • markupsafe ==3.0.2
  • matplotlib ==3.10.3
  • mdurl ==0.1.2
  • mpmath ==1.3.0
  • networkx ==3.4.2
  • numpy ==2.2.5
  • nvidia-cublas-cu12 ==12.6.4.1
  • nvidia-cuda-cupti-cu12 ==12.6.80
  • nvidia-cuda-nvrtc-cu12 ==12.6.77
  • nvidia-cuda-runtime-cu12 ==12.6.77
  • nvidia-cudnn-cu12 ==9.5.1.17
  • nvidia-cufft-cu12 ==11.3.0.4
  • nvidia-cufile-cu12 ==1.11.1.6
  • nvidia-curand-cu12 ==10.3.7.77
  • nvidia-cusolver-cu12 ==11.7.1.2
  • nvidia-cusparse-cu12 ==12.5.4.2
  • nvidia-cusparselt-cu12 ==0.6.3
  • nvidia-nccl-cu12 ==2.26.2
  • nvidia-nvjitlink-cu12 ==12.6.85
  • nvidia-nvtx-cu12 ==12.6.77
  • opencv-python ==4.11.0.86
  • opencv-python-headless ==4.10.0.84
  • orjson ==3.10.18
  • packaging ==25.0
  • pandas ==2.2.3
  • pillow ==11.2.1
  • pillow-heif ==0.22.0
  • protobuf ==6.30.2
  • psutil ==7.0.0
  • py-cpuinfo ==9.0.0
  • py4j ==0.10.9.7
  • pyarrow ==20.0.0
  • pybboxes ==0.1.6
  • pydantic ==2.11.4
  • pydantic-core ==2.33.2
  • pydub ==0.25.1
  • pygments ==2.19.1
  • pyparsing ==3.2.3
  • pyspark ==3.5.5
  • python-dateutil ==2.9.0.post0
  • python-dotenv ==1.1.0
  • python-multipart ==0.0.20
  • pytz ==2025.2
  • pyyaml ==6.0.2
  • requests ==2.32.3
  • requests-toolbelt ==1.0.0
  • rich ==14.0.0
  • roboflow ==1.1.63
  • ruff ==0.11.8
  • s3transfer ==0.12.0
  • safehttpx ==0.1.6
  • safetensors ==0.5.3
  • sahi ==0.11.14
  • scikit-learn ==1.6.1
  • scipy ==1.15.3
  • seaborn ==0.13.2
  • semantic-version ==2.10.0
  • setuptools ==80.3.1
  • shapely ==2.1.0
  • shellingham ==1.5.4
  • six ==1.17.0
  • smmap ==5.0.2
  • sniffio ==1.3.1
  • starlette ==0.46.2
  • supervision ==0.23.0
  • sympy ==1.14.0
  • tensorboard ==2.19.0
  • tensorboard-data-server ==0.7.2
  • termcolor ==3.1.0
  • terminaltables ==3.1.10
  • thop ==0.1.1.post2209072238
  • threadpoolctl ==3.6.0
  • timm ==1.0.15
  • tomlkit ==0.13.2
  • torch ==2.7.0
  • torchaudio ==2.7.0
  • torchvision ==0.22.0
  • tqdm ==4.67.1
  • triton ==3.3.0
  • typer ==0.15.3
  • typing-extensions ==4.13.2
  • typing-inspection ==0.4.0
  • tzdata ==2025.2
  • ultralytics ==8.3.129
  • ultralytics-thop ==2.0.14
  • urllib3 ==2.4.0
  • uvicorn ==0.34.2
  • websockets ==15.0.1
  • werkzeug ==3.1.3
  • wget ==3.2
  • yolov5 ==7.0.13
processing/docs/requirements_batch_clip.txt pypi
  • filelock ==3.18.0
  • fsspec ==2025.3.2
  • ftfy ==6.3.1
  • jinja2 ==3.1.6
  • markupsafe ==3.0.2
  • mpmath ==1.3.0
  • networkx ==3.4.2
  • numpy ==2.2.5
  • nvidia-cublas-cu12 ==12.6.4.1
  • nvidia-cuda-cupti-cu12 ==12.6.80
  • nvidia-cuda-nvrtc-cu12 ==12.6.77
  • nvidia-cuda-runtime-cu12 ==12.6.77
  • nvidia-cudnn-cu12 ==9.5.1.17
  • nvidia-cufft-cu12 ==11.3.0.4
  • nvidia-cufile-cu12 ==1.11.1.6
  • nvidia-curand-cu12 ==10.3.7.77
  • nvidia-cusolver-cu12 ==11.7.1.2
  • nvidia-cusparse-cu12 ==12.5.4.2
  • nvidia-cusparselt-cu12 ==0.6.3
  • nvidia-nccl-cu12 ==2.26.2
  • nvidia-nvjitlink-cu12 ==12.6.85
  • nvidia-nvtx-cu12 ==12.6.77
  • packaging ==25.0
  • pandas ==2.2.3
  • pillow ==11.2.1
  • py4j ==0.10.9.7
  • pyarrow ==20.0.0
  • pyspark ==3.5.5
  • python-dateutil ==2.9.0.post0
  • pytz ==2025.2
  • regex ==2024.11.6
  • setuptools ==80.3.1
  • six ==1.17.0
  • sympy ==1.14.0
  • torch ==2.7.0
  • torchaudio ==2.7.0
  • torchvision ==0.22.0
  • tqdm ==4.67.1
  • triton ==3.3.0
  • typing-extensions ==4.13.2
  • tzdata ==2025.2
  • wcwidth ==0.2.13
processing/docs/requirements_batch_face_detection.txt pypi
  • certifi ==2025.4.26
  • charset-normalizer ==3.4.2
  • facenet-pytorch ==2.5.3
  • filelock ==3.18.0
  • fsspec ==2025.3.2
  • idna ==3.10
  • jinja2 ==3.1.6
  • markupsafe ==3.0.2
  • mpmath ==1.3.0
  • networkx ==3.4.2
  • numpy ==2.2.5
  • nvidia-cublas-cu12 ==12.6.4.1
  • nvidia-cuda-cupti-cu12 ==12.6.80
  • nvidia-cuda-nvrtc-cu12 ==12.6.77
  • nvidia-cuda-runtime-cu12 ==12.6.77
  • nvidia-cudnn-cu12 ==9.5.1.17
  • nvidia-cufft-cu12 ==11.3.0.4
  • nvidia-cufile-cu12 ==1.11.1.6
  • nvidia-curand-cu12 ==10.3.7.77
  • nvidia-cusolver-cu12 ==11.7.1.2
  • nvidia-cusparse-cu12 ==12.5.4.2
  • nvidia-cusparselt-cu12 ==0.6.3
  • nvidia-nccl-cu12 ==2.26.2
  • nvidia-nvjitlink-cu12 ==12.6.85
  • nvidia-nvtx-cu12 ==12.6.77
  • pandas ==2.2.3
  • pillow ==11.2.1
  • py4j ==0.10.9.7
  • pyarrow ==20.0.0
  • pyspark ==3.5.5
  • python-dateutil ==2.9.0.post0
  • pytz ==2025.2
  • requests ==2.32.3
  • setuptools ==80.3.1
  • six ==1.17.0
  • sympy ==1.14.0
  • torch ==2.7.0
  • torchaudio ==2.7.0
  • torchvision ==0.22.0
  • triton ==3.3.0
  • typing-extensions ==4.13.2
  • tzdata ==2025.2
  • urllib3 ==2.4.0
processing/docs/requirements_batch_processing.txt pypi
  • absl-py ==2.2.2
  • aiofiles ==24.1.0
  • annotated-types ==0.7.0
  • anyio ==4.9.0
  • boto3 ==1.38.12
  • botocore ==1.38.12
  • certifi ==2025.4.26
  • chardet ==5.2.0
  • charset-normalizer ==3.4.2
  • click ==8.1.8
  • contourpy ==1.3.2
  • cycler ==0.12.1
  • defusedxml ==0.7.1
  • facenet-pytorch ==2.5.3
  • fastapi ==0.115.12
  • ffmpy ==0.5.0
  • filelock ==3.18.0
  • filetype ==1.2.0
  • fire ==0.7.0
  • fonttools ==4.57.0
  • fsspec ==2025.3.2
  • ftfy ==6.3.1
  • gitdb ==4.0.12
  • gitpython ==3.1.44
  • gradio ==5.29.0
  • gradio-client ==1.10.0
  • groovy ==0.1.2
  • grpcio ==1.71.0
  • h11 ==0.16.0
  • hf-xet ==1.1.0
  • httpcore ==1.0.9
  • httpx ==0.28.1
  • huggingface-hub ==0.31.1
  • idna ==3.7
  • jinja2 ==3.1.6
  • jmespath ==1.0.1
  • joblib ==1.5.0
  • kiwisolver ==1.4.8
  • markdown ==3.8
  • markdown-it-py ==3.0.0
  • markupsafe ==3.0.2
  • matplotlib ==3.10.3
  • mdurl ==0.1.2
  • mpmath ==1.3.0
  • networkx ==3.4.2
  • numpy ==2.2.5
  • nvidia-cublas-cu12 ==12.6.4.1
  • nvidia-cuda-cupti-cu12 ==12.6.80
  • nvidia-cuda-nvrtc-cu12 ==12.6.77
  • nvidia-cuda-runtime-cu12 ==12.6.77
  • nvidia-cudnn-cu12 ==9.5.1.17
  • nvidia-cufft-cu12 ==11.3.0.4
  • nvidia-cufile-cu12 ==1.11.1.6
  • nvidia-curand-cu12 ==10.3.7.77
  • nvidia-cusolver-cu12 ==11.7.1.2
  • nvidia-cusparse-cu12 ==12.5.4.2
  • nvidia-cusparselt-cu12 ==0.6.3
  • nvidia-nccl-cu12 ==2.26.2
  • nvidia-nvjitlink-cu12 ==12.6.85
  • nvidia-nvtx-cu12 ==12.6.77
  • opencv-python ==4.11.0.86
  • opencv-python-headless ==4.10.0.84
  • orjson ==3.10.18
  • packaging ==25.0
  • pandas ==2.2.3
  • pillow ==11.2.1
  • pillow-heif ==0.22.0
  • protobuf ==6.30.2
  • psutil ==7.0.0
  • py-cpuinfo ==9.0.0
  • py4j ==0.10.9.7
  • pyarrow ==20.0.0
  • pybboxes ==0.1.6
  • pydantic ==2.11.4
  • pydantic-core ==2.33.2
  • pydub ==0.25.1
  • pygments ==2.19.1
  • pyparsing ==3.2.3
  • pyspark ==3.5.5
  • python-dateutil ==2.9.0.post0
  • python-dotenv ==1.1.0
  • python-multipart ==0.0.20
  • pytz ==2025.2
  • pyyaml ==6.0.2
  • regex ==2024.11.6
  • requests ==2.32.3
  • requests-toolbelt ==1.0.0
  • rich ==14.0.0
  • roboflow ==1.1.63
  • ruff ==0.11.8
  • s3transfer ==0.12.0
  • safehttpx ==0.1.6
  • safetensors ==0.5.3
  • sahi ==0.11.14
  • scikit-learn ==1.6.1
  • scipy ==1.15.3
  • seaborn ==0.13.2
  • semantic-version ==2.10.0
  • setuptools ==80.3.1
  • shapely ==2.1.0
  • shellingham ==1.5.4
  • six ==1.17.0
  • smmap ==5.0.2
  • sniffio ==1.3.1
  • starlette ==0.46.2
  • supervision ==0.23.0
  • sympy ==1.14.0
  • tensorboard ==2.19.0
  • tensorboard-data-server ==0.7.2
  • termcolor ==3.1.0
  • terminaltables ==3.1.10
  • thop ==0.1.1.post2209072238
  • threadpoolctl ==3.6.0
  • timm ==1.0.15
  • tomlkit ==0.13.2
  • torch ==2.7.0
  • torchaudio ==2.7.0
  • torchvision ==0.22.0
  • tqdm ==4.67.1
  • triton ==3.3.0
  • typer ==0.15.3
  • typing-extensions ==4.13.2
  • typing-inspection ==0.4.0
  • tzdata ==2025.2
  • ultralytics ==8.3.129
  • ultralytics-thop ==2.0.14
  • urllib3 ==2.4.0
  • uvicorn ==0.34.2
  • wcwidth ==0.2.13
  • websockets ==15.0.1
  • werkzeug ==3.1.3
  • wget ==3.2
  • yolov5 ==7.0.13
processing/requirements_tol2webdataset.txt pypi
  • Pillow *
  • numpy *
  • opencv-python *
  • pandas *
  • polars *
  • tqdm *
  • webdataset *
pyproject.toml pypi
  • attrs *
  • brotli *
  • cramjam *
  • cython *
  • fsspec *
  • inflate64 *
  • mpi4py *
  • multivolumefile *
  • opencv-python *
  • pandas *
  • pathspec *
  • pillow *
  • psutil *
  • pyarrow *
  • pybcj *
  • pycryptodomex *
  • pyppmd *
  • pyspark *
  • python-dotenv *
  • pyyaml *
  • pyzstd *
  • requests *
  • setuptools *
  • texttable *
  • trove-classifiers *
  • typing-extensions *
  • wheel *