https://github.com/autodistill/autodistill-codet

CoDet base model for use with Autodistill.

Science Score: 13.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.0%) to scientific vocabulary

Keywords

autodistill codet computer-vision object-detection zero-shot-object-detection
Last synced: 5 months ago · JSON representation

Repository

CoDet base model for use with Autodistill.

Basic Info
  • Host: GitHub
  • Owner: autodistill
  • License: apache-2.0
  • Language: Python
  • Default Branch: main
  • Homepage: https://docs.autodistill.com
  • Size: 37.1 KB
Statistics
  • Stars: 3
  • Watchers: 2
  • Forks: 0
  • Open Issues: 1
  • Releases: 0
Topics
autodistill codet computer-vision object-detection zero-shot-object-detection
Created over 2 years ago · Last pushed almost 2 years ago
Metadata Files
  • Readme
  • License

README.md

Autodistill CoDet Module

This repository contains the code supporting the CoDet base model for use with Autodistill.

CoDet is an open-vocabulary, zero-shot object detection model. The model was introduced in the paper "CoDet: Co-Occurrence Guided Region-Word Alignment for Open-Vocabulary Object Detection" by Chuofan Ma, Yi Jiang, Xin Wen, Zehuan Yuan, and Xiaojuan Qi, which was submitted to NeurIPS 2023.

Read the full Autodistill documentation.

Read the CoDet Autodistill documentation.

Installation

To use CoDet with autodistill, you need to install the following dependency:

```bash
pip3 install autodistill-codet
```

Quickstart

When you first run the model, it will download CoDet and its dependencies, as well as the required model configuration and weights. The output during the download process is verbose. If you stop the download before it finishes, run rm -rf ~/.cache/autodistill/CoDet before running the model again; this ensures that you don't work from a partially installed CoDet setup.

The output when the predict() function runs is also verbose; you can safely ignore the messages printed to the console when you call predict().
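
If you want to silence that output, one option is to redirect standard output and standard error around the call. This is a minimal sketch using only the Python standard library; it assumes base_model has been loaded as in the Quickstart below and that the messages are written to stdout/stderr rather than by native code.

```python
import contextlib
import io

# Discard anything printed to stdout/stderr while predict() runs.
# Assumes base_model is a CoDet instance loaded as in the Quickstart below;
# output written by native extensions may not be captured this way.
with contextlib.redirect_stdout(io.StringIO()), contextlib.redirect_stderr(io.StringIO()):
    predictions = base_model.predict("./context_images/1.jpeg")
```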

You can only predict classes in the LVIS vocabulary. You can see a list of supported classes in the class_names.json file in the autodistill-codet GitHub repository.
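
If you want to check a prompt against that vocabulary programmatically, you can load the file directly. This is a sketch, assuming class_names.json has been downloaded locally and contains a flat JSON list of class names:

```python
import json

# Load the LVIS class list that CoDet can predict.
# Assumes class_names.json (from the autodistill-codet repository) is in the
# working directory and contains a flat JSON list of class names.
with open("class_names.json") as f:
    class_names = json.load(f)

print(len(class_names), "supported classes")
print("person" in class_names)  # True if "person" is a valid prompt
```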

Use the code snippet below to get started:

```python
from autodistill_codet import CoDet
from autodistill.detection import CaptionOntology
from autodistill.utils import plot
import cv2

# define an ontology to map class names to our CoDet prompt
# the ontology dictionary has the format {caption: class}
# where caption is the prompt sent to the base model, and class is the label
# that will be saved for that caption in the generated annotations
# then, load the model
base_model = CoDet(
    ontology=CaptionOntology(
        {
            "person": "person"
        }
    )
)

# run inference on an image and display the results
# class_names is a list of all classes supported by the model
# class_names can be used to turn the class_id values from the model into
# human-readable class names
# class_names is defined in self.class_names
predictions = base_model.predict("./context_images/1.jpeg")
image = cv2.imread("./context_images/1.jpeg")

plot(
    image=image,
    detections=predictions,
    classes=base_model.class_names
)

# run inference on a folder of images and save the results
base_model.label("./context_images", extension=".jpeg")
```
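
The detections returned by predict() are supervision Detections objects in Autodistill, so they can be filtered before plotting. The sketch below assumes a populated confidence array; the 0.5 threshold is an arbitrary example value, not one taken from the CoDet documentation.

```python
# Keep only detections above a confidence threshold before plotting.
# Assumes predictions is a supervision.Detections object with confidence
# scores populated; 0.5 is an illustrative threshold.
confident = predictions[predictions.confidence > 0.5]

plot(
    image=image,
    detections=confident,
    classes=base_model.class_names
)
```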

License

This project is licensed under an Apache 2.0 license, except where individual files explicitly note a different license.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!

Owner

  • Name: Autodistill
  • Login: autodistill
  • Kind: organization
  • Email: autodistill@roboflow.com

Use bigger, slower models to train smaller, faster ones

GitHub Events

Total
  • Watch event: 2
Last Year
  • Watch event: 2

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 1
  • Total pull requests: 3
  • Average time to close issues: 10 months
  • Average time to close pull requests: 17 days
  • Total issue authors: 1
  • Total pull request authors: 1
  • Average comments per issue: 1.0
  • Average comments per pull request: 0.0
  • Merged pull requests: 2
  • Bot issues: 0
  • Bot pull requests: 3
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • Rivoks (1)
Pull Request Authors
  • dependabot[bot] (3)
Top Labels
Issue Labels
  • (none)
Pull Request Labels
  • dependencies (3)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 22 last month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 3
  • Total maintainers: 1
pypi.org: autodistill-codet

CoDet model for use with Autodistill

  • Versions: 3
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 22 last month
Rankings
  • Dependent packages count: 9.6%
  • Average: 38.8%
  • Dependent repos count: 68.0%
Maintainers (1)
Last synced: 6 months ago

Dependencies

.github/workflows/publish.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v2 composite
.github/workflows/test.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v2 composite
.github/workflows/welcome.yml actions
  • actions/first-interaction v1.1.1 composite
setup.py pypi
  • Pillow ==10.0.1
  • autodistill *
  • lvis *
  • numpy *
  • supervision *
  • torch *
  • wandb *