https://github.com/autodistill/autodistill-vit
ViT module for use with autodistill.
Science Score: 13.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found codemeta.json file)
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (10.5%) to scientific vocabulary
Repository
ViT module for use with autodistill.
Basic Info
- Host: GitHub
- Owner: autodistill
- License: apache-2.0
- Language: Python
- Default Branch: main
- Homepage: https://docs.autodistill.com
- Size: 11.7 KB
Statistics
- Stars: 5
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
Autodistill ViT Module
This repository contains the code supporting the ViT target model for use with Autodistill.
ViT is a classification model pre-trained on ImageNet-21k, developed by Google. You can train ViT classification models using Autodistill.
Read the full Autodistill documentation.
Read the ViT Autodistill documentation.
Installation
To use the ViT target model, you will need to install the following dependency:
```bash
pip3 install autodistill-vit
```
Quickstart
```python
from autodistill_vit import ViT

target_model = ViT()

# train a model from a classification folder structure
target_model.train("./context_images_labeled/", epochs=200)

# run inference on the new model
pred = target_model.predict("./context_images_labeled/train/images/dog-7.jpg", conf=0.01)
```
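This repository only covers the target (student) side. In the wider Autodistill workflow, a large base model first auto-labels your raw images, and ViT is then trained on that output. The sketch below is a minimal outline of that flow under assumptions not stated in this README: it presumes the separate autodistill-clip package, the CaptionOntology import path, and the label() arguments shown, so verify each against the Autodistill documentation before relying on it.

```python
# Minimal distillation sketch (assumptions noted in comments; not taken from this README).
from autodistill.detection import CaptionOntology  # assumed import path in core autodistill
from autodistill_clip import CLIP                  # assumes the autodistill-clip package is installed
from autodistill_vit import ViT

# 1. Auto-label raw images with a large base model (CLIP), mapping prompts to class names.
base_model = CLIP(
    ontology=CaptionOntology({"a photo of a dog": "dog", "a photo of a cat": "cat"})
)
base_model.label(
    input_folder="./context_images",           # unlabeled images (hypothetical path)
    output_folder="./context_images_labeled",  # labeled dataset consumed by ViT below
)

# 2. Distill into the smaller, faster ViT target model.
target_model = ViT()
target_model.train("./context_images_labeled/", epochs=200)

# 3. Run inference with the distilled model.
pred = target_model.predict("./context_images_labeled/train/images/dog-7.jpg", conf=0.01)
print(pred)
```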
License
The code in this repository is licensed under an Apache 2.0 license.
🏆 Contributing
We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!
Owner
- Name: Autodistill
- Login: autodistill
- Kind: organization
- Email: autodistill@roboflow.com
- Website: https://autodistill.com
- Repositories: 1
- Profile: https://github.com/autodistill
- Description: Use bigger, slower models to train smaller, faster ones
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Packages
- Total packages: 1
- Total downloads: 33 last month (PyPI)
- Total dependent packages: 1
- Total dependent repositories: 0
- Total versions: 2
- Total maintainers: 1
pypi.org: autodistill-vit
ViT module for use with Autodistill
- Homepage: https://github.com/autodistill/autodistill-vit
- Documentation: https://autodistill-vit.readthedocs.io/
- License: MIT License
- Latest release: 0.1.0 (published over 2 years ago)
Dependencies
- actions/checkout v3 composite
- actions/setup-python v4 composite
- pypa/gh-action-pypi-publish release/v1 composite
- actions/checkout v3 composite
- actions/setup-python v2 composite
- actions/checkout v3 composite
- actions/setup-python v2 composite
- actions/first-interaction v1.1.1 composite
- Pillow *
- autodistill *
- numpy *
- supervision ==0.9.0
- torch *
- transformers *