Recent Releases of open-clip-torch
open-clip-torch - v3.1.0
What's Changed
- Add support for MetaCLIP2 WorldWide models by @rwightman in https://github.com/mlfoundations/open_clip/pull/1100
- Fix mask for CoCa generate by @rwightman in https://github.com/mlfoundations/open_clip/pull/1103
- Add a text locking impl that works across CustomCLIP and CLIP by @rwightman in https://github.com/mlfoundations/open_clip/pull/1104 (see the sketch below)
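As a rough illustration of the text-locking change, a minimal sketch of freezing the text tower after loading a model. The model/pretrained tags are just examples, and the `lock_text_tower` keyword arguments are assumed to mirror the existing `lock_image_tower` rather than confirmed from the PR:

```python
import open_clip

# Example model/pretrained pair; any registered combination works here.
model, _, preprocess = open_clip.create_model_and_transforms(
    'ViT-B-32', pretrained='laion2b_s34b_b79k'
)

trainable = lambda m: sum(p.numel() for p in m.parameters() if p.requires_grad)
before = trainable(model)

# Freeze the text tower, e.g. for image-side fine-tuning. Assumption: as with
# lock_image_tower, unlocked_layers keeps the last N blocks trainable and
# freeze_layer_norm also freezes LayerNorm parameters.
model.lock_text_tower(unlocked_layers=0, freeze_layer_norm=True)

print(f'trainable params: {before} -> {trainable(model)}')
```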
Full Changelog: https://github.com/mlfoundations/open_clip/compare/v3.0.0...v3.1.0
Published by github-actions[bot] 7 months ago
open-clip-torch - v3.0.0
What's Changed
- Initial work on adding a local-dir: schema for model & tokenizer loading from a local folder by @rwightman in https://github.com/mlfoundations/open_clip/pull/1069 (see the sketch after this list)
- Add --force-context-length argument by @rwightman in https://github.com/mlfoundations/open_clip/pull/1080
- Fix CustomTextCLIP.no_weight_decay by @thelaao in https://github.com/mlfoundations/open_clip/pull/1082
- Fix an issue where CustomText models were not getting proper pos embed interpolation by @rwightman in https://github.com/mlfoundations/open_clip/pull/1085
- Removing redundant checks by @shreyaskamathkm in https://github.com/mlfoundations/open_clip/pull/1096
- PE Core by @rwightman in https://github.com/mlfoundations/open_clip/pull/1097
- Alternative tokenizer support for CLIPS by @rwightman in https://github.com/mlfoundations/open_clip/pull/1081
- Wire up custom attention block via config by @rwightman in https://github.com/mlfoundations/open_clip/pull/1086
- An alternative text masking helper fn by @rwightman in https://github.com/mlfoundations/open_clip/pull/1084
- Update min reqs, test on python 3.10 by @rwightman in https://github.com/mlfoundations/open_clip/pull/1098
- Resize timm image encoders by @rwightman in https://github.com/mlfoundations/open_clip/pull/1099
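For the local-dir: schema, a short sketch of the intended usage. The folder path below is hypothetical, and the behavior is assumed to parallel the existing hf-hub: prefix (a folder holding an open_clip_config.json plus a weights file); the PR describes this as initial work, so details may differ:

```python
import open_clip

# Hypothetical local folder laid out like an HF Hub snapshot:
# open_clip_config.json + weights file. The path is an example only.
model_ref = 'local-dir:/data/checkpoints/my-clip'

# Assuming the local-dir: prefix resolves like hf-hub:, both the model and
# its tokenizer load from the same folder.
model, _, preprocess = open_clip.create_model_and_transforms(model_ref)
tokenizer = open_clip.get_tokenizer(model_ref)
```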
New Contributors
- @thelaao made their first contribution in https://github.com/mlfoundations/open_clip/pull/1082
- @shreyaskamathkm made their first contribution in https://github.com/mlfoundations/open_clip/pull/1096
Full Changelog: https://github.com/mlfoundations/open_clip/compare/v2.32.0...v3.0.0
Published by github-actions[bot] 7 months ago
open-clip-torch - v2.30.0
What's Changed
- Support timm optimizers as an alternative to the AdamW default by @rwightman in https://github.com/mlfoundations/open_clip/pull/979 (see the sketch after this list)
- add missing ViTamin configs by @xywei00 in https://github.com/mlfoundations/open_clip/pull/978
- Experimenting with an alternative SigLIP loss impl for better distributed scaling by @rwightman in https://github.com/mlfoundations/open_clip/pull/971
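As a sketch of the underlying technique, timm's optimizer factory can build an alternative optimizer over a model's parameters. `create_optimizer_v2` is timm's actual factory API, but the wiring below is illustrative rather than open_clip's exact training-script integration:

```python
import torch.nn as nn
import timm.optim

model = nn.Linear(512, 512)  # stand-in for a CLIP model

# 'lamb' is one of many optimizers registered with timm; the lr and
# weight_decay values are placeholders, not recommended settings.
optimizer = timm.optim.create_optimizer_v2(
    model, opt='lamb', lr=1e-3, weight_decay=0.2
)
print(type(optimizer).__name__)
```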
New Contributors
- @xywei00 made their first contribution in https://github.com/mlfoundations/open_clip/pull/978
Full Changelog: https://github.com/mlfoundations/open_clip/compare/v2.29.0...v2.30.0
Published by github-actions[bot] about 1 year ago
open-clip-torch - v2.29.0
What's Changed
- All default pretrained weights pushed to HF hub by @rwightman in https://github.com/mlfoundations/open_clip/pull/970
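With the default weights hosted on the HF Hub, models can be pulled directly via the hf-hub: prefix. A minimal example; the LAION repo id below is one published checkpoint, used here purely for illustration:

```python
import open_clip

repo = 'hf-hub:laion/CLIP-ViT-B-32-laion2B-s34B-b79K'

# Downloads config + weights from the Hub on first use, then loads from cache.
model, preprocess = open_clip.create_model_from_pretrained(repo)
tokenizer = open_clip.get_tokenizer(repo)
```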
Full Changelog: https://github.com/mlfoundations/open_clip/compare/v2.28.0...v2.29.0
Published by github-actions[bot] over 1 year ago
open-clip-torch - Pretrained Weights
This release tag is being used to host weights for various models trained with this codebase.
NOTE: The one included metric, zero-shot top-1 on ImageNet-1k, does not capture the full characteristics of a given set of pretrained weights. Evaluation on a broader set of zero-shot and validation tasks is required for a full comparison.
| model | dataset | weights | IN1k zero-shot top-1 |
| --- | --- | --- | --- |
| RN50 | CC12M | rn50-quickgelu-cc12m | 36.45 |
| RN50 | YFCC15M | rn50-quickgelu-yfcc15m | 32.73 |
| RN101 | YFCC15M | rn101-quickgelu-yfcc15m | 34.86 |
| ViT-B-32 | LAION-400M | vitb32-quickgelu-laion400me31 | 62.96 |
| ViT-B-32 | LAION-400M | vitb32-quickgelu-laion400me32 | 62.94 |
| ViT-B-32 | LAION-2B | vitb32-laion2be16 | 65.62 |
| ViT-B-16 | LAION-400M | vitb16-laion400me31 | 66.98 |
| ViT-B-16 | LAION-400M | vitb16-laion400me32 | 67.07 |
| ViT-B-16-plus-240 | LAION-400M | vitb16-laion400me31 | 69.06 |
| ViT-B-16-plus-240 | LAION-400M | vitb16-laion400me32 | 69.21 |
| ViT-L-14 | LAION-400M | vitb14-laion400me31 | 72.70 |
| ViT-L-14 | LAION-400M | vitb14-laion400m_e32 | 72.77 |
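For context, a minimal sketch of how a zero-shot top-1 number like those above is produced: class-name prompts are embedded as a classifier, and image embeddings are scored against them. The class names and random image tensor below are placeholders; a real evaluation loops over the ImageNet-1k validation set with the model's own preprocess transform:

```python
import torch
import open_clip

# Example model/pretrained pair from the table above.
model, _, preprocess = open_clip.create_model_and_transforms('RN50', pretrained='cc12m')
tokenizer = open_clip.get_tokenizer('RN50')
model.eval()

class_names = ['dog', 'cat', 'car']  # stand-ins for the 1000 ImageNet classes
prompts = tokenizer([f'a photo of a {c}' for c in class_names])

with torch.no_grad():
    text_features = model.encode_text(prompts)
    text_features /= text_features.norm(dim=-1, keepdim=True)

    image = torch.randn(1, 3, 224, 224)  # replace with preprocess(PIL.Image)
    image_features = model.encode_image(image)
    image_features /= image_features.norm(dim=-1, keepdim=True)

    # Top-1 prediction = highest image-text similarity.
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)
print(probs.argmax(dim=-1))
```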
Published by rwightman almost 4 years ago
open-clip-torch - Initial release.
Welcome to the initial release of open_clip, an open source implementation of OpenAI's CLIP (Contrastive Language-Image Pre-training).
The goal of this repository is to enable training models with contrastive image-text supervision, and to investigate their properties such as robustness to distribution shift. Our starting point is an implementation of CLIP that matches the accuracy of the original CLIP models when trained on the same dataset.
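For readers new to the objective, a minimal sketch of the symmetric contrastive (InfoNCE) loss that CLIP-style training uses: matching image/text pairs in a batch are pulled together while all other pairings serve as negatives. This is an illustration of the technique, not open_clip's exact ClipLoss implementation:

```python
import torch
import torch.nn.functional as F

def clip_contrastive_loss(image_features, text_features, logit_scale):
    """Symmetric InfoNCE over the in-batch similarity matrix (a sketch)."""
    image_features = F.normalize(image_features, dim=-1)
    text_features = F.normalize(text_features, dim=-1)
    logits = logit_scale * image_features @ text_features.T  # (B, B)
    # Matching pairs sit on the diagonal; everything off-diagonal is a negative.
    labels = torch.arange(logits.shape[0], device=logits.device)
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2

# Toy usage with random features in place of encoder outputs.
loss = clip_contrastive_loss(torch.randn(8, 512), torch.randn(8, 512), logit_scale=100.0)
print(loss.item())
```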
Published by mitchellnw over 4 years ago