Recent Releases of pylocron
pylocron - v0.2.1: Revamped project architecture and bug fixes
This patch release improves project quality while fixing several bugs.
Note: holocron 0.2.1 requires PyTorch 1.9.1 and torchvision 0.10.1 or higher.
Highlights
:zap: API improvements
When performing inference, speed is key. For this reason, the Gradio demo and FastAPI boilerplate were updated to switch the backend from PyTorch to ONNX. What does this change?
Much lower latency and much lighter dependencies: the Docker image for the API is now significantly smaller. Additionally, Poetry now handles the dependencies of the API template. For backend services, dependency changes can be critical, and Poetry is a great tool to manage them. It also comes with a nice Dependabot integration :robot:
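For reference, here is a minimal sketch of what ONNX-based inference looks like with onnxruntime; the model path and input shape below are placeholders, not the actual demo assets:

```python
import numpy as np
import onnxruntime as ort

# Hypothetical exported classifier: "classifier.onnx" is a placeholder path
session = ort.InferenceSession("classifier.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

# Dummy 224x224 RGB batch in NCHW layout (shape assumed for illustration)
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
logits = session.run(None, {input_name: dummy})[0]
print(logits.argmax(axis=1))
```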
:nail_care: Cleaning project hierarchy
With recent PEP conventions, Python projects can now have their whole package definition in pyproject.toml using setuptools. By moving most configuration files there, the project is now much leaner.
:sparkles: PolyLoss
A new SOTA candidate for the default loss in model training was recently published, and this release ships a clean implementation of it! Start a new training run to try it out :running_man:
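A minimal usage sketch, assuming PolyLoss is exposed under holocron.nn with a CrossEntropyLoss-like interface:

```python
import torch
from holocron.nn import PolyLoss  # assumed import path

# Dummy logits for 4 samples over 10 classes, with integer targets
logits = torch.rand(4, 10, requires_grad=True)
targets = torch.randint(0, 10, (4,))

criterion = PolyLoss()  # interface assumed to mirror torch.nn.CrossEntropyLoss
loss = criterion(logits, targets)
loss.backward()
```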
Full changelog
New Features 🚀
- feat: Switched demo backend to ONNX by @frgfm in https://github.com/frgfm/Holocron/pull/208
- feat: Added implementation of PolyLoss by @frgfm in https://github.com/frgfm/Holocron/pull/209
Bug Fixes 🐛
- chore: Fixed jinja2 dependency by @frgfm in https://github.com/frgfm/Holocron/pull/210
- docs: Fixed codacy badge by @frgfm in https://github.com/frgfm/Holocron/pull/216
- chore: Cleaned project dependencies by @frgfm in https://github.com/frgfm/Holocron/pull/219
- docs: Fixed author entry in pyproject by @frgfm in https://github.com/frgfm/Holocron/pull/220
- chore: Improved version specifiers and fixed conda recipe by @frgfm in https://github.com/frgfm/Holocron/pull/221
- docs: Fixed documentation build by @frgfm in https://github.com/frgfm/Holocron/pull/226
- fix: Fixed Mixup for binary & one-hot targets by @frgfm in https://github.com/frgfm/Holocron/pull/225
- fix: Fixed BinaryClassificationTrainer by @frgfm in https://github.com/frgfm/Holocron/pull/227
- fix: Fixed binary classification evaluation by @frgfm in https://github.com/frgfm/Holocron/pull/228
Improvements
- refactor: Updated gradio version in demo by @frgfm in https://github.com/frgfm/Holocron/pull/212
- refactor: Updated build config and documentation theme by @frgfm in https://github.com/frgfm/Holocron/pull/213
- docs: Updated documentation by @frgfm in https://github.com/frgfm/Holocron/pull/214
- [ImgBot] Optimize images by @imgbot in https://github.com/frgfm/Holocron/pull/215
- ci: Updated header verifications by @frgfm in https://github.com/frgfm/Holocron/pull/217
- style: Fixed mypy config and updated CI by @frgfm in https://github.com/frgfm/Holocron/pull/218
- feat: Added poetry for API package management by @frgfm in https://github.com/frgfm/Holocron/pull/222
- docs: Fixed README badge and updated documentation by @frgfm in https://github.com/frgfm/Holocron/pull/223
- feat: Switched API backend to ONNX by @frgfm in https://github.com/frgfm/Holocron/pull/224
Miscellaneous
- chore: Applied post release modifications by @frgfm in https://github.com/frgfm/Holocron/pull/207
New Contributors
- @imgbot made their first contribution in https://github.com/frgfm/Holocron/pull/215
Full Changelog: https://github.com/frgfm/Holocron/compare/v0.2.0...v0.2.1
Published by frgfm over 3 years ago
pylocron - v0.2.0: Improved performance, API boilerplate and demo app
This release greatly improves classification performance and adds numerous tools to deploy or showcase your models.
Note: holocron 0.2.0 requires PyTorch 1.9.1 and torchvision 0.10.1 or newer.
Highlights
:zebra: New entries in the model zoo
RepVGG joins the model zoo and provides an interesting change of pace: it uses two forward-equivalent architectures, one for training and the other for inference.
This strikes a very good balance between inference speed and performance for VGG-like models, outclassing several ResNet architectures (cf. https://github.com/frgfm/Holocron/tree/master/references/classification).
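A quick sketch of loading one of the pretrained variants, assuming the factory functions are exposed under holocron.models with the usual pretrained flag:

```python
import torch
from holocron.models import repvgg_a0  # assumed factory location

# Load the pretrained checkpoint and switch to inference mode
model = repvgg_a0(pretrained=True).eval()

with torch.no_grad():
    out = model(torch.rand(1, 3, 224, 224))
print(out.shape)
```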
:bookmark_tabs: Tutorial notebooks
To bridge the gap between users and domain experts, a few tutorials were added to the documentation in the form of notebooks.

Thanks to Google Colab, you can run all the commands on a GPU without owning one :+1:
:computer: API boilerplate
Ever dreamt of deploying a small REST API to expose your vision models? Using the great FastAPI library, a minimal API template was implemented for you to easily deploy models in containerized environments.
Once your API is running, the following snippet:
```python
import requests

with open('/path/to/your/img.jpeg', 'rb') as f:
    data = f.read()
response = requests.post("http://localhost:8002/classification", files={'file': data}).json()
```
yields:
```
{'value': 'French horn', 'confidence': 0.9186984300613403}
```
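For a rough idea of what such a boilerplate boils down to, here is a sketch of a classification endpoint (not the actual template; the route handler body and response values are placeholders):

```python
from fastapi import FastAPI, File, UploadFile

app = FastAPI()

@app.post("/classification")
async def classify(file: UploadFile = File(...)):
    data = await file.read()
    # Placeholder: decode `data` into an image and run the classification model here
    return {"value": "some_class", "confidence": 0.9}
```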
For more information, please refer to the dedicated README.
:video_game: Gradio demo
To better showcase the capabilities of the pre-trained models, a small demo app was added to the project (with a live version hosted on HuggingFace Spaces).

It was built for basic image classification using Gradio.
:hugs: Integration with HuggingFace model hub
In order to have a more open way to contribute and share models, default configuration dicts are now accessible in every model. Thanks to this and the HuggingFace Hub, checkpoints can be hosted freely (cf. https://huggingface.co/frgfm/repvgg_a0), and you can instantiate models directly from the Hub.
```python
from holocron.models.utils import model_from_hf_hub

model = model_from_hf_hub("frgfm/repvgg_a0").eval()
```
This opens the way for external contributors to upload their own checkpoint & config, and use Holocron seamlessly.
:zap: Cutting-edge training scripts
This release comes with major upgrades for the reference scripts, in two aspects:
- speed: added support for Automatic Mixed Precision (AMP)
- performance: updated the default augmentations, and added new optimizers (AdamP, AdaBelief) and regularization methods (Mixup)
Those should help you to reach better results with your own experiments.
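For reference, here is a minimal, generic PyTorch AMP training step (a sketch only, not the actual reference script; it uses a toy model and assumes a CUDA device is available):

```python
import torch
from torch.cuda.amp import GradScaler, autocast

# Toy model, optimizer and batch for illustration only (requires a CUDA device)
model = torch.nn.Linear(10, 2).cuda()
optimizer = torch.optim.AdamW(model.parameters())
scaler = GradScaler()

x = torch.rand(8, 10, device="cuda")
y = torch.randint(0, 2, (8,), device="cuda")

optimizer.zero_grad()
with autocast():  # forward pass runs in mixed precision
    loss = torch.nn.functional.cross_entropy(model(x), y)
scaler.scale(loss).backward()  # scale the loss to avoid fp16 gradient underflow
scaler.step(optimizer)
scaler.update()
```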
Breaking changes
License update
To better reflect the project's spirit of welcoming contributions from everywhere, the license was changed from MIT to Apache 2.0. This shouldn't impact your usage much, as it is one of the most commonly used open source licenses.
Deprecated features now supported by PyTorch
Since Holocron is meant as an add-on to PyTorch/torchvision, a few features have been deprecated now that they are integrated into PyTorch. Those include:
- activations: SiLU, Mish
- optimizer: RAdam
Naming of trainer's method
The trainer's method to determine the optimal learning rate had its name changed from lr_find to find_lr.
0.1.3 | 0.2.0
-- | --
`>>> trainer = ...`<br>`>>> trainer.lr_find()` | `>>> trainer = ...`<br>`>>> trainer.find_lr()`
Full changelog
Breaking Changes 🛠
- chore: Updated license from MIT to Apache2.0 by @frgfm in https://github.com/frgfm/Holocron/pull/130
- refactor: Removed implementations of nn that are now integrated in PyTorch by @frgfm in https://github.com/frgfm/Holocron/pull/157
- refactor: Removed implementations of nn that are now integrated into PyTorch by @frgfm in https://github.com/frgfm/Holocron/pull/158
- refactor: Removed functional legacy elements by @frgfm in https://github.com/frgfm/Holocron/pull/159
- refactor: Cleaned ref script args by @frgfm in https://github.com/frgfm/Holocron/pull/199
New Features 🚀
- feat: Added pretrained URL for SKNet-50 by @frgfm in https://github.com/frgfm/Holocron/pull/102
- feat: Added support of Triplet Attention by @frgfm in https://github.com/frgfm/Holocron/pull/104
- feat: Added support of RepVGG by @frgfm in https://github.com/frgfm/Holocron/pull/115
- feat: Added pretrained versions of RepVGG models by @frgfm in https://github.com/frgfm/Holocron/pull/116
- Adaptative classification trainer by @MateoLostanlen in https://github.com/frgfm/Holocron/pull/108
- feat: Added trainer for binary classification by @MateoLostanlen in https://github.com/frgfm/Holocron/pull/118
- feat: Added implementation of AdamP by @frgfm in https://github.com/frgfm/Holocron/pull/121
- feat: Added CIFAR to classification training option by @frgfm in https://github.com/frgfm/Holocron/pull/122
- feat: Added StackUpsample2d by @frgfm in https://github.com/frgfm/Holocron/pull/132
- docs: Switched to multi-version documentation by @frgfm in https://github.com/frgfm/Holocron/pull/134
- feat: Pretrained params for unet_rexnet13 by @frgfm in https://github.com/frgfm/Holocron/pull/139
- docs: Created code of conduct by @frgfm in https://github.com/frgfm/Holocron/pull/142
- feat: Added implementation of Involution layers by @frgfm in https://github.com/frgfm/Holocron/pull/144
- feat: Added support of AMP to trainers and training scripts by @frgfm in https://github.com/frgfm/Holocron/pull/153
- feat: Added custom weight decay for normalization layers by @frgfm in https://github.com/frgfm/Holocron/pull/162
- docs: Added latency benchmark in the README by @frgfm in https://github.com/frgfm/Holocron/pull/167
- feat: Added Dice Loss implementation by @frgfm in https://github.com/frgfm/Holocron/pull/191
- feat: Added default_cfg to all classification models by @frgfm in https://github.com/frgfm/Holocron/pull/193
- feat: Added FastAPI boilerplate for image classification by @frgfm in https://github.com/frgfm/Holocron/pull/195
- feat: Added Gradio demo app by @frgfm in https://github.com/frgfm/Holocron/pull/194
- feat: Added possibility to load model from HF Hub by @frgfm in https://github.com/frgfm/Holocron/pull/198
- docs: Added tutorial notebooks by @frgfm in https://github.com/frgfm/Holocron/pull/201
Bug Fixes 🐛
- fix: Fixed compatibility with pytorch 1.7.0 by @frgfm in https://github.com/frgfm/Holocron/pull/103
- chore: Fixed doc deploy by @frgfm in https://github.com/frgfm/Holocron/pull/105
- fix: Fixed SKNet model definitions by @frgfm in https://github.com/frgfm/Holocron/pull/106
- fix: Fixed CIoU aspect ratio term by @frgfm in https://github.com/frgfm/Holocron/pull/114
- docs: Fixed README typo by @MateoLostanlen in https://github.com/frgfm/Holocron/pull/117
- fix: Fixed UNet architecture and improved trainer by @frgfm in https://github.com/frgfm/Holocron/pull/127
- fix: Fixed console print from resume training by @frgfm in https://github.com/frgfm/Holocron/pull/129
- docs: Fixed typo in README by @frgfm in https://github.com/frgfm/Holocron/pull/133
- docs: Fixed multi-version references by @frgfm in https://github.com/frgfm/Holocron/pull/135
- fix: Fixed loss weight buffer by @frgfm in https://github.com/frgfm/Holocron/pull/136
- fix: Updated import of load_state_dict_from_url by @frgfm in https://github.com/frgfm/Holocron/pull/148
- chore: Cleaned package index mixup by @frgfm in https://github.com/frgfm/Holocron/pull/150
- fix: Fixed LR Finder plot scaling by @frgfm in https://github.com/frgfm/Holocron/pull/147
- docs: Fixed documentation build by @frgfm in https://github.com/frgfm/Holocron/pull/149
- fix: Fixed DropBlock2d drop_prob by @frgfm in https://github.com/frgfm/Holocron/pull/156
- fix: Fixed error message of optimizers by @frgfm in https://github.com/frgfm/Holocron/pull/161
- fix: Fixed LR Find when loss explodes by @frgfm in https://github.com/frgfm/Holocron/pull/169
- fix: Fixed classification training script for CIFAR by @frgfm in https://github.com/frgfm/Holocron/pull/171
- fix: Fixed param freezing by @frgfm in https://github.com/frgfm/Holocron/pull/175
- fix: Fixes MCLoss and RandomCrop in the segmentation training script by @frgfm in https://github.com/frgfm/Holocron/pull/177
- docs: Fixed latency section of the README by @frgfm in https://github.com/frgfm/Holocron/pull/178
- fix: Fixed LR find plotting by @frgfm in https://github.com/frgfm/Holocron/pull/180
- fix: Fixed multiple detection training & model issues by @frgfm in https://github.com/frgfm/Holocron/pull/182
- ci: Fixed script for PR label by @frgfm in https://github.com/frgfm/Holocron/pull/186
- ci: Fixed CI job for PR labels by @frgfm in https://github.com/frgfm/Holocron/pull/187
- ci: Added new CI job by @frgfm in https://github.com/frgfm/Holocron/pull/188
- ci: Fixed message & improved trigger by @frgfm in https://github.com/frgfm/Holocron/pull/190
Improvements
- chore: Updated package version and build jobs by @frgfm in https://github.com/frgfm/Holocron/pull/101
- feat: Updated training script by @frgfm in https://github.com/frgfm/Holocron/pull/89
- test: Refactored unittest for ClassificationTrainer by @frgfm in https://github.com/frgfm/Holocron/pull/119
- docs: Added issue templates by @frgfm in https://github.com/frgfm/Holocron/pull/120
- feat: Updated UNet and improved training scripts by @frgfm in https://github.com/frgfm/Holocron/pull/124
- test: Switched to pytest suite by @frgfm in https://github.com/frgfm/Holocron/pull/131
- feat: Improved Seg IoU computation and segmentation metrics by @frgfm in https://github.com/frgfm/Holocron/pull/137
- feat: Improved UNet architectures by @frgfm in https://github.com/frgfm/Holocron/pull/138
- style: Fixed typing of TridentNet by @frgfm in https://github.com/frgfm/Holocron/pull/141
- docs: Removed legacy entries and fixes models' documentation by @frgfm in https://github.com/frgfm/Holocron/pull/145
- style: Reordered imports and added isort check by @frgfm in https://github.com/frgfm/Holocron/pull/151
- refactor: Removes unused imports and updated README badge by @frgfm in https://github.com/frgfm/Holocron/pull/152
- refactor: Removed unused imports by @frgfm in https://github.com/frgfm/Holocron/pull/154
- feat: Improved Mixup design and added it to classification recipe by @frgfm in https://github.com/frgfm/Holocron/pull/155
- test: Increased coverage of holocron.optim by @frgfm in https://github.com/frgfm/Holocron/pull/160
- feat: Improved training scripts and added updated pretrained weights by @frgfm in https://github.com/frgfm/Holocron/pull/163
- docs: Improved documentation landing page by @frgfm in https://github.com/frgfm/Holocron/pull/165
- docs: Updated contribution guidelines and added utils by @frgfm in https://github.com/frgfm/Holocron/pull/166
- refactor: Removed unused imports, variables and wrappers by @frgfm in https://github.com/frgfm/Holocron/pull/168
- feat: Make bias addition automatic in conv_sequence by @frgfm in https://github.com/frgfm/Holocron/pull/170
- feat: Updates the backbone & docstring of YOLOv4 by @frgfm in https://github.com/frgfm/Holocron/pull/172
- style: Updated flake8 config by @frgfm in https://github.com/frgfm/Holocron/pull/174
- refactor: Refactored holocron.trainer by @frgfm in https://github.com/frgfm/Holocron/pull/173
- refactor: Updated arg of MCLoss by @frgfm in https://github.com/frgfm/Holocron/pull/176
- ci: Updated isort config and related CI job by @frgfm in https://github.com/frgfm/Holocron/pull/179
- feat: Added finite loss safeguard in trainer by @frgfm in https://github.com/frgfm/Holocron/pull/181
- refactor: Removed contiguous params since torch>=1.7.0 includes it by @frgfm in https://github.com/frgfm/Holocron/pull/183
- refactor: Updated timing function for latency eval by @frgfm in https://github.com/frgfm/Holocron/pull/184
- ci: Revamped CI and quality checks for upcoming release by @frgfm in https://github.com/frgfm/Holocron/pull/185
- ci: Updated message of PR label by @frgfm in https://github.com/frgfm/Holocron/pull/189
- ci: Moved header & deps checks to separate jobs by @frgfm in https://github.com/frgfm/Holocron/pull/192
- docs: Updates the README and documentation by @frgfm in https://github.com/frgfm/Holocron/pull/196
- docs: Added CITATION file by @frgfm in https://github.com/frgfm/Holocron/pull/197
- docs: Added example snippet & Colab ref in README by @frgfm in https://github.com/frgfm/Holocron/pull/202
Miscellaneous
- refactor: Updated arg parsing by @frgfm in https://github.com/frgfm/Holocron/pull/200
New Contributors
- @MateoLostanlen made their first contribution in https://github.com/frgfm/Holocron/pull/117
Full Changelog: https://github.com/frgfm/Holocron/compare/v0.1.3...v0.2.0
Published by frgfm about 4 years ago
pylocron - Task-specific model trainers and new losses & layers
This minor release introduces new losses, layers and trainer objects, on top of heavy refactoring. Annotation typing was added to the codebase to improve CI checks.
Note: holocron 0.1.3 requires PyTorch 1.5.1 and torchvision 0.6.1 or newer.
Highlights
models
Implementations of deep learning models
New
- Added implementations of Res2Net (#63, #91), TridentNet (#64, #82), ResNet-50D (#65), PyConvResNet & PyConvHGResNet (#66), CSPDarknet53 (#77, #87), SKNet (#96)
- Added implementation of YOLOv4 (#78)
- Added pretrained URLs for Darknets (#71), CSPDarknet53 (#87), ResNet50D (#87), TridentNet50 (#87), Res2Net (#92)
- Updated pretrained URLs of ReXNet (#87)
- Updated conv_sequence (#94)
Improvements
- Improved pooling efficiency (#65)
- Refactored model implementations (#67, #78, #99)
Fixes
- Fixed pretrained URLs of ResNet, ReXNet (#61)
- Fixed implementations of Darknet & YOLO (#69, #70, #72, #74, #75, #83)
nn
Neural networks building blocks
New
- Added implementations of HardMish (#62), PyConv2d (#66), FReLU (#73), ClassBalancedWrapper (#76), BlurPool2d (#80), ComplementCrossEntropy (#90), SAM & SPP (#94), LambdaLayer (#95), MutualChannelLoss (#100)
optim
Optimizer and learning rate schedulers
New
- Added implementation of AdaBelief (#93)
Improvements
- Refactored existing optimizers (#93)
ops
High-performance batch operations
New
- Added implementation of Generalized IoU loss and Complete IoU loss (#78, #88)
trainer
Utility objects for easier training on different tasks
New
- Added Trainer, ClassificationTrainer (#81), SegmentationTrainer and DetectionTrainer (#83); see the sketch below
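A rough usage sketch of the classification trainer; the constructor argument order and the fit method are assumptions for illustration, while `lr_find` is the learning-rate finder mentioned above (renamed to `find_lr` in v0.2.0):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from holocron.trainer import ClassificationTrainer

# Toy dataset and model so the sketch is self-contained (not a meaningful training setup)
dataset = TensorDataset(torch.rand(64, 3, 32, 32), torch.randint(0, 10, (64,)))
train_loader = DataLoader(dataset, batch_size=8)
val_loader = DataLoader(dataset, batch_size=8)

model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters())

# Constructor signature and fit_n_epochs are assumptions, not the documented API
trainer = ClassificationTrainer(model, train_loader, val_loader, criterion, optimizer)
trainer.lr_find()              # learning-rate finder (renamed to find_lr in v0.2.0)
trainer.fit_n_epochs(1, 1e-3)  # assumed method for a short training run
```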
References
Verifications of the package well-being before release
Improvements
- Refactored training scripts (#68, #72, #81, #83, #84)
- Added contiguous param trick (#79)
Fixes
- Fixed detection script (#84, #85), segmentation script (#86)
Others
Improvements
- Optimized cache (#98, #99)
- Added annotation typing to package (#99)
Published by frgfm over 5 years ago
pylocron - Object detection, segmentation and new layers
This minor release introduces new model tasks and training scripts. In the release attachments, you will find remapped ReXNet ImageNet pretrained weights from https://github.com/clovaai/rexnet, and ImageNette pretrained weights from the repo owner.
Note: holocron 0.1.2 requires PyTorch 1.5.1 and torchvision 0.6.1 or newer.
Highlights
models
Implementations of deep learning models
New
- Added implementations of UNet (#43), UNet++ (#46), and UNet3+ (#47)
- Added implementation of ResNet (#55), ReXNet (#56, #58, #59, #60)
Improvements
- Updated Darknet pretrained models (#32)
- Improved Darknet flexibility (#45)
Fixes
- Fixed YOLO inference and loss (#38)
nn
Neural networks building blocks
New
- Added implementations for Add2d (#35), NormConv (#34), SlimConv (#36, #49)
- Added Dropblock implementation (#53)
- Added implementations of SiLU/Swish (#54, #57)
Improvements
- Improved efficiency of ConcatDownsample2d (#48)
optim
Optimizer and learning rate schedulers
New
- Added implementation of TAdam (#52)
Improvements
- Added support for rendering in notebooks (#39)
- Fixed inplace add operator usage in optimizers (#40, #42)
Documentation
Online resources for potential users
Improvements
- Improved docstrings for better understanding (#37, …)
References
Verifications of the package well-being before release
New
- Added training script for object detection (#41)
- Added training script for semantic segmentation (#50)
Others
Improvements
- Cleaned codebase (#44, #51)
Fixes
- Fixed conda upload job (#33)
Published by frgfm over 5 years ago
pylocron - Pretrained models for image classification
This minor release updates some model pretrained weights and documentation.
Note: holocron 0.1.1 requires PyTorch 1.2 and torchvision 0.4 or newer.
Highlights
models
Implementations of deep learning models
Improvements
- Added pretrained weights for Darknet-24, Darknet-19 and Darknet-53 (#29, #30)
Documentation
Online resources for potential users
Improvements
- Updated docstring references (#31)
- Added installation instructions (#31)
- Cleaned documentation hierarchy (#31)
- Added website referencing (#31)
References
Verifications of the package well-being before release
Improvements
- Updated result reported in README (#30)
Published by frgfm almost 6 years ago
pylocron - Pretrained models for image classification
This release adds implementations of both image classification and object detection models.
Note: holocron 0.1.0 requires PyTorch 1.2 and torchvision 0.4 or newer.
Highlights
models
Implementations of deep learning models
New
- Add implementations of Darknet-24, Darknet-19 and Darknet-53 (#20, #22, #23, #24)
- Add implementations of YOLOv1 and YOLOv2 (#22, #23)
nn
Neural networks building blocks
New
- Add weight initialization function (#24)
- Add mish & nl_relu activations
- Add implementations of focal loss, multi label cross-entropy loss and label smoothing cross-entropy loss (#16, #17, #25)
- Add mixup loss wrapper (#27)
ops
High-performance batch operations
New
- Add implementations of distance IoU and complete IoU losses (#12)
optim
Optimizer and learning rate schedulers
New
- Add implementations for LARS, Lamb, RAdam, and Lookahead (#6)
- Add an implementation of OneCycle scheduler
Documentation
Online resources for potential users
New
- Add sphinx automatic documentation build for existing features (#7, #8, #13, #21)
- Add contribution guidelines (#1)
- Add installation & usage instructions in readme (#1, #2)
References
Verifications of the package well-being before release
New
- Add a training script for Imagenette (#28)
Others
Other tools and implementations
- Add `lr_finder` to estimate the optimal starting learning rate (#26)
- Add `mixup_collate` to use Mixup on an existing DataLoader (#27); see the sketch below
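A minimal sketch of plugging Mixup in at the loader level; the import path is an assumption for illustration:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from holocron.utils.data import mixup_collate  # assumed import path

# Dummy classification dataset: 32 random "images" with integer labels
dataset = TensorDataset(torch.rand(32, 3, 32, 32), torch.randint(0, 10, (32,)))
# Batches are mixed at collation time instead of inside the training loop
loader = DataLoader(dataset, batch_size=8, shuffle=True, collate_fn=mixup_collate)
```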
Published by frgfm almost 6 years ago