https://github.com/aidinhamedi/custom-onecyclelr-pytorch
A custom implementation of the OneCycle learning rate scheduler for PyTorch.
Science Score: 26.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ○ DOI references
- ○ Academic publication links
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (10.3%) to scientific vocabulary
Keywords
Repository
A custom implementation of the OneCycle learning rate scheduler for PyTorch.
Basic Info
Statistics
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
OneCycle Learning Rate Scheduler
A custom implementation of the OneCycle learning rate scheduler for PyTorch.
Features
- Customized version of the OneCycleLR algorithm with four distinct phases: warmup, idling, annealing, and decay (a rough sketch of how these compose follows this list).
- Flexibility in defining various hyperparameters such as:
- Warmup iterations and type (linear or exponential)
- Idling period duration
- Annealing phase duration and minimum learning rate
- Decay phase duration and minimum learning rate
- Compatibility with any PyTorch optimizer
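The sketch below gives a rough picture of how the four phases likely fit together. It is purely illustrative: the helper `lr_at`, its exact formulas, and the choice of a linear warmup are assumptions for explanation, not the package's actual implementation.

```python
import math

# Illustrative only: `lr_at` and its formulas are hypothetical, not the library's code.
def lr_at(step, *, warmup_iters, idling_iters, annealing_iters, decay_iters,
          max_lr, warmup_start_lr, annealing_lr_min, decay_lr_min):
    if step < warmup_iters:                 # 1) warmup: ramp from warmup_start_lr to max_lr
        t = step / max(warmup_iters, 1)
        return warmup_start_lr + t * (max_lr - warmup_start_lr)  # linear variant
    step -= warmup_iters
    if step < idling_iters:                 # 2) idling: hold at max_lr
        return max_lr
    step -= idling_iters
    if step < annealing_iters:              # 3) cosine annealing down to annealing_lr_min
        t = step / annealing_iters
        return annealing_lr_min + 0.5 * (max_lr - annealing_lr_min) * (1 + math.cos(math.pi * t))
    step -= annealing_iters
    t = min(step / decay_iters, 1.0)        # 4) linear decay down to decay_lr_min
    return annealing_lr_min + t * (decay_lr_min - annealing_lr_min)
```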
Installation
```bash
pip install custom-onecyclelr
```
Usage
Here's an example of how to integrate the scheduler into your training loop:
```python
import torch
from custom_onecyclelr import scheduler

# Initialize model and optimizer
model = YourModel()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Create the OneCycleLR scheduler with desired parameters
scheduler_instance = scheduler.OneCycleLr(
    optimizer,
    warmup_iters=6,          # Number of iterations for the warmup phase
    lr_idling_iters=8,       # Number of iterations where learning rate remains at max
    annealing_iters=56,      # Cosine annealing phase duration
    decay_iters=100,         # Linear decay phase duration
    max_lr=0.01,
    annealing_lr_min=0.001,
    decay_lr_min=0.0001,
    warmup_start_lr=0.0001,
    warmup_type="exp",       # "linear" or "exp"
)

# Training loop
for epoch in range(total_epochs):
    for inputs, targets in dataloader:
        optimizer.zero_grad()
        outputs = model(inputs)
        loss = criterion(outputs, targets)
        loss.backward()
        optimizer.step()
        scheduler_instance.step()
```
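With the settings shown, the four phases together cover 6 + 8 + 56 + 100 = 170 scheduler steps. Since `scheduler_instance.step()` is called once per batch in the inner loop, the `*_iters` values count optimizer iterations rather than epochs.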
Visualization

You can visualize how the learning rate changes over iterations by running:
```bash
python examples/vis.py
```
This will generate a plot showing the different phases of the learning rate schedule.
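If you just want a quick look without the bundled script, something along the following lines should work. The constructor call mirrors the usage example above, and reading the current learning rate from `optimizer.param_groups` is a standard PyTorch idiom; treat the details as a sketch rather than the package's documented API.

```python
import torch
import matplotlib.pyplot as plt
from custom_onecyclelr import scheduler

# Dummy one-parameter optimizer just to drive the scheduler
optimizer = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=1e-3)
sched = scheduler.OneCycleLr(
    optimizer,
    warmup_iters=6, lr_idling_iters=8, annealing_iters=56, decay_iters=100,
    max_lr=0.01, annealing_lr_min=0.001, decay_lr_min=0.0001,
    warmup_start_lr=0.0001, warmup_type="exp",
)

lrs = []
for _ in range(170):  # 6 + 8 + 56 + 100 iterations
    lrs.append(optimizer.param_groups[0]["lr"])  # record the LR the scheduler has set
    sched.step()

plt.plot(lrs)
plt.xlabel("iteration")
plt.ylabel("learning rate")
plt.title("Custom OneCycleLR schedule")
plt.show()
```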
License
This project is licensed under the MIT License - see LICENSE for details.
Owner
- Name: Aidin
- Login: AidinHamedi
- Kind: user
- Repositories: 1
- Profile: https://github.com/AidinHamedi
GitHub Events
Total
- Watch event: 1
- Public event: 1
- Push event: 5
Last Year
- Watch event: 1
- Public event: 1
- Push event: 5
Issues and Pull Requests
Last synced: 8 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 20 last-month (pypi)
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 3
- Total maintainers: 1
pypi.org: custom-onecyclelr
A Custom PyTorch implementation of the OneCycleLR learning rate scheduler. (With some modifications)
- Homepage: https://github.com/AidinHamedi/Custom-OneCycleLr-Pytorch
- Documentation: https://custom-onecyclelr.readthedocs.io/
- License: MIT License Copyright (c) 2025 Aidin Hamedi Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
- Latest release: 0.1.4 (published 8 months ago)
Rankings
Maintainers (1)
Dependencies
- colorama 0.4.6
- contourpy 1.3.1
- cycler 0.12.1
- exceptiongroup 1.2.2
- filelock 3.18.0
- fonttools 4.56.0
- fsspec 2025.3.0
- iniconfig 2.0.0
- isort 6.0.1
- jinja2 3.1.6
- kiwisolver 1.4.8
- markupsafe 3.0.2
- matplotlib 3.10.1
- mpmath 1.3.0
- networkx 3.4.2
- numpy 2.2.4
- packaging 24.2
- pillow 11.1.0
- pluggy 1.5.0
- pyparsing 3.2.1
- pytest 8.3.5
- python-dateutil 2.9.0.post0
- ruff 0.11.0
- setuptools 76.0.0
- six 1.17.0
- sympy 1.13.1
- tomli 2.2.1
- torch 2.6.0+cpu
- typing-extensions 4.12.2