DiffOpt
Differentiating optimization programs w.r.t. program parameters
Science Score: 77.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ✓ DOI references: found 2 DOI reference(s) in README
- ✓ Academic publication links: links to arxiv.org
- ✓ Committers with academic emails: 2 of 19 committers (10.5%) from academic institutions
- ○ Institutional organization owner: not found
- ○ JOSS paper metadata: not found
- ○ Scientific vocabulary similarity: low similarity (12.5%) to scientific vocabulary
Keywords
Keywords from Contributors
Repository
Differentiating optimization programs w.r.t. program parameters
Basic Info
- Host: GitHub
- Owner: jump-dev
- License: MIT
- Language: Julia
- Default Branch: master
- Homepage: https://jump.dev/DiffOpt.jl/
- Size: 13.2 MB
Statistics
- Stars: 131
- Watchers: 11
- Forks: 14
- Open Issues: 22
- Releases: 10
Topics
Metadata Files
README.md
DiffOpt.jl
DiffOpt.jl is a package for differentiating convex optimization programs with respect to the program parameters. DiffOpt currently supports linear, quadratic, and conic programs.
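Abstractly, the task is to treat the optimal solution as an implicit function of the parameters and compute its derivative. A sketch in standard parametric-optimization notation (not the package's internal formulation):

```latex
% Parametric convex program, parameter vector \theta:
%   x^\star(\theta) = \arg\min_{x} f(x, \theta)
%   \text{s.t. } g(x, \theta) \le 0
% DiffOpt computes the sensitivity of the solution map,
%   \frac{\partial x^\star}{\partial \theta},
% in forward mode (directional derivatives of x^\star) or
% reverse mode (gradients of a scalar function of x^\star w.r.t. \theta).
```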
License
DiffOpt.jl is licensed under the
MIT License.
Installation
Install DiffOpt using Pkg.add:
```julia
import Pkg
Pkg.add("DiffOpt")
```
Documentation
The documentation for DiffOpt.jl includes a detailed description of the theory behind the package, along with examples, tutorials, and an API reference.
Use with JuMP
DiffOpt-JuMP API with Parameters
Here is an example with a Parametric Linear Program:
```julia
using JuMP, DiffOpt, HiGHS

model = DiffOpt.quadratic_diff_model(HiGHS.Optimizer)
set_silent(model)

p_val = 4.0
pc_val = 2.0
@variable(model, x)
@variable(model, p in Parameter(p_val))
@variable(model, pc in Parameter(pc_val))
@constraint(model, cons, pc * x >= 3 * p)
@objective(model, Min, 2x)
optimize!(model)
@show value(x) == 3 * p_val / pc_val

# the function is
# x(p, pc) = 3p / pc
# hence,
# dx/dp = 3 / pc
# dx/dpc = -3p / pc^2

# First, try forward mode AD

# differentiate w.r.t. p
direction_p = 3.0
DiffOpt.set_forward_parameter(model, p, direction_p)
DiffOpt.forward_differentiate!(model)
@show DiffOpt.get_forward_variable(model, x) == direction_p * 3 / pc_val

# update p and pc
p_val = 2.0
pc_val = 6.0
set_parameter_value(p, p_val)
set_parameter_value(pc, pc_val)
# re-optimize
optimize!(model)
# check solution
@show value(x) ≈ 3 * p_val / pc_val

# stop differentiating with respect to p
DiffOpt.empty_input_sensitivities!(model)
# differentiate w.r.t. pc
direction_pc = 10.0
DiffOpt.set_forward_parameter(model, pc, direction_pc)
DiffOpt.forward_differentiate!(model)
@show abs(DiffOpt.get_forward_variable(model, x) -
    -direction_pc * 3 * p_val / pc_val^2) < 1e-5

# always a good practice to clear previously set sensitivities
DiffOpt.empty_input_sensitivities!(model)

# Now, reverse mode AD
direction_x = 10.0
DiffOpt.set_reverse_variable(model, x, direction_x)
DiffOpt.reverse_differentiate!(model)
@show DiffOpt.get_reverse_parameter(model, p) == direction_x * 3 / pc_val
@show DiffOpt.get_reverse_parameter(model, pc) == -direction_x * 3 * p_val / pc_val^2
```
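The analytic sensitivities checked above (dx/dp = 3/pc and dx/dpc = -3p/pc^2 for x(p, pc) = 3p/pc) can also be verified independently with a central finite difference in plain Julia; this sketch needs no solver and is not part of the DiffOpt API:

```julia
# Optimal solution of the toy LP above as an explicit function of the parameters.
x_opt(p, pc) = 3p / pc

# Central finite-difference approximation of a partial derivative of f(a, b).
fdiff(f, a, b; h=1e-6, wrt=:a) =
    wrt == :a ? (f(a + h, b) - f(a - h, b)) / 2h :
                (f(a, b + h) - f(a, b - h)) / 2h

p_val, pc_val = 2.0, 6.0
dx_dp  = fdiff(x_opt, p_val, pc_val; wrt=:a)   # ≈ 3 / pc_val
dx_dpc = fdiff(x_opt, p_val, pc_val; wrt=:b)   # ≈ -3p_val / pc_val^2
```

Both values should agree with the expressions DiffOpt returns in the forward-mode checks (up to finite-difference error).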
Available models:
* DiffOpt.quadratic_diff_model: Quadratic Programs (QP) and Linear Programs (LP)
* DiffOpt.conic_diff_model: Conic Programs (CP) and Linear Programs (LP)
* DiffOpt.nonlinear_diff_model: Nonlinear Programs (NLP), Quadratic Programs (QP), and Linear Programs (LP)
* DiffOpt.diff_model: Nonlinear Programs (NLP), Conic Programs (CP), Quadratic Programs (QP), and Linear Programs (LP)
Citing DiffOpt.jl
If you find DiffOpt.jl useful in your work, we kindly request that you cite the
following paper:
```bibtex
@article{besancon2023diffopt,
  title={Flexible Differentiable Optimization via Model Transformations},
  author={Besançon, Mathieu and Dias Garcia, Joaquim and Legat, Beno{\^\i}t and Sharma, Akshay},
  journal={INFORMS Journal on Computing},
  year={2023},
  volume={36},
  number={2},
  pages={456--478},
  doi={10.1287/ijoc.2022.0283},
  publisher={INFORMS}
}
```
A preprint of this paper is freely available.
GSOC2020
DiffOpt began as a NumFOCUS-sponsored Google Summer of Code (2020) project.
Owner
- Name: JuMP-dev
- Login: jump-dev
- Kind: organization
- Website: https://jump.dev/
- Twitter: JuMPjl
- Repositories: 54
- Profile: https://github.com/jump-dev
An organization for the JuMP modeling language and related repositories.
Citation (CITATION.bib)
```bibtex
@article{doi:10.1287/ijoc.2022.0283,
  author = {Besan\c{c}on, Mathieu and Dias Garcia, Joaquim and Legat, Beno\^{\i}t and Sharma, Akshay},
  title = {Flexible Differentiable Optimization via Model Transformations},
  journal = {INFORMS Journal on Computing},
  doi = {10.1287/ijoc.2022.0283},
  URL = {https://doi.org/10.1287/ijoc.2022.0283},
  abstract = { We introduce DiffOpt.jl, a Julia library to differentiate through the solution of optimization problems with respect to arbitrary parameters present in the objective and/or constraints. The library builds upon MathOptInterface, thus leveraging the rich ecosystem of solvers and composing well with modeling languages like JuMP. DiffOpt offers both forward and reverse differentiation modes, enabling multiple use cases from hyperparameter optimization to backpropagation and sensitivity analysis, bridging constrained optimization with end-to-end differentiable programming. DiffOpt is built on two known rules for differentiating quadratic programming and conic programming standard forms. However, thanks to its ability to differentiate through model transformations, the user is not limited to these forms and can differentiate with respect to the parameters of any model that can be reformulated into these standard forms. This notably includes programs mixing affine conic constraints and convex quadratic constraints or objective function. History: Accepted by Ted Ralphs, Area Editor for Software Tools. Funding: The work of A. Sharma on DiffOpt.jl was funded by the Google Summer of Code program through NumFOCUS. M. Besançon was partially supported through the Research Campus Modal funded by the German Federal Ministry of Education and Research [Grant 05M14ZAM, 05M20ZBM]. J. Dias Garcia was supported in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior – Brasil (CAPES) – Finance Code 001. B. Legat was supported by a BAEF Postdoctoral Fellowship, the NSF [Grant OAC-1835443], and the ERC Adv. [Grant 885682]. Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information (https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2022.0283), as well as from the IJOC GitHub software repository (https://github.com/INFORMSJoC/2022.0283). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/. }
}
```
GitHub Events
Total
- Create event: 23
- Commit comment event: 6
- Release event: 2
- Issues event: 29
- Watch event: 7
- Delete event: 19
- Member event: 1
- Issue comment event: 70
- Push event: 157
- Pull request review event: 83
- Pull request review comment event: 114
- Pull request event: 49
Last Year
- Create event: 23
- Commit comment event: 6
- Release event: 2
- Issues event: 29
- Watch event: 7
- Delete event: 19
- Member event: 1
- Issue comment event: 70
- Push event: 157
- Pull request review event: 83
- Pull request review comment event: 114
- Pull request event: 49
Committers
Last synced: 8 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Akshay Sharma | c****t@i****m | 189 |
| Mathieu Besançon | m****n@g****m | 178 |
| joaquim | j****a@p****m | 91 |
| Benoît Legat | b****t@g****m | 89 |
| Jinrae Kim | k****3@g****m | 14 |
| Oscar Dowson | o****w | 9 |
| Vitor Fernandes Egger | v****r@p****m | 8 |
| github-actions[bot] | 4****] | 7 |
| mzagorowska | 7****a | 3 |
| vfegger | 8****r | 3 |
| Andrew Rosemberg | a****g@g****m | 2 |
| Alex Robson | A****n | 1 |
| Guilherme Bodin | 3****n | 1 |
| Lyndon White | o****x@u****u | 1 |
| Michael J. Curry | c****y@c****u | 1 |
| Niklas Schmitz | n****z@g****m | 1 |
| Silvio Traversaro | s****o@t****t | 1 |
| CompatHelper Julia | c****y@j****g | 1 |
| willtebbutt | w****1@m****k | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 55
- Total pull requests: 122
- Average time to close issues: 7 months
- Average time to close pull requests: 15 days
- Total issue authors: 14
- Total pull request authors: 13
- Average comments per issue: 1.91
- Average comments per pull request: 1.94
- Merged pull requests: 95
- Bot issues: 0
- Bot pull requests: 7
Past Year
- Issues: 21
- Pull requests: 47
- Average time to close issues: 29 days
- Average time to close pull requests: 13 days
- Issue authors: 7
- Pull request authors: 4
- Average comments per issue: 0.48
- Average comments per pull request: 1.51
- Merged pull requests: 28
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- joaquimg (20)
- andrewrosemberg (8)
- blegat (7)
- JinraeKim (4)
- matbesancon (3)
- AlexRobson (2)
- klamike (2)
- akshay326 (2)
- dmayfrank (1)
- innuo (1)
- odow (1)
- frapac (1)
- Giovanni3A (1)
- JuliaTagBot (1)
Pull Request Authors
- blegat (36)
- joaquimg (27)
- odow (16)
- matbesancon (13)
- andrewrosemberg (13)
- github-actions[bot] (7)
- mzagorowska (6)
- JinraeKim (2)
- currymj (1)
- AlexRobson (1)
- traversaro (1)
- guilhermebodin (1)
- vfegger (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 3 (julia)
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 10
juliahub.com: DiffOpt
Differentiating optimization programs w.r.t. program parameters
- Homepage: https://jump.dev/DiffOpt.jl/
- Documentation: https://docs.juliahub.com/General/DiffOpt/stable/
- License: MIT
- Latest release: 0.5.0 (published 12 months ago)