https://github.com/jump-dev/diffopt.jl
Differentiating convex optimization programs w.r.t. program parameters
differentiable-programming julia mathematical-modelling optimization
Last synced: 15 days ago
- Host: GitHub
- URL: https://github.com/jump-dev/diffopt.jl
- Owner: jump-dev
- License: mit
- Created: 2020-05-21T20:22:05.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2025-03-24T20:24:50.000Z (about 1 month ago)
- Last Synced: 2025-04-09T23:14:47.548Z (15 days ago)
- Topics: differentiable-programming, julia, mathematical-modelling, optimization
- Language: Julia
- Homepage: https://jump.dev/DiffOpt.jl/stable
- Size: 11.2 MB
- Stars: 127
- Watchers: 11
- Forks: 14
- Open Issues: 20
Metadata Files:
- Readme: README.md
- License: LICENSE
- Citation: CITATION.bib
README
# DiffOpt.jl
[Stable documentation](https://jump.dev/DiffOpt.jl/stable) | [Development documentation](https://jump.dev/DiffOpt.jl/dev) | [Build status](https://github.com/jump-dev/DiffOpt.jl/actions?query=workflow%3ACI) | [Coverage](https://codecov.io/gh/jump-dev/DiffOpt.jl)

[DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl) is a package for
differentiating convex optimization programs with respect to the program
parameters. DiffOpt currently supports linear, quadratic, and conic programs.
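Schematically (this is a summary of the idea, not DiffOpt's exact internal formulation): if the optimal solution of a parametrized program is viewed as a function of its parameters, DiffOpt computes derivatives of that solution map,

```math
x^*(p) = \arg\min_{x} \; f(x; p) \quad \text{s.t.} \quad g(x; p) \le 0 .
```

Forward mode propagates a parameter direction $\Delta p$ to a solution perturbation $\Delta x = \frac{\partial x^*}{\partial p}\,\Delta p$; reverse mode pulls a seed $\bar{x}$ on the solution back to parameter sensitivities $\bar{p} = \big(\frac{\partial x^*}{\partial p}\big)^{\top}\bar{x}$.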
## License

`DiffOpt.jl` is licensed under the
[MIT License](https://github.com/jump-dev/DiffOpt.jl/blob/master/LICENSE.md).

## Installation
Install DiffOpt using `Pkg.add`:
```julia
import Pkg
Pkg.add("DiffOpt")
```

## Documentation
The [documentation for DiffOpt.jl](https://jump.dev/DiffOpt.jl/stable/)
includes a detailed description of the theory behind the package, along with
examples, tutorials, and an API reference.

## Use with JuMP
### DiffOpt-JuMP API with `Parameters`
```julia
using JuMP, DiffOpt, HiGHS

model = Model(
() -> DiffOpt.diff_optimizer(
HiGHS.Optimizer;
with_parametric_opt_interface = true,
),
)
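# `with_parametric_opt_interface = true` exposes JuMP `Parameter`s as the inputs
# of differentiation: below we differentiate the solution w.r.t. p and pc.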
set_silent(model)

p_val = 4.0
pc_val = 2.0
@variable(model, x)
@variable(model, p in Parameter(p_val))
@variable(model, pc in Parameter(pc_val))
@constraint(model, cons, pc * x >= 3 * p)
@objective(model, Min, 2x)
optimize!(model)
@show value(x) == 3 * p_val / pc_val

# the function is
# x(p, pc) = 3p / pc
# hence,
# dx/dp = 3 / pc
# dx/dpc = -3p / pc^2

# First, try forward mode AD
# differentiate w.r.t. p
direction_p = 3.0
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), Parameter(direction_p))
DiffOpt.forward_differentiate!(model)
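# The result is the directional derivative of x along the seeded direction,
# i.e. direction_p * dx/dp: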
@show MOI.get(model, DiffOpt.ForwardVariablePrimal(), x) == direction_p * 3 / pc_val

# update p and pc
p_val = 2.0
pc_val = 6.0
set_parameter_value(p, p_val)
set_parameter_value(pc, pc_val)
# re-optimize
optimize!(model)
# check solution
@show value(x) ≈ 3 * p_val / pc_val

# stop differentiating with respect to p
DiffOpt.empty_input_sensitivities!(model)
# differentiate w.r.t. pc
direction_pc = 10.0
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(pc), Parameter(direction_pc))
DiffOpt.forward_differentiate!(model)
@show abs(MOI.get(model, DiffOpt.ForwardVariablePrimal(), x) -
    -direction_pc * 3 * p_val / pc_val^2) < 1e-5

# it is always good practice to clear previously set sensitivities
DiffOpt.empty_input_sensitivities!(model)
# Now, reverse mode AD
direction_x = 10.0
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, direction_x)
DiffOpt.reverse_differentiate!(model)
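# Each parameter receives the seed times the corresponding partial derivative,
# i.e. direction_x * dx/dp and direction_x * dx/dpc: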
@show MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p)) == MOI.Parameter(direction_x * 3 / pc_val)
@show abs(MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(pc)).value -
-direction_x * 3 * p_val / pc_val^2) < 1e-5
```
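In summary, the workflow is: seed the input sensitivities (`DiffOpt.ForwardConstraintSet` in forward mode, `DiffOpt.ReverseVariablePrimal` in reverse mode), call `DiffOpt.forward_differentiate!` or `DiffOpt.reverse_differentiate!`, query the outputs (`DiffOpt.ForwardVariablePrimal` or `DiffOpt.ReverseConstraintSet`), and clear previously set sensitivities with `DiffOpt.empty_input_sensitivities!` before seeding a new direction.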
### Low-level DiffOpt-JuMP API

A brief example:
```julia
using JuMP, DiffOpt, HiGHS
# Create a model using the wrapper
model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
# Define your model and solve it
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model)
# Choose the problem parameters to differentiate with respect to, and set their
# perturbations.
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
# Differentiate the model
DiffOpt.reverse_differentiate!(model)
# fetch the gradients
grad_exp = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons) # -3 x - 1
constant(grad_exp) # -1
coefficient(grad_exp, x) # -3
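# That is, with the unit seed on x: perturbing the coefficient of x in `cons`
# changes the optimal x at rate -3, and perturbing the constant term of the
# constraint function changes it at rate -1.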
```
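Forward mode is available through the same low-level interface. The snippet below is a sketch that assumes `DiffOpt.ForwardConstraintFunction` accepts a JuMP affine expression describing the perturbation of `cons`; see the documentation for the exact conventions, including signs.

```julia
# Continuing from the model above: perturb the data of `cons`
# (the coefficient of x and the constant term, each by 1.0).
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, 1.0 * x + 1.0)
# Propagate the perturbation through the optimal solution.
DiffOpt.forward_differentiate!(model)
# Directional derivative of the optimal x for this perturbation.
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)
```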
## Citing DiffOpt.jl

If you find DiffOpt.jl useful in your work, we kindly request that you cite the
following [paper](https://pubsonline.informs.org/doi/10.1287/ijoc.2022.0283):
```bibtex
@article{besancon2023diffopt,
title={Flexible Differentiable Optimization via Model Transformations},
author={Besançon, Mathieu and Dias Garcia, Joaquim and Legat, Beno{\^\i}t and Sharma, Akshay},
journal={INFORMS Journal on Computing},
year={2023},
volume={36},
number={2},
pages={456--478},
doi={10.1287/ijoc.2022.0283},
publisher={INFORMS}
}
```
A preprint of this paper is [freely available](https://arxiv.org/abs/2206.06135).

## GSOC2020
DiffOpt began as a [NumFOCUS-sponsored Google Summer of Code (2020) project](https://summerofcode.withgoogle.com/organizations/4727917315096576/?sp-page=2#5232064888045568).