[WIP] POI + DiffOpt = S2 #143

Open. joaquimg wants to merge 13 commits into master from jg/diff.
Commits:

7f44ee2 POI + DiffOpt = S2
1382d57 fix POI (pp), add reverse mode, add tests
e68d603 fix format
b8d5df4 add more tests
5fc6ce2 fix test for correct quadratic constraint interpretation
a806f3a fix test
b303b32 add format
e267943 fix test
6094763 format diff tests
0beddb8 add tests
0900497 remove GLPK
1493fac fix format
9bafba7 concentrate changes
```julia
using DiffOpt

# forward mode

function DiffOpt.forward_differentiate!(model::Optimizer{T}) where {T}
    # TODO: add a reset option
    for (F, S) in keys(model.affine_constraint_cache.dict)
        affine_constraint_cache_inner = model.affine_constraint_cache[F, S]
        if !isempty(affine_constraint_cache_inner)
            # TODO add: barrier to avoid type instability of inner dicts
            for (inner_ci, pf) in affine_constraint_cache_inner
                cte = zero(T)
                terms = MOI.ScalarAffineTerm{T}[]
                sizehint!(terms, 0)
                if length(pf.p) != 0
                    for term in pf.p
                        p = p_idx(term.variable)
                        sensitivity = get(model.parameter_input_forward, p, 0.0)
                        # TODO: check sign
                        cte += sensitivity * term.coefficient
                    end
                    # TODO: if cte != 0
                    MOI.set(
                        model.optimizer,
                        DiffOpt.ForwardConstraintFunction(),
                        inner_ci,
                        MOI.ScalarAffineFunction{T}(terms, cte),
                    )
                end
            end
        end
    end
    for (F, S) in keys(model.vector_affine_constraint_cache.dict)
        vector_affine_constraint_cache_inner =
            model.vector_affine_constraint_cache[F, S]
        if !isempty(vector_affine_constraint_cache_inner)
            # barrier to avoid type instability of inner dicts
            for (inner_ci, pf) in vector_affine_constraint_cache_inner
                cte = zeros(T, length(pf.c))
                terms = MOI.VectorAffineTerm{T}[]
                sizehint!(terms, 0)
                if length(pf.p) != 0
                    for term in pf.p
                        p = p_idx(term.scalar_term.variable)
                        sensitivity = get(model.parameter_input_forward, p, 0.0)
                        # TODO: check sign
                        cte[term.output_index] +=
                            sensitivity * term.scalar_term.coefficient
                    end
                    # TODO: if cte != 0
                    MOI.set(
                        model.optimizer,
                        DiffOpt.ForwardConstraintFunction(),
                        inner_ci,
                        # vector terms and a vector constant require a
                        # VectorAffineFunction, not a ScalarAffineFunction
                        MOI.VectorAffineFunction{T}(terms, cte),
                    )
                end
            end
        end
    end
    for (F, S) in keys(model.quadratic_constraint_cache.dict)
        quadratic_constraint_cache_inner =
            model.quadratic_constraint_cache[F, S]
        if !isempty(quadratic_constraint_cache_inner)
            # TODO add: barrier to avoid type instability of inner dicts
            for (inner_ci, pf) in quadratic_constraint_cache_inner
                cte = zero(T)
                terms = MOI.ScalarAffineTerm{T}[]
                # terms_dict = Dict{MOI.VariableIndex,T}() # canonicalize?
                sizehint!(terms, length(pf.pv))
                if length(pf.p) != 0 || length(pf.pv) != 0 || length(pf.pp) != 0
                    for term in pf.p
                        p = p_idx(term.variable)
                        sensitivity = get(model.parameter_input_forward, p, 0.0)
                        # TODO: check sign
                        cte += sensitivity * term.coefficient
                    end
                    for term in pf.pp
                        p_1 = p_idx(term.variable_1)
                        p_2 = p_idx(term.variable_2)
                        sensitivity_1 = get(model.parameter_input_forward, p_1, 0.0)
                        sensitivity_2 = get(model.parameter_input_forward, p_2, 0.0)
                        # TODO: check the product rule; this adds only the
                        # Δp₁ * Δp₂ cross term, while the first-order derivative
                        # of c * p₁ * p₂ also involves the current parameter values
                        cte += sensitivity_1 * sensitivity_2 * term.coefficient
                    end
                    # canonicalize?
                    for term in pf.pv
                        p = p_idx(term.variable_1)
                        sensitivity = get(model.parameter_input_forward, p, NaN)
                        if !isnan(sensitivity)
                            push!(
                                terms,
                                MOI.ScalarAffineTerm{T}(
                                    sensitivity * term.coefficient,
                                    term.variable_2,
                                ),
                            )
                        end
                    end
                    MOI.set(
                        model.optimizer,
                        DiffOpt.ForwardConstraintFunction(),
                        inner_ci,
                        MOI.ScalarAffineFunction{T}(terms, cte),
                    )
                end
            end
        end
    end
    if model.affine_objective_cache !== nothing
        cte = zero(T)
        terms = MOI.ScalarAffineTerm{T}[]
        pf = model.affine_objective_cache
        sizehint!(terms, 0)
        if length(pf.p) != 0
            for term in pf.p
                p = p_idx(term.variable)
                sensitivity = get(model.parameter_input_forward, p, 0.0)
                # TODO: check sign
                cte += sensitivity * term.coefficient
            end
            # TODO: if cte != 0
            # the objective attribute is a model attribute: no constraint index
            MOI.set(
                model.optimizer,
                DiffOpt.ForwardObjectiveFunction(),
                MOI.ScalarAffineFunction{T}(terms, cte),
            )
        end
    elseif model.quadratic_objective_cache !== nothing
        cte = zero(T)
        terms = MOI.ScalarAffineTerm{T}[]
        pf = model.quadratic_objective_cache
        sizehint!(terms, length(pf.pv))
        if length(pf.p) != 0
            for term in pf.p
                p = p_idx(term.variable)
                sensitivity = get(model.parameter_input_forward, p, 0.0)
                # TODO: check sign
                cte += sensitivity * term.coefficient
            end
            for term in pf.pp
                p_1 = p_idx(term.variable_1)
                p_2 = p_idx(term.variable_2)
                sensitivity_1 = get(model.parameter_input_forward, p_1, 0.0)
                sensitivity_2 = get(model.parameter_input_forward, p_2, 0.0)
                cte += sensitivity_1 * sensitivity_2 * term.coefficient
            end
            # canonicalize?
            for term in pf.pv
                p = p_idx(term.variable_1)
                sensitivity = get(model.parameter_input_forward, p, NaN)
                if !isnan(sensitivity)
                    push!(
                        terms,
                        MOI.ScalarAffineTerm{T}(
                            sensitivity * term.coefficient,
                            term.variable_2,
                        ),
                    )
                end
            end
            # TODO: if cte != 0
            # model attribute: no constraint index
            MOI.set(
                model.optimizer,
                DiffOpt.ForwardObjectiveFunction(),
                MOI.ScalarAffineFunction{T}(terms, cte),
            )
        end
    end
    DiffOpt.forward_differentiate!(model.optimizer)
    return
end

struct ForwardParameter <: MOI.AbstractVariableAttribute end

function MOI.set(
    model::Optimizer,
    ::ForwardParameter,
    variable::MOI.VariableIndex,
    value::Number,
)
    if _is_variable(variable)
        error("Trying to set a parameter sensitivity for a variable")
    end
    parameter = p_idx(variable)
    model.parameter_input_forward[parameter] = value
    return
end

function MOI.get(
    model::Optimizer,
    attr::DiffOpt.ForwardVariablePrimal,
    variable::MOI.VariableIndex,
)
    if _is_parameter(variable)
        error("Trying to get a variable sensitivity for a parameter")
    end
    return MOI.get(model.optimizer, attr, model.variables[variable])
end

# reverse mode

function DiffOpt.reverse_differentiate!(model::Optimizer)
    # WIP: not implemented yet; the call below is unreachable
    error("Not implemented")
    DiffOpt.reverse_differentiate!(model.optimizer)
    return
end

function MOI.set(
    model::Optimizer,
    attr::DiffOpt.ReverseVariablePrimal,
    variable::MOI.VariableIndex,
    value::Number,
)
    MOI.set(model.optimizer, attr, variable, value)
    return
end

struct ReverseParameter <: MOI.AbstractVariableAttribute end

function MOI.get(
    model::Optimizer,
    attr::ReverseParameter,
    variable::MOI.VariableIndex,
)
    # WIP: not implemented yet
    error("Not implemented")
    return
end
```
Why have a different attribute? We could just use `ForwardVariablePrimal` for parameters as well.
We can consider that. On the other hand, `ForwardVariablePrimal` is for output sensitivity, while `ForwardParameter` is for input sensitivity. Keeping them distinct would be good for validation, since we gave up on names like `ForwardOutputVariablePrimal`.
You can distinguish between input and output depending on whether it's a `set` or a `get`. Note that defining a new struct or a new function isn't so natural for an extension; extensions are more designed to add methods to existing ones. However, it is possible, see the hack in NLopt.
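To make the extension point concrete, here is a sketch of the pattern under discussion; the module and package names (`MyPkg`, `MyPkgMOIExt`) are made up for illustration:

```julia
# Hypothetical Julia package extension module; all names here are invented.
module MyPkgMOIExt

import MathOptInterface as MOI
import MyPkg

# Natural for an extension: add a method to an attribute type MOI already owns.
MOI.get(model::MyPkg.Optimizer, ::MOI.SolverName) = "MyPkg"

# Less natural: a struct defined inside the extension is only reachable
# through the extension module, so the parent package (and its docstrings)
# cannot refer to it directly.
struct MyNewAttribute <: MOI.AbstractModelAttribute end

end # module
```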
Adding more details to my previous comment, I find it strange that, for parameters, `set` on `ForwardVariablePrimal` sets an input value that can be `get` to check which value was there; while, for actual variables, `set` on `ForwardVariablePrimal` always errors, and `get` on `ForwardVariablePrimal` only makes sense after `forward_differentiate!`. This was the main motivation for the new attribute.
About:

> Note that defining a new struct or a new function isn't so natural for an extension, it's more designed to add methods for existing ones.

I think I did not understand completely. DiffOpt adds new structs. Also, a few solvers define new structs (like `Gurobi.NumberOfObjectives`, `GLPK.CallbackFunction`, `COSMO.ADMMIterations`).
Ah! Now I get it, you mean Julia extensions like `NLoptMathOptInterfaceExt.jl`.
Yes, I meant Julia extensions :) The workaround in NLopt works quite well but it becomes tricky when you try to include their docstrings in the documentation. It was quite hard to make it work in JuliaManifolds/Manopt.jl#264 for instance.

Also, since from the MOI level these are also variables, and the set can be bridged to `EqualTo`, it makes sense to consider it as `ForwardVariablePrimal`.

Actually, what we could do is add `ForwardConstraintSet`, defined for `MOI.Parameter`, `MOI.EqualTo`, `MOI.LessThan`, `MOI.GreaterThan`, and `MOI.Interval`. I think we worked around it in DiffOpt by using the constant in the function, but if you have a `VariableIndex`-in-`S` constraint then you can't modify the function, right?

We could disallow `ForwardConstraintSet` for non-`VariableIndex` constraints to avoid having two ways to set the same thing. Even if that's not consistent with the `ConstraintFunction`/`ConstraintSet` attributes, it's backward compatible. Or we can change this and tag v0.5 of DiffOpt.

The advantage of this design is that we can implement `ForwardConstraintSet` in the bridge that transforms `Parameter` to `EqualTo`, so that the same user code works with both a POI-based solver and a solver using the bridge.