
Issue when differentiating a semi-definite program #293

@frapac

Description


I get a weird issue when I try to compute the sensitivity of the following SDP:

using JuMP
using DiffOpt
using Clarabel

model = Model(() -> DiffOpt.diff_optimizer(Clarabel.Optimizer))
@variable(model, t >= 0)
cons = @constraint(model, [t 1.0; 1.0 t] in PSDCone())
@objective(model, Min, t)
optimize!(model)

MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, [1.0 0.0; 0.0 1.0])
DiffOpt.forward_differentiate!(model) 

The code returns the following error, saying the optimizer has not been called (which is not the case here, since we call optimize! just before computing the sensitivity):

ERROR: LoadError: Trying to compute the forward differentiation on a model with termination status OPTIMIZE_NOT_CALLED
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] forward_differentiate!(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{…}})
   @ DiffOpt ~/.julia/packages/DiffOpt/FJjnq/src/moi_wrapper.jl:553
 [3] forward_differentiate!
   @ ~/.julia/packages/DiffOpt/FJjnq/src/jump_moi_overloads.jl:393 [inlined]
 [4] forward_differentiate!(model::MathOptInterface.Utilities.CachingOptimizer{…})
   @ DiffOpt ~/.julia/packages/DiffOpt/FJjnq/src/jump_moi_overloads.jl:378
 [5] forward_differentiate!(model::Model)
   @ DiffOpt ~/.julia/packages/DiffOpt/FJjnq/src/jump_moi_overloads.jl:363

If I look at the unit tests implemented in DiffOpt, I observe that the error can be resolved if I instead use the following hack:

using JuMP
using DiffOpt
using Clarabel

model = Model(() -> DiffOpt.diff_optimizer(Clarabel.Optimizer))
@variable(model, t >= 0)
cons = @constraint(model, [t 1.0; 1.0 t] in PSDCone())
@objective(model, Min, t)
optimize!(model)

h = MOI.VectorAffineFunction{Float64}(
    MOI.ScalarAffineTerm{Float64}[],
    ones(3),
)
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, h)
DiffOpt.forward_differentiate!(model)

but even when doing that, I get a second error saying the dual of a PSD cone is not defined (I guess the error is somehow related to MathOptSetDistances):

ERROR: LoadError: Dual of `PositiveSemidefiniteConeSquare` is not defined in MathOptInterface.
For more details see the comments in `src/Bridges/Constraint/bridges/square.jl`.
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:35
  [2] dual_set(::MathOptInterface.PositiveSemidefiniteConeSquare)
    @ MathOptInterface ~/.julia/packages/MathOptInterface/eRhCr/src/sets.jl:1296
  [3] (::DiffOpt.var"#14#15"{…})(ci::MathOptInterface.ConstraintIndex{…}, r::UnitRange{…})
    @ DiffOpt ~/.julia/packages/DiffOpt/FJjnq/src/diff_opt.jl:512
  [4] _map_rows!(f::DiffOpt.var"#14#15"{…}, x::Vector{…}, model::MathOptInterface.Utilities.GenericModel{…}, cones::DiffOpt.ProductOfSets{…}, ::Type{…}, ::Type{…}, map_mode::DiffOpt.Nested{…}, k::Int64)
    @ DiffOpt ~/.julia/packages/DiffOpt/FJjnq/src/diff_opt.jl:550
  [5] map_rows(f::Function, model::MathOptInterface.Utilities.GenericModel{…}, cones::DiffOpt.ProductOfSets{…}, map_mode::DiffOpt.Nested{…})
    @ DiffOpt ~/.julia/packages/DiffOpt/FJjnq/src/diff_opt.jl:589
  [6] Dπ
    @ ~/.julia/packages/DiffOpt/FJjnq/src/diff_opt.jl:510 [inlined]
  [7] _gradient_cache(model::DiffOpt.ConicProgram.Model)
    @ DiffOpt.ConicProgram ~/.julia/packages/DiffOpt/FJjnq/src/ConicProgram/ConicProgram.jl:225
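
If I read the trace correctly, the problem is that MOI only defines the dual set for the triangular PSD cone, while a plain (non-Symmetric) square matrix in @constraint(..., ... in PSDCone()) produces a constraint in the square cone. A small illustration, independent of DiffOpt:

import MathOptInterface as MOI

# The triangular PSD cone is self-dual, so its dual set is defined...
MOI.dual_set(MOI.PositiveSemidefiniteConeTriangle(2))
# -> MathOptInterface.PositiveSemidefiniteConeTriangle(2)

# ...whereas the square PSD cone has no dual set in MOI, which is exactly
# the error raised inside DiffOpt's Dπ / _gradient_cache above.
MOI.dual_set(MOI.PositiveSemidefiniteConeSquare(2))
# -> ERROR: Dual of `PositiveSemidefiniteConeSquare` is not defined in MathOptInterface.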

Do you have any clue how to resolve this issue? If I am not using DiffOpt.jl correctly, I would be happy to write a new paragraph in the documentation explaining how to compute the sensitivity of a constraint in a PSDCone.
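
For completeness, here is a possible workaround I have sketched but not verified end to end: declare the constraint directly in the triangular cone by wrapping the matrix in LinearAlgebra.Symmetric, so that the square cone (and its missing dual) never enters the model. The perturbation below is only an illustration; the triangular ordering and values may need adjusting.

using JuMP
using DiffOpt
using Clarabel
using LinearAlgebra  # for Symmetric

model = Model(() -> DiffOpt.diff_optimizer(Clarabel.Optimizer))
@variable(model, t >= 0)
# Wrapping the matrix in Symmetric makes JuMP use
# PositiveSemidefiniteConeTriangle (self-dual) instead of
# PositiveSemidefiniteConeSquare.
cons = @constraint(model, Symmetric([t 1.0; 1.0 t]) in PSDCone())
@objective(model, Min, t)
optimize!(model)

# Constant perturbation of the constraint, given in MOI's triangular
# (upper triangle, column by column) order: 3 entries for a 2x2 matrix.
h = MOI.VectorAffineFunction{Float64}(
    MOI.VectorAffineTerm{Float64}[],
    ones(3),
)
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, h)
DiffOpt.forward_differentiate!(model)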
