src/controller/nonlinmpc.jl (2 additions, 2 deletions)
```diff
@@ -550,8 +550,8 @@ This method is really intricate and I'm not proud of it. That's because of 3 ele
   and as efficient as possible. All the function outputs and derivatives are cached and
   updated in-place if required to use the efficient [`value_and_jacobian!`](@extref DifferentiationInterface DifferentiationInterface.value_and_jacobian!).
 - The `JuMP` NLP syntax forces splatting for the decision variable, which implies use
-  of `Vararg{T,N}` (see the (performance tip)[@extref Julia Be-aware-of-when-Julia-avoids-specializing]
-  ) and memoization to avoid redundant computations. This is already complex, but it's even
+  of `Vararg{T,N}` (see the [performance tip](@extref Julia Be-aware-of-when-Julia-avoids-specializing))
+  and memoization to avoid redundant computations. This is already complex, but it's even
   worse knowing that most automatic differentiation tools do not support splatting.
 - The signature of gradient and hessian functions is not the same for univariate (`nZ̃ == 1`)
   and multivariate (`nZ̃ > 1`) operators in `JuMP`. Both must be defined.
```
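The first point in the docstring refers to DifferentiationInterface.jl's prepared, in-place API. As a minimal sketch of that caching pattern (assuming the v0.6-style argument order; the function `f!`, the array sizes, and the `AutoForwardDiff` backend are illustrative, not the package's actual code):

```julia
using DifferentiationInterface
import ForwardDiff

# Hypothetical in-place model function: fills y from the decision vector Z̃.
f!(y, Z̃) = (y .= Z̃ .^ 2; nothing)

nZ̃, ny = 3, 3
Z̃ = ones(nZ̃)
y = zeros(ny)       # cached function outputs, updated in-place
J = zeros(ny, nZ̃)   # cached Jacobian, updated in-place

backend = AutoForwardDiff()
prep = prepare_jacobian(f!, y, backend, Z̃)

# One call re-evaluates f! and its Jacobian, writing both caches in-place:
value_and_jacobian!(f!, y, J, prep, backend, Z̃)
```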
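On the second point, `JuMP` calls the registered operator as `f(Z̃...)`, so a `Vararg{T,N}` signature keeps Julia specializing on the argument count, and memoization lets the derivative callbacks reuse the last function evaluation instead of recomputing it. A minimal sketch of that trick (the closure and the placeholder objective are hypothetical, not the package's code):

```julia
# Build a memoized, splatted objective for a nZ̃-dimensional decision vector.
function memoized_objective(nZ̃::Int)
    Z̃_last = fill(NaN, nZ̃)   # last evaluation point (NaN != NaN forces the 1st call)
    y_last = Ref(NaN)         # cached objective value
    function f(Z̃arg::Vararg{T,N}) where {T<:Real,N}
        if any(Z̃arg[i] != Z̃_last[i] for i in 1:N)
            Z̃_last .= Z̃arg
            y_last[] = sum(abs2, Z̃_last)  # placeholder for the real objective
        end
        return y_last[]
    end
    return f
end

f = memoized_objective(3)
f(1.0, 2.0, 3.0)  # computes and caches
f(1.0, 2.0, 3.0)  # same point: returns the cached value
```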
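On the last point, `JuMP`'s user-defined operators expect a scalar-returning derivative in the univariate case but an in-place, splatted gradient in the multivariate case, which is why both forms must be defined. A sketch of the two registrations with toy functions (assumed from `JuMP`'s nonlinear operator documentation, not this package's operators):

```julia
using JuMP

# Univariate case (nZ̃ == 1): the derivative is returned as a scalar.
f_uni(x)  = x^2
∇f_uni(x) = 2x

# Multivariate case (nZ̃ > 1): the gradient fills a vector in-place and the
# decision variables arrive splatted, hence the `Vararg` signature.
f_multi(x...) = sum(abs2, x)
function ∇f_multi(g::AbstractVector{T}, x::Vararg{T,N}) where {T,N}
    for i in 1:N
        g[i] = 2x[i]
    end
    return nothing
end

model = Model()
@variable(model, z1)
@variable(model, z[1:2])
@operator(model, op_uni,   1, f_uni,   ∇f_uni)
@operator(model, op_multi, 2, f_multi, ∇f_multi)
@objective(model, Min, op_uni(z1) + op_multi(z...))
```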