Add tutorial with manual sparsity pattern
tmigot committed Dec 21, 2023
1 parent d4c48d2 commit e28aedd
Showing 2 changed files with 72 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/make.jl
@@ -18,6 +18,7 @@ makedocs(
"Build a hybrid NLPModel" => "mixed.md",
"Support multiple precision" => "generic.md",
"Performance tips" => "performance.md",
"Fine-tune sparse derivatives" => "sparsepattern.md",
"Reference" => "reference.md",
],
)
71 changes: 71 additions & 0 deletions docs/src/sparsepattern.md
@@ -0,0 +1,71 @@
# Improve sparse derivatives

In this tutorial, we show a simple trick to dramatically speed up the computation of sparse Jacobian and Hessian matrices: providing the sparsity pattern manually.

Our test problem is an academic investment control problem:

```math
\begin{aligned}
\min_{u,x} \quad & \int_0^1 (u(t) - 1) x(t) \, dt \\
\text{s.t.} \quad & \dot{x}(t) = \gamma \, u(t) \, x(t).
\end{aligned}
```

Using a simple quadrature formula for the objective functional and a forward finite difference for the differential equation, one obtains a finite-dimensional continuous optimization problem.
One implementation is available in the package [`OptimizationProblems.jl`](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl).
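
To see where the sparsity comes from, consider a sketch of the discretized constraints: with ``N`` grid points, step length ``h = 1/N``, and discrete variables ``x_k \approx x(t_k)`` and ``u_k \approx u(t_k)``, a forward difference turns the dynamics into

```math
x_{k+1} - x_k - h \, \gamma \, u_k x_k = 0, \quad k = 1, \dots, N - 1,
```

so each constraint involves only a few neighboring variables, and the Jacobian is sparse by construction.
(This is an illustration; the discretization implemented in `OptimizationProblems.jl` may differ in details such as the quadrature rule and the boundary condition.)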

```@example ex1
using ADNLPModels
using BenchmarkTools
using OptimizationProblems
using Symbolics
using SparseArrays

n = 10
# Time the model instantiation: it includes the automatic detection
# of the Jacobian and Hessian sparsity patterns.
@btime begin
  nlp = OptimizationProblems.ADNLPProblems.controlinvestment(n = n, show_time = true)
end
# Keep an instance around for the calls below.
nlp = OptimizationProblems.ADNLPProblems.controlinvestment(n = n)
```

Once the package `Symbolics.jl` is loaded, the `ADNLPModel` constructor automatically tries to detect the sparsity patterns of the Jacobian and the Hessian, and prepares AD backends that return them as sparse matrices.
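
Under the hood, `Symbolics.jl` traces the constraint function with symbolic variables to find the entries of the Jacobian that can be structurally nonzero. A minimal stand-alone sketch, using a hypothetical constraint function `c!` unrelated to the problem above:

```julia
using Symbolics, SparseArrays

# Hypothetical in-place constraints: forward differences of x.
c!(cx, x) = (cx .= x[2:end] .- x[1:(end - 1)]; cx)

# Returns a 4×5 sparse Bool matrix flagging the structurally
# nonzero entries of the Jacobian of c!.
S = Symbolics.jacobian_sparsity(c!, zeros(4), zeros(5))
```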

```@example ex1
using NLPModels
x = sqrt(2) * ones(n)
jac(nlp, x)
```

However, determining the sparsity pattern of the Jacobian and Lagrangian Hessian of a given function can be rather costly.
The good news is that for problems like our investment control problem, and more generally for problems with differential equations in the constraints, a good approximation of this pattern is easy to derive by hand.

The following example specializes the function `compute_jacobian_sparsity` to provide the sparsity pattern manually.

```@example ex2
using ADNLPModels
using BenchmarkTools
using OptimizationProblems
using Symbolics
using SparseArrays

function ADNLPModels.compute_jacobian_sparsity(c!, cx, x0)
  # Default behavior: detect the pattern symbolically.
  # S = Symbolics.jacobian_sparsity(c!, cx, x0)
  # return S

  # Manual pattern: each discretized constraint involves only
  # neighboring states and controls, so both blocks are bidiagonal.
  return hcat(
    spdiagm(0 => ones(Bool, 5), 1 => ones(Bool, 4)),
    spdiagm(0 => ones(Bool, 5), 1 => ones(Bool, 4)),
  )
end

n = 10
@btime begin
  nlp = OptimizationProblems.ADNLPProblems.controlinvestment(n = n, show_time = true)
end
# Keep an instance around for the calls below.
nlp = OptimizationProblems.ADNLPProblems.controlinvestment(n = n)
```

A similar Jacobian matrix is obtained at a much lower cost.

```@example ex2
using NLPModels
x = sqrt(2) * ones(n)
jac(nlp, x)
```

The function `compute_hessian_sparsity(f, nvar, c!, ncon)` plays the same role for the sparsity pattern of the Lagrangian Hessian and can be specialized in the same way.
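
As an illustration only, a hand-made Hessian pattern for this problem could look as follows. It assumes the variables are ordered as `[x; u]` and that the discretized Lagrangian only contains bilinear terms coupling each state ``x_k`` with the controls ``u_k`` and ``u_{k+1}``; this is an assumption about the structure of the problem, not the library's default behavior.

```julia
# Illustrative sketch, under the ordering and coupling assumptions above.
function ADNLPModels.compute_hessian_sparsity(f, nvar, c!, ncon)
  # Default behavior: detect the pattern of the Lagrangian Hessian
  # symbolically via Symbolics.jl.

  # Manual pattern: no x-x or u-u second derivatives, only coupling
  # blocks between the states and the controls.
  N = div(nvar, 2)
  D = spdiagm(0 => ones(Bool, N), 1 => ones(Bool, N - 1))
  return [
    spzeros(Bool, N, N) D
    sparse(D') spzeros(Bool, N, N)
  ]
end
```

Over-approximating the pattern only adds a few extra entries to compute, whereas a missing structural nonzero silently yields wrong derivatives, so err on the side of a denser pattern.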
