diff --git a/docs/make.jl b/docs/make.jl
index 4478a2a4..66941be5 100644
--- a/docs/make.jl
+++ b/docs/make.jl
@@ -18,6 +18,7 @@ makedocs(
     "Build a hybrid NLPModel" => "mixed.md",
     "Support multiple precision" => "generic.md",
     "Performance tips" => "performance.md",
+    "Fine-tune sparse derivatives" => "sparsepattern.md",
     "Reference" => "reference.md",
   ],
 )
diff --git a/docs/src/sparsepattern.md b/docs/src/sparsepattern.md
new file mode 100644
index 00000000..9ac5046c
--- /dev/null
+++ b/docs/src/sparsepattern.md
@@ -0,0 +1,102 @@
+# Improve sparse derivatives
+
+In this tutorial, we show a simple trick that can dramatically speed up the computation of sparse Jacobian and Hessian matrices.
+
+Our test problem is an academic investment control problem:
+
+```math
+\begin{aligned}
+\min_{u,x} \quad & \int_0^1 (u(t) - 1) x(t) \, dt \\
+\text{s.t.} \quad & \dot{x}(t) = \gamma \, u(t) \, x(t).
+\end{aligned}
+```
+
+Using a simple quadrature formula for the objective functional and a forward finite difference for the differential equation, one obtains a finite-dimensional continuous optimization problem.
+One implementation is available in the package [`OptimizationProblems.jl`](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl).
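+
+For instance, on a uniform grid of N points with step h = 1/(N - 1), writing the unknowns as x_i ≈ x(t_i) and u_i ≈ u(t_i), a rectangle rule for the integral and a forward difference for the dynamics give a problem of the following form (the exact scheme used by `controlinvestment` may differ):
+
+```math
+\begin{aligned}
+\min_{u_1, \dots, u_N, \, x_1, \dots, x_N} \quad & h \sum_{i=1}^{N-1} (u_i - 1) x_i \\
+\text{s.t.} \quad & x_{i+1} - x_i = h \, \gamma \, u_i x_i, \quad i = 1, \dots, N - 1.
+\end{aligned}
+```
+
+Each constraint involves only x_i, x_{i+1} and u_i, so the constraint Jacobian is banded. This banded structure is exactly what the hand-made pattern below exploits.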
+
+```@example ex1
+using ADNLPModels
+using OptimizationProblems
+using Symbolics
+using SparseArrays
+
+n = 10
+# time the model construction, which includes detecting the sparsity patterns
+@time nlp = OptimizationProblems.ADNLPProblems.controlinvestment(n = n, show_time = true)
+```
+
+Once the package `Symbolics.jl` is loaded, the `ADNLPModel` constructor automatically tries to prepare AD backends that compute the Jacobian and Hessian as sparse matrices.
+
+```@example ex1
+using NLPModels
+x = sqrt(2) * ones(n)
+jac(nlp, x)
+```
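+
+The pattern detected at construction time can be inspected through the standard `NLPModels.jl` API:
+
+```@example ex1
+rows, cols = jac_structure(nlp)  # row and column indices of the structural nonzeros
+nlp.meta.nnzj  # number of structural nonzeros in the Jacobian
+```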
+
+However, determining the sparsity pattern of the Jacobian and of the Lagrangian Hessian of a given function can be rather costly.
+The good news is that a good approximation of this pattern is often easy to derive by hand for problems like our investment control problem, and more generally for problems with differential equations in the constraints.
+
+The following example specializes the function `compute_jacobian_sparsity` to provide the sparsity pattern manually.
+
+```@example ex2
+using ADNLPModels
+using OptimizationProblems
+using Symbolics
+using SparseArrays
+
+function ADNLPModels.compute_jacobian_sparsity(c!, cx, x0)
+  # default behavior: detect the pattern symbolically
+  # S = Symbolics.jacobian_sparsity(c!, cx, x0)
+  # return S
+  # hand-made banded pattern: each constraint only involves consecutive states and controls
+  return hcat(
+    spdiagm(0 => ones(Bool, 5), 1 => ones(Bool, 4)),
+    spdiagm(0 => ones(Bool, 5), 1 => ones(Bool, 4)),
+  )
+end
+
+n = 10
+# time the model construction again, now with the hand-made sparsity pattern
+@time nlp = OptimizationProblems.ADNLPProblems.controlinvestment(n = n, show_time = true)
+```
+
+The same Jacobian matrix is obtained at a lower cost. Note that the pattern provided must contain all structural nonzeros: any entry missing from the pattern is silently treated as zero.
+
+```@example ex2
+using NLPModels
+x = sqrt(2) * ones(n)
+jac(nlp, x)
+```
+
+The function `compute_hessian_sparsity(f, nvar, c!, ncon)` does the same for the Lagrangian Hessian.
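+
+Below is a minimal sketch of such a specialization. The pattern is purely illustrative: it assumes that the Lagrangian Hessian only couples each state x[i] (variables 1:N) with the matching control u[i] (variables N+1:2N). Always check a hand-made pattern against `Symbolics.hessian_sparsity` on a small instance, since any structural nonzero missing from the pattern is lost.
+
+```julia
+function ADNLPModels.compute_hessian_sparsity(f, nvar, c!, ncon)
+  # illustrative pattern: couple each state x[i] (variables 1:N)
+  # with the matching control u[i] (variables N+1:2N);
+  # an overestimate of the true pattern is also valid, at the cost of extra nonzeros
+  N = div(nvar, 2)
+  D = spdiagm(0 => ones(Bool, N))
+  return [D D; D D]
+end
+```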