
Add an option to provide the sparsity pattern of jacobians and hessians #284

Merged

Conversation

@amontoison (Member)

@amontoison requested a review from @tmigot on August 4, 2024.
@jbcaillau

Thanks @amontoison! So, provide:

- sparse AD Jacobian (forward)
- sparse AD Hessian (forward and reverse)?

@amontoison (Member, Author)

> Thanks @amontoison! So, provide:
>
> - sparse AD Jacobian (forward)
> - sparse AD Hessian (forward and reverse)?

@jbcaillau The sparsity pattern of Jacobians and Hessians is not related to AD, but we must store it for the decompression step.
AD is only used to recover the values of the nonzero (nnz) coefficients, and we can use whatever we want for the backend.
The supported sparse backends are:

- SparseADJacobian -- forward (Jacobian);
- SparseADHessian -- forward over forward (Hessian);
- SparseReverseADHessian -- forward over reverse (Hessian).

In the future, I would like to support only two backends (SparseADJacobian and SparseADHessian), with options for the "mode" (forward, reverse, both, forward over reverse, ...) and the "AD package" (Enzyme.jl, ForwardDiff.jl, ReverseDiff.jl, ...).
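
For context, a minimal sketch of selecting these sparse backends when building a model. It assumes the `jacobian_backend` / `hessian_backend` keyword arguments of `ADNLPModel` accept backend types, as in ADNLPModels.jl around the time of this PR; the toy problem is illustrative only.

```julia
using ADNLPModels, NLPModels

# Toy constrained problem (illustrative only).
f(x) = (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2
c(x) = [x[1] + x[2]^2]
x0 = [-1.2; 1.0]

# Request the sparse backends listed above by type; the constructor detects
# the sparsity pattern and stores it for the decompression step.
nlp = ADNLPModel(f, x0, c, [0.0], [1.0];
                 jacobian_backend = ADNLPModels.SparseADJacobian,
                 hessian_backend = ADNLPModels.SparseADHessian)

jac(nlp, x0)   # sparse constraint Jacobian at x0
hess(nlp, x0)  # sparse objective Hessian at x0
```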

github-actions bot (Contributor) commented Aug 4, 2024

[Breakage test results for downstream packages (latest / stable status badges omitted): CaNNOLeS.jl, DCISolver.jl, DerivativeFreeSolvers.jl, JSOSolvers.jl, NLPModelsIpopt.jl, OptimalControl.jl, OptimizationProblems.jl, Percival.jl, QuadraticModels.jl, SolverBenchmark.jl, SolverTools.jl.]

@jbcaillau
> @jbcaillau The sparsity pattern of Jacobians and Hessians is not related to AD, but we must store it for the decompression step. AD is only used to recover the values of the nonzero (nnz) coefficients, and we can use whatever we want for the backend [...]

@amontoison OK. I thought it was more directly related to this request (an example/tutorial on how to build an NLPModel):
control-toolbox/CTDirect.jl#183

@tmigot (Member) left a comment


Thanks @amontoison! Just one small improvement: I don't think we need the parametric type anymore.

(Resolved review threads on src/sparse_jacobian.jl and src/sparse_hessian.jl.)
@tmigot (Member) commented Aug 4, 2024

After this PR, I can rehabilitate PR #209 to add such tutorials.

@amontoison (Member, Author)

@tmigot Do you think we could add an option, when creating an ADNLPModel / ADNLSModel, to provide a sparsity pattern?
Something like a pattern_hessian / sp_hessian keyword.

@tmigot (Member) commented Aug 4, 2024

The kwargs given to the ADNLPModels constructors are passed to the ADModelBackend, which passes them on to each backend.
Would there be a way to modify this PR so that it works through that mechanism?
Maybe add the sparsity pattern as a kwarg with a default of size (0, 0), and skip the detection call whenever the size differs from that default? Something like this?

I am not against adding kwargs, but ADNLPModels has so many constructors that I would love to avoid adding more code if possible.
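
A rough sketch of the plumbing described above, using hypothetical names (`build_jacobian_backend` and `detect_pattern` are not ADNLPModels functions): the pattern kwarg defaults to an empty 0×0 sparse matrix, and detection runs only when no pattern was supplied.

```julia
using SparseArrays

# Hypothetical stand-in for the sparsity detection step.
detect_pattern(nvar, ncon) = sparse(trues(ncon, nvar))

# The model constructor would forward `kwargs...` down to each backend;
# a backend opts in by declaring a `pattern` kwarg with a (0, 0) default.
function build_jacobian_backend(nvar, ncon;
                                pattern = spzeros(Bool, 0, 0), kwargs...)
    P = size(pattern) == (0, 0) ? detect_pattern(nvar, ncon) : pattern
    return P  # a real backend would store P for the decompression step
end

build_jacobian_backend(2, 1)  # pattern detected automatically
build_jacobian_backend(2, 1; pattern = sparse([1], [2], [true], 1, 2))  # user-supplied
```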

@amontoison (Member, Author)

OK, let's not add new keywords; instead, let's add an example/tutorial in the documentation showing how to provide the sparsity patterns.
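
For reference, a minimal sketch of what such a tutorial entry could look like. It assumes the backend constructors touched by this PR take the user-supplied pattern as a trailing positional argument, and that backend instances can be passed through the `jacobian_backend` / `hessian_backend` keywords; check the ADNLPModels.jl documentation for the exact merged API.

```julia
using ADNLPModels, SparseArrays

f(x) = (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2
c!(cx, x) = (cx[1] = x[1] + x[2]^2; cx)  # in-place constraint
x0 = [-1.2; 1.0]
nvar, ncon = 2, 1

# User-supplied sparsity patterns, stored for the decompression step.
J = sparse([1, 1], [1, 2], [true, true], ncon, nvar)              # constraint Jacobian pattern
H = sparse([1, 2, 2], [1, 1, 2], [true, true, true], nvar, nvar)  # lower triangle of the Lagrangian Hessian

# Assumed constructor shape (pattern as the last positional argument).
jac_backend  = ADNLPModels.SparseADJacobian(nvar, f, ncon, c!, J)
hess_backend = ADNLPModels.SparseADHessian(nvar, f, ncon, c!, H)

nlp = ADNLPModel!(f, x0, c!, [0.0], [1.0];
                  jacobian_backend = jac_backend,
                  hessian_backend = hess_backend)
```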

@amontoison merged commit eb9006c into JuliaSmoothOptimizers:main on August 5, 2024 (35 of 38 checks passed).
@amontoison deleted the manual_sparsity_pattern branch on August 5, 2024.