Add derivative check for Jacobian of residual #85
Conversation
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@           Coverage Diff           @@
##             main      #85   +/-   ##
=======================================
  Coverage   92.54%   92.54%
=======================================
  Files           5        5
  Lines         295      295
=======================================
  Hits          273      273
  Misses         22       22
=======================================

☔ View full report in Codecov by Sentry.
@amontoison The issue here is that the Jacobian of the residual is most likely not correct, so the test won't pass :s
Is it possible to isolate which components are wrong?
The Jacobian was computed explicitly here: https://www.gerad.ca/en/papers/G-2020-42. That should be what’s implemented.
@amontoison Could you have your test print the errors?
@tmigot You opened the PR and know better than me what is implemented in
Could you perhaps compare the obtained result with automatic differentiation? See https://jso.dev/ADNLPModels.jl/dev/mixed/: you can build an ADNLSModel from a BundleAdjustmentModel, as in the sketch below. jacobian_residual_check returns a
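A minimal sketch of that comparison, assuming the illustrative dataset name below is available through BundleAdjustmentModels; the check goes through Jacobian-vector products because the full dense AD Jacobian of a bundle adjustment problem would be too large to form:

```julia
using BundleAdjustmentModels, ADNLPModels, NLPModels, LinearAlgebra

bam = BundleAdjustmentModel("problem-49-7776-pre")  # illustrative dataset name

# AD-based nonlinear least-squares model built from the same residual
adnls = ADNLSModel(x -> residual(bam, x), bam.meta.x0, bam.nls_meta.nequ)

x = bam.meta.x0
v = randn(bam.meta.nvar)

# Compare Jacobian-vector products rather than the full Jacobian, which is
# too large to form densely for bundle adjustment problems.
Jv_analytic = jac_residual(bam, x) * v     # analytic Jacobian from this package
Jv_ad       = jprod_residual(adnls, x, v)  # reference product via AD

@show norm(Jv_analytic - Jv_ad, Inf) / max(1.0, norm(Jv_ad, Inf))
```

If the analytic Jacobian is correct, the relative discrepancy should be at roundoff level rather than merely "small".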
The error will be nonzero because it's a finite-difference approximation (except in special cases), so it's the magnitude of the error that we should look at.
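To illustrate the point, a small self-contained example (not tied to this package; the helper name column_fd_error is just illustrative): a central difference approximates a column of the Jacobian only up to O(h^2) plus roundoff, so even an exact Jacobian reports a small nonzero error.

```julia
using LinearAlgebra

# Maximum entrywise error between column j of a candidate Jacobian J and a
# central-difference approximation of that column of F at x.
function column_fd_error(F, J, x; j = 1, h = 1e-6)
    e = zeros(length(x))
    e[j] = 1.0
    fd_col = (F(x + h * e) - F(x - h * e)) / (2h)  # O(h^2) central difference
    return norm(J[:, j] - fd_col, Inf)
end

# Toy residual with its exact Jacobian: the reported error is small but nonzero.
F(x) = [sin(x[1]) - x[2]; x[1] * exp(x[2])]
J(x) = [cos(x[1]) -1.0; exp(x[2]) x[1]*exp(x[2])]
x = [1.0, 0.5]
@show column_fd_error(F, J(x), x; j = 1)
```

With h = 1e-6 the reported error is roughly at the 1e-10 level, which reflects truncation and roundoff rather than a mistake in the Jacobian.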
I checked the Jacobians with ADNLPModels.jl, and it's only correct when we use
JuliaSmoothOptimizers/NLPModelsTest.jl#101