
Separate function for direct transcription (ocp to nlp) #73

Merged · 23 commits merged into main from export · Apr 3, 2024

Conversation

@PierreMartinon (Member) commented Mar 20, 2024

TODO:

  • add an export of the NLP problem alone, i.e. the member docp.nlp (note that we then lose the link to the original OCP, in terms of initial guess and solution)
docp = DirectTranscription(ocp1, grid_size=100, init=init_constant)
nlp = getNLP(docp)
print(docp.nlp.meta.x0)
  • add the possibility to pass an initial guess to solveDOCP (currently it is passed to DirectTranscription, since the initial guess is part of the NLPModel), using for instance a previous solution to generate the initial guess
init_sol = OptimalControlInit(sol)

One can then set the initial guess in the DOCP:

setDOCPInit(docp, init_sol)
sol = solveDOCP(docp, print_level=5, tol=1e-12)

or pass it to the solver call (this bypasses the initial guess stored in the DOCP); see the end-to-end sketch after these notes:

sol = solveDOCP(docp, init=init_sol, print_level=5, tol=1e-12)

Note: some indicators currently defined in the DOCP, such as has_free_final_time, has_variable, etc., could be migrated up to the OCP, similarly to the existing is_min(). Using these higher-level functions is probably cleaner than inspecting the internal members of the OCP.
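
Putting the calls above together, a warm-start workflow could look like this. This is only a sketch assembled from the snippets quoted in this description; ocp1 and init_constant stand for a previously defined OCP and initial guess, and the tolerances are illustrative.

# transcribe the OCP into a DOCP and grab the underlying NLP problem
docp = DirectTranscription(ocp1, grid_size=100, init=init_constant)
nlp = getNLP(docp)

# first solve, using the initial guess stored in the DOCP
sol = solveDOCP(docp, print_level=5, tol=1e-8)

# reuse the solution as initial guess for a second, tighter solve
init_sol = OptimalControlInit(sol)
setDOCPInit(docp, init_sol)                      # option 1: store it in the DOCP
sol = solveDOCP(docp, print_level=5, tol=1e-12)
sol = solveDOCP(docp, init=init_sol, print_level=5, tol=1e-12)   # option 2: bypass the stored guess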


codecov bot commented Mar 20, 2024

Codecov Report

Attention: Patch coverage is 98.78788%, with 4 lines in your changes missing coverage. Please review.

Project coverage is 97.85%. Comparing base (e63a595) to head (15c12e8).
Report is 2 commits behind head on main.

Files          Patch %    Lines
src/solve.jl   82.60%     4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main      #73      +/-   ##
==========================================
- Coverage   99.15%   97.85%   -1.30%     
==========================================
  Files           6        6              
  Lines         471      466       -5     
==========================================
- Hits          467      456      -11     
- Misses          4       10       +6     


@jbcaillau changed the title from "separate function for direct transcription (ocp to nlp)" to "WIP: separate function for direct transcription (ocp to nlp)" on Mar 22, 2024
@jbcaillau (Member) commented:

@PierreMartinon @tmigot linked to this issue

@tmigot (Contributor) commented Mar 22, 2024

@PierreMartinon I am just reading the comments, so I am not sure what was blocking, but it is possible to broadcast a new initial guess into an ADNLPModel, i.e., something like

nlp = ADNLPModel(...)
nlp.meta.x0 .= ones(n)

Alternatively, pass the initial guess directly to the solver (by default it only takes nlp.meta.x0).
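
For completeness, here is a minimal self-contained sketch of both options, assuming the standard ADNLPModels / NLPModelsIpopt APIs; the test problem and values are made up for illustration.

using ADNLPModels, NLPModelsIpopt

n = 2
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2    # Rosenbrock test function
nlp = ADNLPModel(f, zeros(n))                    # default initial guess x0 = [0, 0]

# option 1: broadcast a new initial guess into the model in place
nlp.meta.x0 .= ones(n)

# option 2: leave the model untouched and pass the initial guess to the solver call
stats = ipopt(nlp, x0 = ones(n), print_level = 0)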

@PierreMartinon (Member, Author) commented:

Thanks @tmigot for the clarification! I guess people will usually prefer to pass the initial guess to the solver directly. Both options have been added.

@jbcaillau (Member) commented:

Thanks @tmigot; @PierreMartinon, please send a message when the PR is ready to merge 🤞🏾

@PierreMartinon (Member, Author) commented:

OK for me. @joseph-gergaud, do you have more changes before the merge?

@jbcaillau merged commit c2af80f into main on Apr 3, 2024
6 checks passed
@jbcaillau deleted the export branch on April 3, 2024 21:13
@jbcaillau changed the title from "WIP: separate function for direct transcription (ocp to nlp)" to "Separate function for direct transcription (ocp to nlp)" on Apr 3, 2024