
Feat/onnx support #2620

Open · wants to merge 16 commits into base: `master`
Conversation

@madtoinou (Collaborator) commented Dec 17, 2024

Checklist before merging this PR:

  • Mentioned all issues that this PR fixes or addresses.
  • Summarized the updates of this PR under Summary.
  • Added an entry under Unreleased in the Changelog.

Fixes #553, fixes #2617.

Summary

  • Small refactoring of the `_process_input_batch()` and `_get_batch_prediction()` methods to make their behavior uniform across PLModules
  • Added a `to_onnx()` method to the `TorchForecastingModel` class that takes care of creating the `input_names` and `input_sample` based on the model attributes
  • Added a section to the User Guide showing how to export a model and load it back to generate forecasts

Other Information

The consistency of the forecasts was tested in a notebook for all the possible configurations. I did not implement unit tests since onnx should remain an optional dependency. Let me know what you think.

codecov bot commented Dec 17, 2024

Codecov Report

Attention: Patch coverage is 39.39394% with 20 lines in your changes missing coverage. Please review.

Project coverage is 94.06%. Comparing base (c48521c) to head (c964e7a).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| `...arts/models/forecasting/torch_forecasting_model.py` | 5.55% | 17 Missing ⚠️ |
| `darts/models/forecasting/pl_forecasting_module.py` | 75.00% | 3 Missing ⚠️ |
Additional details and impacted files

```
@@            Coverage Diff             @@
##           master    #2620      +/-   ##
==========================================
- Coverage   94.23%   94.06%   -0.18%     
==========================================
  Files         141      141              
  Lines       15509    15534      +25     
==========================================
- Hits        14615    14612       -3     
- Misses        894      922      +28     
```

@dennisbader (Collaborator) left a comment

Very cool feature, thanks a lot @madtoinou 🚀 Looks like we're close already to merge :)

I had a couple of suggestions regarding the example and the export logic, where I see potential to simplify things further.

Resolved review threads: CHANGELOG.md, docs/userguide/torch_forecasting_models.md
```
@@ -350,6 +351,87 @@ model_finetune = SomeTorchForecastingModel(..., # use identical parameters & va
model_finetune.load_weights("/your/path/to/save/model.pt")
```

#### Exporting model to ONNX format for inference
Collaborator (@dennisbader):

Reading this, I think again about how nice it would be if all our Datasets returned feature / target arrays at fixed positions (e.g. all return a tuple of past target, past cov, historic future cov, future cov, static cov, ... even if they do not support all covariate types) :D

Collaborator Author (@madtoinou):

Yeah, definitely, it would make a lot of things more intuitive and easier to tweak for users.
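The fixed-position idea from this thread could be sketched as follows (purely hypothetical; `FEATURE_SLOTS` and `make_sample` are illustrative names, not the current darts API): every dataset sample is a tuple with one entry per feature type, filled with `None` where a model does not support that covariate, so consumers can always index by position.

```python
import numpy as np

# Hypothetical fixed-position layout shared by all dataset samples.
FEATURE_SLOTS = (
    "past_target",
    "past_covariates",
    "historic_future_covariates",
    "future_covariates",
    "static_covariates",
)

def make_sample(past_target, past_covariates=None,
                historic_future_covariates=None,
                future_covariates=None, static_covariates=None):
    """Return a tuple with one entry per slot, None where unsupported."""
    return (past_target, past_covariates, historic_future_covariates,
            future_covariates, static_covariates)

# A model without covariate support still yields a full-length tuple:
sample = make_sample(np.zeros((12, 1), dtype=np.float32))

# Consumers index by fixed position regardless of the model type.
past_target = sample[FEATURE_SLOTS.index("past_target")]
```

With such a convention, assembling the ONNX `input_sample` and `input_names` would be a simple filter over the non-`None` slots instead of per-module special cases.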

Two resolved review threads on docs/userguide/torch_forecasting_models.md
Status: In review
Successfully merging this pull request may close these issues:

  • How do I export a darts' TCNModel to ONNX?
  • ONNX Support