TensorRT Conversion of KeepTrack #425
Comments
Hi. There is a related issue. I hope it helps you.
Hi, I'm currently trying to export object tracking models to ONNX format. I may not be able to help you immediately, but I can give you some insight on what to do. Basically, to export a model to ONNX for inference, you provide torch.onnx.export (or torch.onnx.dynamo_export) with an instance of the model and a dummy input. It will trace the model and capture a static computational graph. torch.onnx.dynamo_export does the same thing while preserving the dynamic nature of the model. You can look into the details here.
The only available code I found that actually exports recent models to ONNX is in object detection (sadly not object tracking), made by open-mmlab: mmdeploy. Feel free to correct me if I'm wrong! I will also try to write some code to export the TaMOS model, so I will follow up in a few days.
Any progress now? I also want to export the model to ONNX in order to run inference on CPU with C++, but I found it really difficult to do.
I don't know if this will still help you, but I worked on this a few months ago. From what I remember, I gave up on exporting pytracking models to ONNX because the way they are coded makes it extremely difficult.
I am attempting to convert the pretrained weights of the KeepTrack model to TensorRT for inference. As I am new to TensorRT and ONNX, I would greatly appreciate any guidance or suggestions on how to successfully complete this conversion process.
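For reference, once a model has been exported to ONNX, a common way to build a TensorRT engine is the trtexec tool that ships with TensorRT. The filenames below are illustrative; this only works if the export to ONNX succeeds first and the graph uses ops TensorRT supports.

```shell
# Build a serialized TensorRT engine from an ONNX model.
# --fp16 enables half-precision kernels where the GPU supports them.
trtexec --onnx=keeptrack.onnx \
        --saveEngine=keeptrack.engine \
        --fp16
```

The resulting .engine file is specific to the GPU and TensorRT version it was built on, so it generally has to be rebuilt per deployment target.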