This repository has been archived by the owner on Jan 21, 2025. It is now read-only.

Mesh-tf model conversion to onnx? #368

Open
b-analyst opened this issue Nov 14, 2021 · 2 comments

@b-analyst

Hi, I've been trying to deploy an mtf model to the NVIDIA Triton Inference Server by converting the SavedModel (the output of model.export()) to an ONNX file, with no luck. I've been receiving several errors, but the main recurring one concerns a missing registered SentencePieceOp. Is there a tutorial for mtf model deployment available anywhere?
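For reference, the usual SavedModel-to-ONNX route goes through the tf2onnx CLI. A sketch only: the paths and opset below are placeholders, not values from this thread, and tf2onnx will still fail on custom ops like SentencePieceOp unless the op is registered at conversion time (it ships with tensorflow_text) and mapped to something the runtime can execute; a common workaround is to strip tokenization out of the graph and tokenize client-side before sending requests to Triton.

```shell
# Sketch, assuming tf2onnx and tensorflow_text are installed;
# tensorflow_text provides the SentencePieceOp kernel.
pip install tf2onnx tensorflow_text

# --saved-model points at the directory written by model.export().
# "saved_model_dir" and opset 13 are placeholders.
python -m tf2onnx.convert \
    --saved-model saved_model_dir \
    --output model.onnx \
    --opset 13
```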

@WingsOfPanda

hi @b-analyst, I'm just wondering: are you working on image models like U-Net, or language models like BERT? thx!

@b-analyst
Author

Hi! Thank you for replying. I'm using a language model, T5 specifically, with the mtf implementation from google-research's GitHub. I managed to convert it to a PyTorch model, and from there I've been successful at converting to ONNX.
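For anyone landing here later: once a T5 checkpoint is in Hugging Face/PyTorch form, the ONNX export step can be done with the Optimum CLI. A sketch only; `./t5_converted` is a hypothetical path to the converted checkpoint, not one mentioned in this thread.

```shell
# Sketch, assuming the converted checkpoint was saved in
# Hugging Face format at ./t5_converted (placeholder path).
pip install "optimum[exporters]"

# Writes the exported ONNX graph(s) into t5_onnx/.
optimum-cli export onnx --model ./t5_converted t5_onnx/
```

The resulting ONNX files can then be laid out in a Triton model repository and served with Triton's onnxruntime backend.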
