Replies: 3 comments 2 replies
-
I do not know if there is a method directly from this project, but by extracting the model class into a custom codebase (you only need model loading and prediction), you can use PyTorch Mobile to build an Android or iOS application, or serve a coqui-ai/TTS server and use its API. (I'm new here, sorry in advance if the answer is not great.)
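A minimal sketch of the PyTorch Mobile route mentioned above: script the model with TorchScript and save it for the mobile lite interpreter. `TinyModel` here is a hypothetical stand-in; in practice you would load the real TTS model from its checkpoint, and the file name is an assumption.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the TTS model class; a real workflow would
# load the trained coqui-ai/TTS model from a checkpoint instead.
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)

model = TinyModel()
model.eval()  # inference mode only

# Compile to TorchScript so the model can run without a Python runtime,
# then save in the format PyTorch Mobile's lite interpreter loads.
scripted = torch.jit.script(model)
scripted._save_for_lite_interpreter("model.ptl")
```

The resulting `model.ptl` file can then be bundled into an Android or iOS app and loaded through the PyTorch Mobile runtime.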
-
We do not have ONNX support yet, but it is in our plans. I know @synesthesiam is also interested in exploring it further.
-
As @erogol mentioned, I plan to look at ONNX export. I already have it working elsewhere, and I suspect the process will be straightforward since both of our GlowTTS implementations come from the same source. The only "trick" I needed to figure out was storing the inverse:

```python
# Inference only
model.eval()
# Do not calculate Jacobians, for fast decoding
with torch.no_grad():
    model.decoder.store_inverse()
```
-
The GlowTTS implementation is excellent. Great work.
I currently have a Glow-TTS model that I trained on coqui-ai/TTS, and I'm looking to utilise it on mobile. However, I've been quite unsuccessful at converting the model to either TorchScript or ONNX. Has anyone had any success with this? If so, would you mind helping me out? Thanks