This repository has been archived by the owner on Jan 27, 2024. It is now read-only.

Serve pretrained model as API #11

Closed
masus04 opened this issue Aug 4, 2021 · 5 comments

Comments

@masus04

masus04 commented Aug 4, 2021

I am trying to start a server using a pretrained model. Am I correct in assuming that server.py is used to start an inference server?

As far as I understand, the available models are configured in the /available_models/conf.json file. Is there any documentation on the options available for this config file?

@goncalomcorreia
Collaborator

Hi, this is an old fork of OpenNMT-py, so this seems like an issue specific to that repository. I never used the server.py script, so I don't know whether the changes in this fork affect it in any way.

@masus04
Author

masus04 commented Aug 6, 2021

Ok, thank you.
Is there any other existing way to serve a pretrained model?

@goncalomcorreia
Collaborator

Not that I know of.

@goncalomcorreia
Collaborator

You can check the documentation for server.py in the original OpenNMT-py repository. If you find the issue, I'll gladly accept a PR :)
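For reference, the upstream OpenNMT-py inference server reads its model list from a JSON config passed via --config. A minimal conf.json might look like the sketch below; the field names come from upstream OpenNMT-py's server, the model filename is a placeholder, and this fork may behave differently:

```json
{
  "models_root": "./available_models",
  "models": [
    {
      "id": 100,
      "model": "model.pt",
      "timeout": 600,
      "on_timeout": "to_cpu",
      "load": true,
      "opt": {
        "beam_size": 5
      }
    }
  ]
}
```

With that in place, upstream's server is typically started with something like `python server.py --ip 0.0.0.0 --port 5000 --config ./available_models/conf.json`; the upstream README and server.py docstrings describe the remaining options.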

@masus04
Author

masus04 commented Aug 6, 2021

Alright, will check, thank you 👍

masus04 changed the title from "Documentation: Starting the server.py" to "Serve pretrained model as API" on Aug 6, 2021.
2 participants