
How to run StarCoder for inference on macOS and feasibility without a GPU? #131

Open
code2graph opened this issue Aug 28, 2023 · 0 comments

@code2graph

I'm interested in running StarCoder for inference on my macOS machine, but I have some questions.

Questions:

Library recommendations: I've come across OpenLLM. Is OpenLLM a good fit for this, or are there other recommended libraries/tools for running StarCoder on macOS?

Feasibility without a GPU on a MacBook Pro with 32 GB of RAM: Is it feasible to run StarCoder on a macOS machine without a GPU and still achieve reasonable latency during inference? (I understand that "reasonable" can be subjective.)
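For the feasibility question, a rough back-of-envelope estimate may help frame the discussion. The sketch below assumes StarCoder's roughly 15.5B parameter count (per its model card) and estimates the memory needed just to hold the weights at different precisions; activations, KV cache, and runtime overhead would come on top of this, so treat these as lower bounds.

```python
# Back-of-envelope memory estimate for holding StarCoder's weights in RAM.
# Assumption: ~15.5B parameters, per the StarCoder model card.
PARAMS = 15.5e9

def weight_memory_gib(bytes_per_param: float) -> float:
    """Approximate memory for the weights alone, in GiB."""
    return PARAMS * bytes_per_param / 2**30

for label, bpp in [("fp32", 4.0), ("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{label}: ~{weight_memory_gib(bpp):.1f} GiB")
```

On a 32 GB machine this suggests fp32 is out of the question, fp16 would consume nearly all available RAM before accounting for the KV cache and the OS, and 8-bit or 4-bit quantization is where CPU-only inference starts to look practical.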

Thank you!
