
[Bug/Assistance] how to use local model to replace gpt3.5? #150

Open
lambda7xx opened this issue Jul 19, 2024 · 2 comments
Labels
bug Something isn't working help wanted Extra attention is needed

Comments

@lambda7xx

Describe the bug
A clear and concise description of what the bug is.

To Reproduce
Steps to reproduce the behavior:

  1. Go to '...'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error

Screenshots or Terminal Copy&Paste
If applicable, add screenshots to help explain your problem.

Desktop (please complete the following information):

  • OS: [e.g. Ubuntu 22.04]
  • Python: [e.g. 3.9]

Additional context
Add any other context about the problem here.

@lambda7xx lambda7xx added bug Something isn't working help wanted Extra attention is needed labels Jul 19, 2024
@lambda7xx (Author)

If I want to use a local model like llama3, how do I run the bench with it?

@TheodorAI

If I want to use a local model like llama3, how do I run the bench with it?

Use FastChat!
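The suggestion refers to FastChat (lm-sys/FastChat), which can serve a local model such as llama3 behind an OpenAI-compatible API, so the bench's gpt-3.5 requests can be redirected to the local server. A minimal sketch, assuming a Hugging Face model path and default ports (neither is specified in this thread):

```shell
# Install FastChat with model-worker support.
pip3 install "fschat[model_worker]"

# 1. Start the controller that coordinates model workers.
python3 -m fastchat.serve.controller &

# 2. Launch a worker serving the local model (model path is an example).
python3 -m fastchat.serve.model_worker \
    --model-path meta-llama/Meta-Llama-3-8B-Instruct &

# 3. Expose an OpenAI-compatible REST API on localhost:8000.
python3 -m fastchat.serve.openai_api_server --host localhost --port 8000 &

# Point the bench's OpenAI client at the local server instead of api.openai.com.
# (Exact variable names depend on how the bench reads its configuration.)
export OPENAI_API_BASE="http://localhost:8000/v1"
export OPENAI_API_KEY="EMPTY"
```

After that, replacing the model name (e.g. "gpt-3.5-turbo") in the bench's configuration with the name of the locally served model should route all completions through FastChat.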
