Issue getting a response from my desktop #118
Comments
My first guess is that you didn't select the model correctly in the config. That's what I encountered anyway.
I have tried that. When I run it now with this in my config, I get no response.
I first encountered the same thing after switching to the model you tried, yi-coder:latest. Now it runs fine in nvim. My config looks like this: {
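For anyone following along, a minimal sketch of what such a setup might look like. This is an assumption-laden example, not the commenter's actual config: the option names (`model`, `host`, `port`, `debug`) are what I believe gen.nvim accepts, and the values are placeholders.

```lua
-- Sketch of a gen.nvim setup; option names and values are assumptions.
require('gen').setup({
  model = "yi-coder:latest", -- must match `ollama list` output exactly, tag included
  host = "localhost",        -- where the Ollama server is listening
  port = "11434",            -- Ollama's default port
  debug = true,              -- log requests/responses while troubleshooting
})
```

The key point from the comments above is the model string: if it does not match a model Ollama actually has (including the `:latest` tag), the plugin gets no completion back.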
Same here. The ollama logs get
Hi @Glavnokoman, interesting. Can you add Thanks and best regards,
Having the same issue on one of my laptops. I also turned on debug = true, but got no response. Changing the LLM from "llama3.1" to "llama3.1:latest" had no effect; still no output. The translation window opens but stays empty, debug or not. EDIT
Hey, I have an LLM running on my desktop, listening on all interfaces. For some reason I cannot get gen.nvim to give me a response.
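Since the server here is on a separate desktop rather than localhost, the plugin presumably needs to be pointed at that machine. A hedged sketch, assuming gen.nvim exposes `host` and `port` options; the address and model name are placeholders, not values from this thread:

```lua
-- Hypothetical sketch: pointing gen.nvim at a remote Ollama instance.
-- Option names are assumptions; the IP below is a placeholder.
require('gen').setup({
  model = "llama3.1:latest",
  host = "192.168.1.50", -- replace with the desktop's LAN address
  port = "11434",
  debug = true,          -- surface request/response details while debugging
})
```

If the window opens but stays empty even with a reachable host, checking the Ollama server logs on the desktop for incoming requests helps narrow down whether the request is arriving at all.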