
Generic Open AI API compatible provider. (Deepseek, Helicone, LiteLLM etc) #885

Open
242816 opened this issue Jan 29, 2025 · 17 comments · May be fixed by #1349
Labels
enhancement New feature or request help wanted Great issue for non-Block contributors

Comments

@242816

242816 commented Jan 29, 2025

Most inference providers support the OpenAI API.

So this morning I wanted to use Helicone to track goose calls.

With Aider I would use this https://aider.chat/docs/llms/openai-compat.html

I tried to use the existing open ai provider https://github.com/block/goose/blob/main/crates/goose/src/providers/openai.rs

This didn't work, as it's too tightly tied to OpenAI.

So a generic provider should let me set the HOST, MODEL, and KEY for any provider I want.

Requirements

A new provider with the ability to set the following

OPENAI_API_BASE=
OPENAI_API_KEY=
OPENAI_API_MODEL=

Note the BASE needs to be everything up to and including the /v1, e.g. https://oai.helicione.ai/324324-234234-24324/v1
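A minimal sketch of how such a provider could resolve its configuration (hypothetical struct and function names; this is not goose's actual code, only an illustration of the proposal above):

```rust
use std::env;

/// Hypothetical config for a generic OpenAI-compatible provider;
/// the variable names follow the proposal above.
struct CompatConfig {
    base: String, // everything up to and including /v1
    key: String,
    model: String,
}

impl CompatConfig {
    /// Read the three settings from the environment.
    fn from_env() -> Result<Self, String> {
        let get = |k: &str| env::var(k).map_err(|_| format!("{k} is not set"));
        Ok(CompatConfig {
            base: get("OPENAI_API_BASE")?.trim_end_matches('/').to_string(),
            key: get("OPENAI_API_KEY")?,
            model: get("OPENAI_API_MODEL")?,
        })
    }

    /// Chat endpoint: the configured base plus the standard path.
    fn chat_url(&self) -> String {
        format!("{}/chat/completions", self.base)
    }
}

fn main() {
    // Fall back to a demo config when the variables are not set.
    let cfg = CompatConfig::from_env().unwrap_or(CompatConfig {
        base: "https://oai.example.com/v1".to_string(),
        key: "sk-demo".to_string(),
        model: "deepseek-chat".to_string(),
    });
    println!("POST {} (model {}, key {} chars)", cfg.chat_url(), cfg.model, cfg.key.len());
}
```

Because the full prefix (including any account segment, as in the Helicone example) lives in OPENAI_API_BASE, nothing about the path is hard-coded beyond the standard /chat/completions suffix.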

@willemsFEB

Second this. I'm trying to use Perplexity AI, which has an OpenAI-compatible API, but I'm not able to configure it.

@michaelneale
Collaborator

you can set export GOOST_HOST=... in ~/.zshrc for now, until the UI has support for it. Also, it's hard to know how well different hosts support the OpenAI API (it can't just be chat completions)

@michaelneale
Collaborator

@willemsFEB @242816 would it make sense to have openai, but then a separate "openai compatible" provider (separate choices)? The latter would ask for more config. Thoughts?

@digitalbuddha

That makes sense. You may also want to add things like the Azure API version there as an optional param.

@dailydaniel

+1 for openai like compatible provider in UI

@gzuuus

gzuuus commented Jan 29, 2025

+1 for openai compatible API. I think it would make more sense to have separate configurations for openai and openai-compatible APIs. This way, you can have more control over the API version of the OpenAI-compatible provider, which may be necessary to work around edge cases of mismatching API versions.

@VonLuderitz

+1 for openai like compatible provider in UI 🙏

@dmlond

dmlond commented Jan 29, 2025

+1 for an openai-compatible provider in the UI. Our institution is starting to evaluate LiteLLM, which uses an OpenAI-compatible API with its own API key. This will likely become more prevalent as businesses like LiteLLM spring up to offer cloud and on-prem ways to host open-source models.

@salman1993 salman1993 added the enhancement New feature or request label Jan 29, 2025
@willemsFEB

@michaelneale thanks for the suggestion, so I've set OPENAI_HOST=https://api.perplexity.ai (you mention GOOST_HOST, but wouldn't that be for the Google Gemini API?)

I'm able to get further; however, goose is now appending v1/chat/completions to the end of the URL, which is invalid for Perplexity, as it expects https://api.perplexity.ai/chat/completions, i.e. without the v1.

Any suggestions how I can bypass this?
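For context on why this fails: the path suffix is hard-coded, so any host that serves the API somewhere other than /v1 breaks. A small sketch of the difference (hypothetical helper functions, not goose's code):

```rust
/// Hard-coded joining, as described above: always inserts /v1.
fn hardcoded_url(host: &str) -> String {
    format!("{}/v1/chat/completions", host.trim_end_matches('/'))
}

/// Configurable alternative: the caller supplies the full base
/// (with or without a /v1 segment), and only the standard
/// /chat/completions path is appended.
fn configurable_url(base: &str) -> String {
    format!("{}/chat/completions", base.trim_end_matches('/'))
}

fn main() {
    // Perplexity serves the API without a /v1 segment,
    // so the hard-coded form produces the wrong URL for it:
    println!("{}", hardcoded_url("https://api.perplexity.ai"));
    println!("{}", configurable_url("https://api.perplexity.ai"));
}
```

Letting the user configure the entire base URL (rather than just the host) would cover both styles of provider.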

@michaelneale
Collaborator

Not at the moment. We'll need to enhance things to support that more generic configuration (which I think is a good idea). If you're able to, you can clone the repo and run `just run-ui`, temporarily changing openai.rs to do what you want in the meantime.

@242816
Author

242816 commented Feb 1, 2025

Is it possible to get goose to add this feature?

A prompt something like...

Using the existing `crates/goose/src/providers/openai.rs` 
create a new provider called `openai_compatible.rs`. 

This should have the following features.

- an env var `OPENAI_API_BASE` which sets the base URL
- `OPENAI_API_KEY`, which is passed as a Bearer token
- `OPENAI_API_MODEL`, which holds the model name

Or is it more complex than that?
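The three settings above map directly onto the request an OpenAI-compatible endpoint expects. A sketch of the header and body construction (hypothetical function, standard library only; a real provider would use an HTTP client and serde_json):

```rust
/// Build the auth header line and JSON body for a chat-completions call.
/// OPENAI_API_KEY becomes a Bearer token; OPENAI_API_MODEL goes in the body.
fn request_parts(key: &str, model: &str, prompt: &str) -> (String, String) {
    let auth = format!("Authorization: Bearer {key}");
    // Minimal hand-built JSON for illustration; real code should
    // serialize properly to handle quoting and escaping.
    let body = format!(
        r#"{{"model":"{model}","messages":[{{"role":"user","content":"{prompt}"}}]}}"#
    );
    (auth, body)
}

fn main() {
    let (auth, body) = request_parts("sk-test", "deepseek-chat", "hello");
    println!("{auth}");
    println!("{body}");
}
```

Since this is exactly the shape of an OpenAI request, the only provider-specific pieces really are the base URL, key, and model name, which supports the idea that the feature is mostly configuration.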

@baxen
Collaborator

baxen commented Feb 7, 2025

Yeah @242816, that should work as long as the host supports all the OpenAI features, including tools. Contributions welcome, but we will get started on this soon if no one from this thread has tried it out yet.

@salman1993
Collaborator

salman1993 commented Feb 10, 2025

there seems to be a lot of interest in an OpenAI-compatible provider 🙌🏼 if anyone is interested, this would be a great contribution. Otherwise, we will hopefully get to this in a couple of weeks and update here.
https://github.com/block/goose/blob/main/CONTRIBUTING.md

it should be similar to these providers:

@aceilort

Great, can't wait to use Goose with LM Studio. :)

@crabsinger

I definitely want to use Helicone to observe and iterate on goose's LLM interactions. I think this is required for that to happen, but if there's another way to change host and set an extra header, do let me know!

@ICeZer0

ICeZer0 commented Feb 16, 2025

Since Goose does not support LM Studio as an LLM provider, I built an Ollama proxy to convert your queries to the OpenAI format. It's working with MLX models too.

Check it out, hope it helps!
https://github.com/Embedded-Nature/ollama-proxy/

@fblissjr

fwiw - I added mlx-lm as a provider. it's barebones but it works.

https://github.com/fblissjr/goose-mlx

goose configure & session: [screenshot]

mlx_lm.server log: [screenshot]

@AnthonyRonning AnthonyRonning linked a pull request Feb 22, 2025 that will close this issue