
Groq model via litellm: 'invalid_api_key' error #485

Closed
emanuelecavalleri opened this issue Nov 29, 2024 · 4 comments · Fixed by #486

Comments

@emanuelecavalleri
Collaborator

emanuelecavalleri commented Nov 29, 2024

Hi!

I'm currently playing around with the new release, in which the old model loading process was replaced by one driven by litellm (i.e., enabling the use of local models without requiring proxy configuration): it is a great enhancement!!

However, I believe I may have encountered a bug while using Groq as the LLM provider.

Steps to reproduce:

  1. Set the Groq API key via:
     export GROQ_API_KEY="my-groq-api-key"
     or via:
     runoak set-apikey -e groq-key my-groq-api-key

  2. Run the extract command:
     ontogpt extract -t template -i text.txt -m groq/llama3-70b-8192

  3. The process fails with the following error:
     Encountered error: <class 'litellm.exceptions.BadRequestError'>, Error: litellm.BadRequestError: GroqException - Error code: 401 - {'error': {'message': 'Invalid API Key', 'type': 'invalid_request_error', 'code': 'invalid_api_key'}}
     ERROR:ontogpt.clients.llm_client:No response or response is empty.

The extraction then falls back to the default GPT model, as expected when the specified model fails.

I looked into the code at src/ontogpt/clients/llm_client.py and added a debug statement at line 84, as in this gist. The print(response) call outputted:
ModelResponse(id='chatcmpl-49279d4b-790e-4e38-a1ed-10d50a0a02f1', choices=[Choices(finish_reason='stop', index=0, message=Message(content='Here are the extracted entities in the requested format:\n\nlocationOrganizationDocument: (empty)\n\nlocationLocationDocument: Africa contains Ghana; Africa contains Burkina Faso; Africa contains Ivory Coast; Africa contains Libya\n\npersonLocationDocument: (empty)\n\npersonOrganizationDocument: (empty)\n\nlabel: (no labelled entities in the text)\n\nLet me know if you need any further assistance!', role='assistant', tool_calls=None, function_call=None))], created=1732874318, model='groq/llama3-70b-8192', object='chat.completion', system_fingerprint='fp_753a4aecf6', usage=Usage(completion_tokens=76, prompt_tokens=379, total_tokens=455, completion_tokens_details=None, prompt_tokens_details=None, queue_time=0.00042437799999999956, prompt_time=0.024439082, completion_time=0.217142857, total_time=0.241581939), service_tier=None, x_groq={'id': 'req_01jdvnqcchevdvc5x8wj9jj98s'})

So the Groq request itself seems to work. Hope this might be helpful!

Emanuele and @andrealosi8

@caufieldjh
Member

Interesting - I'll take a look.
I don't have a Groq key so I'll need your help to verify that any fixes help.

@caufieldjh caufieldjh linked a pull request Dec 4, 2024 that will close this issue
@caufieldjh caufieldjh reopened this Dec 4, 2024
@caufieldjh
Member

OK, I think it should work now, as long as the extract command specifies the model provider as groq and the API key is set with runoak set-apikey -e groq-key my-groq-api-key. Please let me know if the BadRequestError still appears.

@caufieldjh
Member

And by "now" I mean with the current repo version of ontogpt (96ccbb0), not the pip release.

@emanuelecavalleri
Collaborator Author

Works now! Thanks Harry!
