
[Bug]: Gemini with prompt caching fails #4975

Closed
1 task done
enyst opened this issue Nov 13, 2024 · 4 comments
Labels
bug Something isn't working severity:medium Affecting multiple users Stale Inactive for 30 days

Comments


enyst commented Nov 13, 2024

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

Gemini with prompt caching fails with an exception:

litellm.APIConnectionError: GeminiException - Gemini Context Caching only supports 1 message/block of continuous messages. Your idx, messages were - [(0, {'content': [{'type': 'text', 'text': 'A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed answers to the user\'s questions.\n\n[1] The assistant can use a Python environment with <execute_ipython>...

liteLLM reports that Gemini supports prompt caching, along with Claude, GPT models, and Deepseek. Unlike those, however, Gemini's caching is neither automatic (silently ignoring the cache attribute) nor compatible with Anthropic's format.
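To illustrate the constraint behind the error: Anthropic-style prompt caching marks individual content blocks with a `cache_control` attribute, and the litellm Gemini adapter requires all cached blocks to form a single contiguous run of messages. A minimal sketch (the message texts are hypothetical, shaped like the agent's system prompt above) showing a layout that violates this:

```python
# Hypothetical message list: cache markers at indices 0 and 2, with an
# unmarked message in between. Gemini's context caching in litellm only
# accepts one contiguous block of cached messages, so this layout raises
# the "only supports 1 message/block of continuous messages" error.
messages = [
    {"role": "system",
     "content": [{"type": "text",
                  "text": "A chat between a curious user and an AI assistant.",
                  "cache_control": {"type": "ephemeral"}}]},
    {"role": "user", "content": "Run the tests."},  # no cache marker
    {"role": "user",
     "content": [{"type": "text",
                  "text": "Repository guidelines ...",
                  "cache_control": {"type": "ephemeral"}}]},
]

def cached_indices(msgs):
    """Return the indices of messages that carry a cache_control block."""
    out = []
    for i, m in enumerate(msgs):
        blocks = m["content"] if isinstance(m["content"], list) else []
        if any("cache_control" in b for b in blocks):
            out.append(i)
    return out

def is_contiguous(indices):
    """True when the cached messages form one continuous run."""
    if not indices:
        return True
    return indices == list(range(indices[0], indices[-1] + 1))

print(cached_indices(messages))  # [0, 2] -- not contiguous, so Gemini rejects it
```

This is only a sketch of the shape litellm validates, not litellm's actual validation code.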

OpenHands Installation

Development workflow

OpenHands Version

No response

Operating System

None

Logs, Errors, Screenshots, and Additional Context

No response

@enyst enyst added the bug Something isn't working label Nov 13, 2024
@enyst enyst self-assigned this Nov 13, 2024
@mamoodi mamoodi added the severity:medium Affecting multiple users label Nov 13, 2024

enyst commented Nov 14, 2024

This no longer fails on current main, because we changed the agent's behavior again: prompt caching is now applied only for Anthropic models.
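The fix described above amounts to gating the cache markers on the provider. A minimal sketch (the helper name and the model-string check are hypothetical, not OpenHands' actual code):

```python
def apply_prompt_caching(messages, model):
    """Hypothetical helper: attach Anthropic-style cache_control markers to
    the system prompt only for Claude models; every other provider (Gemini
    included) gets the messages back untouched, avoiding the error above."""
    if "claude" not in model and not model.startswith("anthropic"):
        return messages  # non-Anthropic model: no cache markers at all
    marked = []
    for m in messages:
        m = dict(m)  # shallow copy so the caller's list is not mutated
        if m["role"] == "system" and isinstance(m["content"], str):
            m["content"] = [{"type": "text",
                             "text": m["content"],
                             "cache_control": {"type": "ephemeral"}}]
        marked.append(m)
    return marked

msgs = [{"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "hi"}]

# Gemini: messages pass through unchanged, so no caching exception is raised.
print(apply_prompt_caching(msgs, "gemini/gemini-1.5-pro") == msgs)  # True
```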


This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale Inactive for 30 days label Dec 19, 2024
@enyst enyst removed the Stale Inactive for 30 days label Dec 21, 2024

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale Inactive for 30 days label Jan 26, 2025

github-actions bot commented Feb 2, 2025

This issue was closed because it has been stalled for over 30 days with no activity.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Feb 2, 2025
Projects
None yet
Development

No branches or pull requests

2 participants