
Add support for string message in Bedrock textgen #1291

Open
wants to merge 3 commits into base: main

Conversation

jonminkin97
Contributor

Description

Enable llm-textgen using Bedrock to accept a string message as well as a list of chat messages.
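
For reference, a minimal sketch of how a plain string could be normalized into the existing chat-message format before the request is forwarded to Bedrock. This is illustrative only, not the PR's actual diff; the helper name normalize_messages is hypothetical.

from typing import Dict, List, Union


def normalize_messages(messages: Union[str, List[Dict[str, str]]]) -> List[Dict[str, str]]:
    """Accept either a raw string or an OpenAI-style list of chat messages."""
    if isinstance(messages, str):
        # Wrap a bare string as a single user turn.
        return [{"role": "user", "content": messages}]
    return messages


# Example:
# normalize_messages("What is Deep Learning?")
# -> [{"role": "user", "content": "What is Deep Learning?"}]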

Issues

n/a

Type of change

List the type of change as below. Options that are not relevant have been deleted.

  • Bug fix (non-breaking change which fixes an issue)

Dependencies

n/a

Tests

Build and run the opea/llm-textgen container. Verify that both the existing chat-message format and the new string message format work.

# Existing functionality
curl http://${host_ip}:9009/v1/chat/completions \
  -X POST \
  -d '{"model": "us.anthropic.claude-3-5-haiku-20241022-v1:0", "messages": [{"role": "user", "content": "What is Deep Learning?"}], "max_tokens":17}' \
  -H 'Content-Type: application/json'
# New functionality
curl http://${host_ip}:9009/v1/chat/completions \
  -X POST \
  -d '{"model": "us.anthropic.claude-3-5-haiku-20241022-v1:0", "messages": "What is Deep Learning?", "max_tokens":17}' \
  -H 'Content-Type: application/json'
