Replies: 1 comment
I solved this by implementing something similar to what's done in
I love the built-in tool calling in the AI SDK; it works great. However, I currently need to handle tool calls on my own and pass the raw `tool_calls` and `tool` messages through to the chat completion, having `generateText()` leave them alone. Essentially, I need to know how to format my tool call(s) and result(s) so that they mimic what `generateText()` does internally with multi-step calling.

If I were doing a raw chat completion with OpenAI, I'd structure my tool call/result like this:
However, adding such `messages` to `generateText()` produces type errors. How should I be formatting them so that the completed call + result messages are part of what the LLM receives?