Add support for callbacks on streamed response #322

Open
abdelkd opened this issue Jan 5, 2025 · 0 comments
Labels
component:js sdk, status:triaged, type:feature request

Comments

@abdelkd
Contributor

abdelkd commented Jan 5, 2025

Description of the feature request:

Add callback support to streamed responses so that developers can easily manage streamed chat generation, for example to save the prompt or response to a database or to make any other server-side call when generation completes.

What problem are you trying to solve with this feature?

This feature would improve developer experience (DX). Currently, the user has to manage the stream manually. For example, I had to implement the following function to get a streamed response with callbacks:

// `model` is a GenerativeModel instance from @google/generative-ai.
export const getStreamedAIResponse = async ({ prompt, onData, onEnd }) => {
  const response = await model.generateContentStream([prompt]);

  let textBuf = '';

  // The SDK exposes the stream as an async iterable of chunks.
  for await (const chunk of response.stream) {
    const currentText = chunk.text();
    onData(currentText); // called for every chunk received from the LLM API
    textBuf += currentText;
  }

  // Called once with the full accumulated text when generation finishes.
  await onEnd(textBuf);
};

What this function does is allow the user to register onData and onEnd callbacks. The server then calls onData on every chunk received from the LLM API, and onEnd once the response has finished generating. This way, it provides the user with a chat-like streamed response without having to wait for the entire response, and it also lets the developer save the response at the end of generation without relying on any client-side callback.
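
To illustrate what it takes today, here is a minimal sketch of wiring the helper above into a web-standard Response in a route handler, streaming chunks to the client as they arrive (saveToDatabase is a hypothetical persistence helper):

// Sketch: wiring the helper into a Response manually, as required today.
export async function POST() {
  const encoder = new TextEncoder();

  const stream = new ReadableStream({
    async start(controller) {
      await getStreamedAIResponse({
        prompt: 'Tell me a story.',
        // Forward each chunk to the client as soon as it arrives.
        onData: (chunk) => controller.enqueue(encoder.encode(chunk)),
        // Persist the full text once generation finishes, then close the stream.
        onEnd: async (fullText) => {
          await saveToDatabase(fullText); // hypothetical persistence helper
          controller.close();
        },
      });
    },
  });

  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}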

It would be a good idea to expose a function like this to the developer:

const response = generateStreamedResponse([prompt, { fileData: { fileUri, mimeType } }], {
  onData: (chunk) => {},
  onEnd: (response) => {},
  onError: (error) => {}
});

The returned value would be a web-standard Response, which would allow directly returning it from route handlers in frameworks like Next.js and SvelteKit.
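
With the proposed API, the manual route handler sketched earlier would collapse to roughly the following (again a sketch, assuming the proposed generateStreamedResponse and the same hypothetical saveToDatabase helper):

// app/api/chat/route.js (Next.js App Router), assuming the proposed API
export async function POST(req) {
  const { prompt } = await req.json();

  return generateStreamedResponse([prompt], {
    onData: (chunk) => {},                         // e.g. log or meter each chunk
    onEnd: (response) => saveToDatabase(response), // hypothetical persistence helper
    onError: (error) => console.error(error),
  });
}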

Any other information you'd like to share?

This could also be beneficial in Express.js and Hono; see the sketch below.
I am ready to implement it and file a PR.
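
In Express, where there is no web-standard Response to return, the same callbacks map naturally onto res.write and res.end. A rough sketch using the getStreamedAIResponse helper from above (saveToDatabase is again hypothetical):

import express from 'express';

const app = express();
app.use(express.json());

app.post('/chat', async (req, res) => {
  res.setHeader('Content-Type', 'text/plain; charset=utf-8');

  await getStreamedAIResponse({
    prompt: req.body.prompt,
    onData: (chunk) => res.write(chunk), // flush each chunk to the client
    onEnd: async (fullText) => {
      await saveToDatabase(fullText);    // hypothetical persistence helper
      res.end();
    },
  });
});

app.listen(3000);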

@Gunand3043 added type:feature request, status:triaged, and component:js sdk labels on Jan 6, 2025
@IvanLH assigned IvanLH and unassigned pamorgan and IvanLH on Feb 26, 2025