
[editor] Streaming Outputs #798

Merged
merged 1 commit into from
Jan 6, 2024

Conversation


@rholinshead rholinshead commented Jan 6, 2024

[editor] Streaming Outputs

Add client-side handling for streaming outputs. We currently derive whether the request should stream by checking the prompt's settings. I'm not a huge fan of this approach, since:

  • it couples the client to the config runtime's logic for resolving the 'stream' value
  • it assumes the models actually use a 'stream' setting

This is sufficient for now, I think. Note that I do think it's nice to let the editor user toggle between streaming and non-streaming, since it will be useful for debugging their model parser.
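As a rough sketch of the coupling described above, the client-side resolution of the 'stream' flag might look something like the following. The names here (PromptSettings, getStreamSetting, the prompt-over-global precedence, and the default of true) are illustrative assumptions, not the actual editor API:

```typescript
// Hypothetical sketch: derive whether a run request should stream by
// inspecting settings, mirroring the config runtime's resolution logic.
type PromptSettings = { stream?: boolean; [key: string]: unknown };

function getStreamSetting(
  promptSettings: PromptSettings | undefined,
  globalModelSettings: PromptSettings | undefined
): boolean {
  // Prompt-level setting takes precedence over global model settings;
  // default to streaming when neither specifies a value (assumption).
  if (promptSettings?.stream !== undefined) {
    return promptSettings.stream;
  }
  if (globalModelSettings?.stream !== undefined) {
    return globalModelSettings.stream;
  }
  return true;
}
```

This is exactly the kind of duplicated logic the bullets above call out: if the config runtime changes how it resolves 'stream', this client-side copy has to change in lockstep.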

Testing:

  • Error handling: added return HttpResponseWithAIConfig(message="No AIConfig loaded", code=400, aiconfig=None).to_flask_format() in /run with 'stream' set to true and ensured the error is shown as a notification and the run state is stopped on the client
  • Non-streaming model (dall-e) runs correctly
  • Streaming works for models with the 'stream' setting set to true/false:
Screen.Recording.2024-01-05.at.10.15.06.PM.mov
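The error-handling flow exercised above can be sketched roughly as follows. The endpoint shape, field names, and callbacks (runPrompt, onChunk, onError, /api/run) are illustrative assumptions, not the editor's actual code:

```typescript
// Hypothetical sketch of the client-side run: POST to the run endpoint,
// then either consume the streamed body chunk-by-chunk or surface a JSON
// error (e.g. the 400 "No AIConfig loaded" case) and stop the run state.
async function runPrompt(
  promptName: string,
  stream: boolean,
  onChunk: (text: string) => void,
  onError: (message: string) => void
): Promise<void> {
  const res = await fetch("/api/run", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt_name: promptName, stream }),
  });

  if (!res.ok) {
    // Error responses are assumed to be JSON, e.g. { "message": "No AIConfig loaded" }.
    const err = await res.json();
    onError(err.message ?? `Run failed with status ${res.status}`);
    return; // run state stops here; caller shows the notification
  }

  if (stream && res.body) {
    // Streaming path: read the response body incrementally.
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      onChunk(decoder.decode(value, { stream: true }));
    }
  } else {
    // Non-streaming path (e.g. dall-e): deliver the whole body at once.
    onChunk(await res.text());
  }
}
```

The key point tested above is the error branch: a non-OK response must not be fed into the streaming reader, but instead surfaced as a notification with the run state reset.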


@jonathanlastmileai jonathanlastmileai left a comment


nice!

@rholinshead rholinshead merged commit 54de9cc into main Jan 6, 2024
2 checks passed