Feature/83694 streaming markdown handling #55
base: main
Conversation
The chat does not auto-scroll to the bottom anymore. Was the scroll button implemented as a workaround for this? Update: The scroll behavior is not consistent; it behaves differently every time.
Scrolling works as expected with the new setting :)
When using collated streamed messages and leaving the window, a notification sound plays for every incoming chunk. We should probably emit only one sound per message ID.
A marker on the webchat shows the number of missed messages. This marker is never reset: it remains at whatever number was reached and only ever increases.
This is a general regression and should be handled separately. I forwarded this to the PO and he created a ticket: https://cognigy.visualstudio.com/Boron/_workitems/edit/91476
Fixed.
Fixed.
Success criteria
Please describe what should be possible after this change. List each item on a separate line.
This PR adds markdown rendering and streaming-message support (animation plus message-chunk collation) to the chat components. There is also a new view on the demo page.
How to test
Please describe the individual steps a peer can follow to test your change.
For testing, check out the chat-components branch of the same name via the PR linked here: Cognigy/chat-components#106
Run `npm ci && npm pack` in the chat-components repo, then run
`npm i ../chat-components/cognigy-chat-components-0.38.0.tgz && npm run dev`
in (this) Webchat repo (correct the path if the tarball is somewhere else for you). In the local Webchat index.html you need to enable:
```js
behavior: {
  collateStreamedOutputs: true,
  progressiveStreaming: true,
  renderMarkdown: false,
}
```
Alternatively, test the settings via the Endpoint settings.
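As an illustrative sketch of how these flags might be wired up for a local test: only the three `behavior` flag names come from this PR; the default values and the `withBehavior()` helper below are assumptions for illustration, not part of the Webchat API.

```javascript
// Hypothetical helper for assembling the behavior settings before
// passing them to the local Webchat embedding. Defaults and the
// helper itself are assumptions; only the flag names come from this PR.
const defaultBehavior = {
  collateStreamedOutputs: false,
  progressiveStreaming: false,
  renderMarkdown: false,
};

function withBehavior(overrides) {
  // Merge test overrides onto the assumed defaults.
  return { behavior: { ...defaultBehavior, ...overrides } };
}

// Enable chunk collation and progressive streaming for testing.
const settings = withBehavior({
  collateStreamedOutputs: true,
  progressiveStreaming: true,
});

console.log(JSON.stringify(settings.behavior));
```

The resulting `settings` object mirrors the snippet above; flipping `renderMarkdown` on or off the same way lets you compare both rendering paths.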
Please test directly with the related PR in Cognigy.AI (coming up).
Security
Additional considerations
Documentation Considerations
These are hints for the documentation team to help write the docs.