Feature/83694 streaming markdown handling #106
Conversation
content={text}
className={`webchat-${classType}-template-header`}
id={webchatButtonTemplateTextId}
Why was this removed? This could potentially break some tests in the Webchat repo.
You are right, I will revert this!
When I have two say nodes one after the other, and the first one has a longer text than the second one, I would expect the streaming animation of the first one to be completed once the next one is complete. See the recording below: the third message has stopped streaming, but the second message is still streaming, which I only noticed after the scrollbar flickered slightly.
Code looks clean! Great work on the animation :)
Yes, I have seen this as well. I originally added the animation just for streamed messages, but then we decided to also enable it for say outputs. So you would see it if you have several say outputs in a row, especially longer ones. To be honest, I think someone who sends outputs like that should not use this setting. But we can add more logic to handle the animation message by message in the future if requested.
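For reference, a rough sketch of what such message-by-message sequencing could look like. None of this is in the PR; the names are hypothetical:

```typescript
// Hypothetical sketch: chain each message's typewriter animation onto the
// previous one, so a new message only starts animating once the earlier
// message has finished. Not part of this PR.
let animationQueue: Promise<void> = Promise.resolve();

function enqueueAnimation(runAnimation: () => Promise<void>): Promise<void> {
  // Each caller passes a function that starts its own animation and resolves
  // when it is done; the queue serializes them in arrival order.
  animationQueue = animationQueue.then(runAnimation);
  return animationQueue;
}
```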
Fixed by adding a plugin for tables and some other markdown (https://github.com/remarkjs/react-markdown#use-a-plugin). Please remove the triple backticks in your output so that it is rendered.
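For context, a minimal sketch of wiring a remark plugin into react-markdown for table support, following the linked docs; the component name and prop around it are illustrative, not the actual chat-components code:

```tsx
import ReactMarkdown from "react-markdown";
import remarkGfm from "remark-gfm";

// remark-gfm adds GitHub-flavoured markdown (tables, strikethrough, etc.)
// to react-markdown via the remarkPlugins prop.
const MarkdownText = ({ text }: { text: string }) => (
  <ReactMarkdown remarkPlugins={[remarkGfm]}>{text}</ReactMarkdown>
);
```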
I just released v0.39.0 of chat-components. Chat-components should be updated to 0.40.0.
This PR adds support for markdown rendering and streaming messages (animation + message chunk collation) to the chat components. There is a new view on the demo page as well.
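To illustrate the chunk collation idea, here is a simplified sketch; the Message shape and function name are illustrative, not the actual chat-components types:

```typescript
interface Message {
  id: string;
  text: string;
  finished: boolean;
}

// Append an incoming stream chunk to the message it belongs to, so the UI
// renders a single bubble that grows as chunks arrive instead of showing one
// bubble per chunk.
function collateChunk(
  messages: Message[],
  chunk: { id: string; text: string; finished?: boolean }
): Message[] {
  const existing = messages.find((m) => m.id === chunk.id);
  if (!existing) {
    // First chunk for this message: create a new entry.
    return [...messages, { id: chunk.id, text: chunk.text, finished: !!chunk.finished }];
  }
  // Later chunks are appended to the existing text.
  return messages.map((m) =>
    m.id === chunk.id
      ? { ...m, text: m.text + chunk.text, finished: !!chunk.finished }
      : m
  );
}
```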
For testing, check out the Webchat branch of the same name with the PR linked here (Cognigy/Webchat#55). Run
npm ci && npm pack
in this repo, then
npm i ../chat-components/cognigy-chat-components-0.38.0.tgz && npm run dev
in the Webchat repo (correct the path if it is located somewhere else for you). In the local Webchat index.html you need to enable:
behavior: {
  collateStreamedOutputs: true,
  progressiveStreaming: true,
  renderMarkdown: false,
}
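For orientation, a minimal sketch of how that block might be plugged into the local index.html, assuming the page initializes the widget via a global initWebchat(endpointUrl, options) and that these flags sit under settings.behavior; the exact nesting and the endpoint URL are assumptions, not the documented setup:

```typescript
// Assumed global provided by the Webchat bundle; the signature is an assumption.
declare const initWebchat: (
  endpointUrl: string,
  options?: Record<string, unknown>
) => void;

initWebchat("https://your-endpoint-url", {
  settings: {
    behavior: {
      collateStreamedOutputs: true,
      progressiveStreaming: true,
      renderMarkdown: false,
    },
  },
});
```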
Please test directly with the related PR in Cognigy.AI (coming up).