
Chat with Video #23

Merged
merged 8 commits into main on Dec 14, 2023

Conversation

movchan74
Contributor

Summary:
This Pull Request delivers a comprehensive upgrade for the chat with video use case, adding streaming video indexing, chat support in the vLLM deployment, and a chat-with-video endpoint.

Key Changes:

  1. Full Pipeline for Chat with Video Use Case: A complete pipeline has been implemented to support the chat with video use case.

  2. Streaming Video Indexing: A new endpoint has been added for indexing videos, capable of extracting and streaming transcripts and captions. This enables real-time processing and presentation of video content.

  3. File Storage for Transcripts and Captions: Initially, generated transcripts, captions, and combined timelines are stored as files. Plans are underway to migrate these to a database for better management and retrieval.

  4. Chat Endpoints in vLLM Deployment: Integration of chat endpoints into the vLLM deployment, supporting dialog (a list of messages) as input.

  5. Customizable Chat Templates: Added support for chat templates with the flexibility to provide custom templates when the LLM's tokenizer lacks a chat template.

  6. LLM-based Chat for Video Content: Introducing an endpoint that leverages an LLM to facilitate chat based on timelines generated from video transcripts and captions.

  7. Example Notebook for Video Chat Use Case: A notebook has been included to demonstrate the usage and capabilities of the new chat with video feature.

  8. Testing for Chat Templates and Methods: Tests have been added for the newly implemented chat templates and chat functionalities in the vLLM deployment.
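To illustrate points 4 and 5 above, here is a minimal sketch of what a dialog (a list of role/content messages) and a fallback chat template might look like. The function name and the `<|role|>` delimiter format are illustrative assumptions, not the actual aana or vLLM API.

```python
def apply_simple_chat_template(dialog: list[dict]) -> str:
    """Render a dialog into a single prompt string using a minimal
    fallback template, for the case where the LLM's tokenizer does
    not ship a chat template of its own (illustrative only)."""
    parts = []
    for message in dialog:
        # Each message contributes a role header followed by its content.
        parts.append(f"<|{message['role']}|>\n{message['content']}")
    # Trailing assistant header cues the model to generate a reply.
    parts.append("<|assistant|>\n")
    return "\n".join(parts)

# A dialog in the shape described in point 4: a list of messages.
dialog = [
    {"role": "system", "content": "Answer questions about the video."},
    {"role": "user", "content": "What is discussed in the intro?"},
]
prompt = apply_simple_chat_template(dialog)
```

In practice, a custom template like this would only be used when the tokenizer lacks a built-in chat template, as described in point 5.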

@evanderiel (Collaborator) left a comment:

Okay, apart from the question about MediaIdException.

aana/api/app.py (review thread resolved)
aana/utils/video.py (review thread resolved)
@movchan74 movchan74 requested a review from evanderiel December 14, 2023 11:05
@movchan74 movchan74 merged commit a597569 into main Dec 14, 2023
2 checks passed
@movchan74 movchan74 deleted the chat_with_video branch December 14, 2023 13:04