Include file contents in llm chat #682

Curious, is there a way to inject the contents of a file while in the middle of a chat? If not, I'd love to be able to do something like:

!file foo.txt

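For illustration only, here is a hypothetical transcript of the requested behavior; the !file command does not exist in llm chat, so the exchange below is made up:

```
$ llm chat -m gpt-4o
Chatting with gpt-4o
Type 'exit' or 'quit' to exit
> !file foo.txt
> Summarize the file I just added.
```
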
Comments
Similarly, I'd love to be able to start a chat with a corpus of input (text files, images) plus an initial prompt, then continue the conversation, but

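For illustration only, a hypothetical invocation of that idea; llm chat does not currently accept attachment flags or an opening prompt, so every option below is made up:

```
llm chat -m gpt-4o -a notes.txt -a diagram.png "Here is the project so far; help me refactor it"
```
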
Something like @ instead of !file would be cleaner, I think; it's what GitHub's issue editor uses.

One way to do this might be an additional hook, prepare-prompt, which allows plugins to modify a prompt before it is executed by the model. The context could/should include file attachments, model id, etc. Something like:

```
context:
  attachments:
    - { filename, mimeType, handle }
  model-id:
  model-name:
  prompt:
  conversation-id:  # if continuing an existing conversation
```

You'd need handle and mimeType for when the input is not a file on disk but a pipe. Then people can make and activate their own plugins like:

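A minimal sketch of what such a plugin might look like, assuming the hypothetical prepare-prompt hook and context shape proposed above (only llm.hookimpl is a real part of llm's plugin system; the hook itself does not exist today):

```python
import llm


@llm.hookimpl
def prepare_prompt(context):
    # Hypothetical hook: expand "@filename" references in the prompt into
    # the contents of the matching attachment before the model sees it.
    # Both the hook name and the context dict shape are assumptions.
    prompt = context["prompt"]
    for attachment in context.get("attachments", []):
        filename = attachment.get("filename")
        if not filename:
            continue  # e.g. piped input identified only by handle/mimeType
        marker = "@" + filename
        if marker in prompt:
            with open(filename) as f:
                prompt = prompt.replace(marker, f.read())
    context["prompt"] = prompt
    return context
```
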
Working on something like this in #734

@Grynn I like this idea