Added exportFormat parameter to LLMChatView (#48)
# LLMChatView exportFormat parameter

## ♻️ Current situation & Problem
Currently, the LLMChatView creates its ChatView instance by hardcoding
.pdf as the exportFormat. As a result, there is no way to change the
export format or disable exporting entirely.

## ⚙️ Release Notes 
Instead of hardcoding .pdf when creating the ChatView, LLMChatView now
has an optional exportFormat parameter that is passed through to the
ChatView. It defaults to nil (so existing .init call sites need not
change) and can take any of .pdf, .json, or .text. If exportFormat is
.none (i.e., nil), no export button will appear in the toolbar.
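
For reference, the new initializer introduced by this change (shown in
full in the diff below) is:

```swift
public init(
    session: Binding<Session>,
    exportFormat: ChatView.ChatExportFormat? = nil
) {
    self._llm = session
    self.exportFormat = exportFormat
}
```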


## 📚 Documentation
The LLMChatView/init now accepts an optional exportFormat argument:
```swift
struct LLMChatTestView: View {
    // Use the convenience property wrapper to instantiate the `LLMMockSession`
    @LLMSessionProvider(schema: LLMMockSchema()) var llm: LLMMockSession
    @State var muted = true

    var body: some View {
        LLMChatView(session: $llm, exportFormat: .none)
            .speak(llm.context, muted: muted)
            .speechToolbarButton(muted: $muted)
    }
}
```
Alternatively, the exportFormat parameter may be omitted, in which case
it falls back to the initializer's default of nil and the LLMChatView
shows no export button.
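
In that case the argument is simply dropped; a minimal sketch, reusing
the mock types from the example above:

```swift
struct LLMDefaultChatTestView: View {
    // Instantiate the mock session via the convenience property wrapper
    @LLMSessionProvider(schema: LLMMockSchema()) var llm: LLMMockSession

    var body: some View {
        // exportFormat omitted: the initializer's default applies
        LLMChatView(session: $llm)
    }
}
```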


## ✅ Testing
Export functionality is rigorously tested as part of the UI testing for
the Spezi ChatView.

## 📝 Code of Conduct & Contributing Guidelines 

By creating this pull request, you agree to follow our [Code of Conduct](https://github.com/StanfordSpezi/.github/blob/main/CODE_OF_CONDUCT.md) and [Contributing Guidelines](https://github.com/StanfordSpezi/.github/blob/main/CONTRIBUTING.md):
- [X] I agree to follow the [Code of Conduct](https://github.com/StanfordSpezi/.github/blob/main/CODE_OF_CONDUCT.md) and [Contributing Guidelines](https://github.com/StanfordSpezi/.github/blob/main/CONTRIBUTING.md).

---------

Co-authored-by: Philipp Zagar <zagar@stanford.edu>
Co-authored-by: Paul Schmiedmayer <PSchmiedmayer@users.noreply.github.com>
3 people authored Mar 22, 2024
1 parent d6819a1 commit dc37b91
Showing 3 changed files with 29 additions and 11 deletions.
6 changes: 4 additions & 2 deletions Sources/SpeziLLM/SpeziLLM.docc/SpeziLLM.md
@@ -158,13 +158,15 @@ struct LLMDemoView: View {

### LLM Chat View

The ``LLMChatView`` and ``LLMChatViewSchema`` present a basic chat views that enables users to chat with a Spezi LLM in a typical chat-like fashion. The input can be either typed out via the iOS keyboard or provided as voice input and transcribed into written text.
The ``LLMChatView`` and ``LLMChatViewSchema`` present basic chat views that enable users to chat with a Spezi LLM in a typical chat-like fashion. The input can be either typed out via the iOS keyboard or provided as voice input and transcribed into written text.
The ``LLMChatViewSchema`` takes an ``LLMSchema`` instance to define which LLM in what configuration should be used for the text inference.
The ``LLMChatView`` is passed an ``LLMSession`` that represents the LLM in execution containing state and context.
The ``LLMChatView`` is passed an ``LLMSession`` that represents the LLM in execution containing state and context, and an optional `ChatView/ChatExportFormat` that defines the format of the to-be-exported `SpeziChat/Chat` (can be any of `.pdf`, `.text`, `.json`).

> Tip: The ``LLMChatView`` and ``LLMChatViewSchema`` build on top of the [SpeziChat package](https://swiftpackageindex.com/stanfordspezi/spezichat/documentation).
For more details, please refer to the DocC documentation of the [`ChatView`](https://swiftpackageindex.com/stanfordspezi/spezichat/documentation/spezichat/chatview).

> Tip: By default, the ``LLMChatView`` presents no share button in the toolbar that exports the current `SpeziChat/Chat`. To add this element or change the export functionality, pass the desired export format for the `exportFormat` parameter in ``LLMChatView/init(session:exportFormat:)``.
#### Usage

An example usage of the ``LLMChatViewSchema`` can be seen in the following example.
19 changes: 13 additions & 6 deletions Sources/SpeziLLM/Views/LLMChatView.swift
@@ -13,7 +13,7 @@ import SwiftUI

/// Chat view that enables users to interact with an LLM based on an ``LLMSession``.
///
/// The ``LLMChatView`` takes an ``LLMSession`` instance as parameter within the ``LLMChatView/init(session:)``. The ``LLMSession`` is the executable version of the LLM containing context and state as defined by the ``LLMSchema``.
/// The ``LLMChatView`` takes an ``LLMSession`` instance and an optional `ChatView/ChatExportFormat` as parameters within the ``LLMChatView/init(session:exportFormat:)``. The ``LLMSession`` is the executable version of the LLM containing context and state as defined by the ``LLMSchema``.
///
/// The input can be either typed out via the iOS keyboard or provided as voice input and transcribed into written text.
///
@@ -28,6 +28,7 @@ import SwiftUI
/// The next code examples demonstrate how to use the ``LLMChatView`` with ``LLMSession``s.
///
/// The ``LLMChatView`` must be passed a ``LLMSession``, meaning a ready-to-use LLM, resulting in the need for the developer to manually allocate the ``LLMSession`` via the ``LLMRunner`` and ``LLMSchema`` (which includes state management).
/// The ``LLMChatView`` may also be passed a `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`; possible export formats are `.pdf`, `.text`, and `.json`.
///
/// In order to simplify the usage of an ``LLMSession``, SpeziLLM provides the ``LLMSessionProvider`` property wrapper that conveniently instantiates an ``LLMSchema`` to an ``LLMSession``.
/// The `@LLMSessionProvider` wrapper abstracts away the necessity to use the ``LLMRunner`` from the SwiftUI `Environment` within a `.task()` view modifier to instantiate the ``LLMSession``.
@@ -42,7 +43,7 @@ import SwiftUI
/// @State var muted = true
///
/// var body: some View {
/// LLMChatView(session: $llm)
/// LLMChatView(session: $llm, exportFormat: .pdf)
/// .speak(llm.context, muted: muted)
/// .speechToolbarButton(muted: $muted)
/// }
@@ -51,18 +52,19 @@ import SwiftUI
public struct LLMChatView<Session: LLMSession>: View {
/// The LLM in execution, as defined by the ``LLMSchema``.
@Binding private var llm: Session

/// Indicates if the input field is disabled.
@MainActor private var inputDisabled: Bool {
llm.state.representation == .processing
}
/// Defines the export format of the to-be-exported `SpeziChat/Chat`
private let exportFormat: ChatView.ChatExportFormat?


public var body: some View {
ChatView(
$llm.context,
disableInput: inputDisabled,
exportFormat: .pdf,
exportFormat: exportFormat,
messagePendingAnimation: .automatic
)
.viewStateAlert(state: llm.state)
@@ -92,9 +94,14 @@ public struct LLMChatView<Session: LLMSession>: View {
/// Creates a ``LLMChatView`` with a `Binding` of a ``LLMSession`` that provides developers with a basic chat view to interact with a Spezi LLM.
///
/// - Parameters:
/// - model: A `Binding` of a ``LLMSession`` that contains the ready-to-use LLM to generate outputs based on user input.
public init(session: Binding<Session>) {
/// - session: A `Binding` of a ``LLMSession`` that contains the ready-to-use LLM to generate outputs based on user input.
/// - exportFormat: An optional `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`.
public init(
session: Binding<Session>,
exportFormat: ChatView.ChatExportFormat? = nil
) {
self._llm = session
self.exportFormat = exportFormat
}
}

15 changes: 12 additions & 3 deletions Sources/SpeziLLM/Views/LLMChatViewSchema.swift
@@ -6,6 +6,7 @@
// SPDX-License-Identifier: MIT
//

import SpeziChat
import SwiftUI


@@ -32,21 +33,29 @@ import SwiftUI
/// }
/// }
/// ```
///
/// The ``LLMChatViewSchema`` may also be passed a `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`; possible export formats are `.pdf`, `.text`, and `.json`.
public struct LLMChatViewSchema<Schema: LLMSchema>: View {
@LLMSessionProvider<Schema> var llm: Schema.Platform.Session
private let exportFormat: ChatView.ChatExportFormat?


public var body: some View {
LLMChatView(session: $llm)
LLMChatView(session: $llm, exportFormat: exportFormat)
}


/// Creates a ``LLMChatViewSchema`` with an ``LLMSchema`` that provides developers with a basic chat view to interact with a Spezi LLM.
///
///
/// - Parameters:
/// - schema: The ``LLMSchema`` that defines the to-be-used LLM to generate outputs based on user input.
public init(with schema: Schema) {
/// - exportFormat: An optional `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`.
public init(
with schema: Schema,
exportFormat: ChatView.ChatExportFormat? = nil
) {
self._llm = LLMSessionProvider(schema: schema)
self.exportFormat = exportFormat
}
}
