SpeziLLMOpenAI: Replace MacPaw/OpenAI With Generated API Calls #64

Open · wants to merge 59 commits into base: main · Changes shown from 1 commit

Commits (59):
0cfbf29
SpeziLLMOpenAI: openapi-generator infrastructure
paulhdk Dec 16, 2024
d340a10
SpeziLLMOpenAI: set "public" as the default access modifier generated
paulhdk Aug 27, 2024
ce640b4
SpeziLLMOpenAI: set a text/event-stream schema for the chat/completions
paulhdk Aug 27, 2024
6ac80c2
SpeziLLMOpenAI: add a ClientMiddleware that injects the API key into
paulhdk Aug 27, 2024
0b02090
SpeziLLMOpenAI: replace uses of MacPaw/OpenAI with generated API calls
paulhdk Dec 16, 2024
088b52d
UITests: update SpeziLLMOpenAI tests for generated API calls
paulhdk Aug 27, 2024
40acb59
LLMOpenAI: remove redundant marker
paulhdk Sep 10, 2024
cc31e35
LLMOpenAI: filter out "[DONE]" event in streamed responses
paulhdk Sep 12, 2024
a6035ef
LLMOpenAI: remove FIXIT re: Swiftformat bug
paulhdk Sep 12, 2024
d9411be
LLMOpenAI: address FIXMEs re: @Parameter
paulhdk Sep 12, 2024
52baf18
LLMOpenAI: Refactor setup() to reduce its length
paulhdk Sep 12, 2024
98c84fc
LLMOpenAI: remove redundant FIXME
paulhdk Sep 12, 2024
c6534a5
LLMOpenAI: fix error handling in LLMOpenAISession+Configuration.swift
paulhdk Sep 12, 2024
5b47220
LLMOpenAI: introduce LLMOpenAISession+ResponseHandler.swift for handling
paulhdk Sep 12, 2024
e64fb9b
LLMOpenAI: refactor getChatMessage() to reduce its length
paulhdk Sep 12, 2024
50db906
LLMOpenAI: reorder LLMOpenAISession+Configuration.swift to fix
paulhdk Sep 12, 2024
1a6cc12
LLMOpenAI: move logger into global scope
paulhdk Sep 13, 2024
6dd0f95
LLMOpenAI: fix error handling in FunctionCalling
paulhdk Sep 13, 2024
630d1dc
TestApp: remove redundant OpenAPI spec + generator config
paulhdk Dec 16, 2024
9cba800
LLMOpenAI: address some remaining Swiftlint warnings
paulhdk Sep 13, 2024
02f796e
LLMOpenAI: add licence header to LLMOpenAIAuthMiddleware.swift
paulhdk Sep 13, 2024
e14f3d3
LLMOpenAI: add license header to openapi.yaml
paulhdk Sep 13, 2024
5021c84
LLMOpenAI: add license header to openapi-generator-config.yaml
paulhdk Sep 13, 2024
e82370d
LLMOpenAI: comments
paulhdk Sep 13, 2024
6741f14
LLMOpenAI: fix "functionCalls" assignment
paulhdk Sep 13, 2024
823bf78
Use generated type for function calling
paulhdk Oct 2, 2024
0ffcfd6
Refactor error handling
paulhdk Oct 2, 2024
0c437cc
Pass SpeziLLM tests
paulhdk Oct 4, 2024
6c7a89a
Move `LLMFunctionParameterPropertySchema` and `LLMFunctionParameterIt…
paulhdk Oct 4, 2024
b1b95b5
Remove redundant closing brackets
paulhdk Oct 4, 2024
95c2979
Swiftlint
paulhdk Oct 4, 2024
0a789e6
Fix tests
paulhdk Oct 4, 2024
0ad1776
Remove `OpenAI` dependency from `SpeziLLMOpenAI` in `Package.swift`
paulhdk Oct 28, 2024
48c3aab
Rename local var in `LLMFunctionParameterSchemaCollector.swift`
paulhdk Oct 28, 2024
6471784
Introduce `LLMOpenAIRequestType` type alias
paulhdk Oct 28, 2024
91b3f8d
LLMOpenAI: refactor generation call to OpenAI API
paulhdk Nov 4, 2024
e39d202
LLMOPenAI: Remove redundant `OpenAI` package imports
paulhdk Nov 5, 2024
3a994ee
SpeziLLM: remove unnecessary `type` property in `LLMContextEntity::To…
paulhdk Dec 11, 2024
7fea2b4
SpeziLLMOpenAI: add `public` modifier to convenience init
paulhdk Dec 11, 2024
fd524b0
SpeziLLMOpenAI: replace use of `_LLMFunctionParameterWrapper`
paulhdk Dec 13, 2024
80e133f
SpeziLLMOpenAI: Replace deprecated `Servers.server1()`
paulhdk Dec 13, 2024
5a19c35
LLMOpenAIParameterTests: don't initialise `modelType`'s `value1`
paulhdk Dec 13, 2024
13b7bd2
SpeziLLMOpenAI: remove trailing whitespace
paulhdk Dec 13, 2024
16644d0
SpeziLLMTests: set `LLMOpenAISchema`'s `value1` correctly
paulhdk Dec 13, 2024
3ded7a0
fixup! SpeziLLMOpenAI: replace use of `_LLMFunctionParameterWrapper`
paulhdk Dec 16, 2024
2a5269b
Xcode: remove dangling references from project.pbxproj
paulhdk Dec 16, 2024
271b255
SpeziLLMOpenAI: update `swiftlint:disable` syntax in LLMFunctionParam…
paulhdk Dec 21, 2024
55897f8
SpeziLLMOpenAI: add docs for `AuthMiddleware`
paulhdk Dec 21, 2024
10fcb63
SpeziLLMOpenAI: Move global logger into local scope
paulhdk Dec 21, 2024
37b2b60
Update Sources/SpeziLLMOpenAI/LLMOpenAISession+Configuration.swift
paulhdk Dec 21, 2024
42f9079
swiftlint
paulhdk Dec 21, 2024
11d0d12
Merge branch 'main' into generate-api-calls
philippzagar Jan 19, 2025
ec00d56
First compiling draft
philippzagar Feb 15, 2025
b357d2d
Merge branch 'main' into generate-api-calls
philippzagar Feb 15, 2025
064d56c
First working draft
philippzagar Feb 16, 2025
118505b
Fix testing code and docs
philippzagar Feb 16, 2025
fb02c7f
Fix tests and project file changes
philippzagar Feb 16, 2025
edc7582
fix tests again
philippzagar Feb 16, 2025
dcad43f
fix tests again again
philippzagar Feb 16, 2025
Use generated type for function calling
paulhdk committed Dec 16, 2024
commit 823bf7846f32e273dc8b6822da02548e40dd3b0c
@@ -7,7 +7,7 @@
//

import Foundation

import OpenAPIRuntime

paulhdk marked this conversation as resolved.
/// Represents an LLM function calling parameter.
///
@@ -43,5 +43,5 @@ import Foundation
/// }
/// ```
public protocol LLMFunctionParameter: Decodable {
static var schema: LLMFunctionParameterPropertySchema { get }
static var schema: Components.Schemas.FunctionParameters { get }
}
@@ -7,6 +7,7 @@
//

import Foundation
import OpenAPIRuntime


/// Represents an LLM function calling parameter in the form of an `array` element.
@@ -25,10 +26,16 @@ import Foundation
/// /// Manual conformance to `LLMFunctionParameterArrayElement` of a custom array item type.
/// struct CustomArrayItemType: LLMFunctionParameterArrayElement {
/// static let itemSchema: LLMFunctionParameterItemSchema = .init(
/// type: .object,
/// properties: [
/// "firstName": .init(type: .string, description: "The first name of the person"),
/// "lastName": .init(type: .string, description: "The last name of the person")
/// "type": "object",
/// "properties": [
/// "firstName": [
/// "type": "string",
///                 "description": "The first name of the person"
/// ],
Comment on lines +31 to +34 (Member):

I am wondering if we can add a nicely typed type for this instead of a dictionary; it can always map to a dictionary under the hood. Would be cool to avoid losing that type-safe element?

Contributor Author:

Previously, SpeziLLMOpenAI wrapped the Swift types provided by the OpenAI package, which were then eventually passed to the API. With the OpenAI OpenAPI spec, such types aren't generated; instead, the JSON schemas are validated for correctness as they're being encoded in the OpenAPIObjectContainer type.

Introducing such wrapper types again would require precise alignment with the OpenAI API, which, I could imagine, would make them harder to maintain over time. That may be one reason why the official OpenAI Python package, which is also generated from the OpenAI OpenAPI specification, does not offer wrapper types either, AFAICT.

What do you think?
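The validate-on-encode flow described in the comment above can be sketched roughly as follows. This is a minimal illustration, not code from this PR: it assumes the `OpenAPIObjectContainer` type and its throwing `init(unvalidatedValue:)` from the swift-openapi-runtime package, and the schema keys are only example values.

```swift
// Minimal sketch: building an untyped JSON-schema dictionary and wrapping it
// in OpenAPIRuntime's OpenAPIObjectContainer. The throwing
// `init(unvalidatedValue:)` checks that every value is JSON-representable,
// so schema validity is enforced at encode time rather than through
// generated wrapper types.
import OpenAPIRuntime

let schema: [String: any Sendable] = [
    "type": "object",
    "properties": [
        "firstName": ["type": "string", "description": "The first name of the person"]
    ]
]

do {
    let container = try OpenAPIObjectContainer(unvalidatedValue: schema)
    print("schema accepted: \(container.value.keys.sorted())")
} catch {
    // A value that cannot be represented as JSON (e.g. a closure) ends up here.
    print("invalid schema: \(error)")
}
```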

Member:

I think adding an extension initializer/function that takes well-typed arguments, if one wants to use them, would be beneficial and would avoid issues with string keys that are incorrect or malformatted. Still allowing a dictionary to be passed in might be an escape hatch we can provide. The OpenAPI surface is quite stable, and if we use e.g. an enum for the type of the parameter, it can also have an `other` case with an associated string value.

Member:

I most definitely agree with @PSchmiedmayer here; I would follow the typical Swift paradigm and provide as much type safety as possible.
As mentioned by @PSchmiedmayer, I would implement well-typed inits/functions etc. that then map to the underlying String dictionary. And yes, an escape hatch that passes the raw dictionary might be beneficial!
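A rough shape of that suggestion, purely illustrative: `SchemaType` and `ParameterSchema` are hypothetical names, not part of this PR, sketching a typed layer that maps to the raw string dictionary and keeps an `other` case as the escape hatch.

```swift
// Hypothetical sketch of the reviewers' suggestion: a small well-typed layer
// that maps to the underlying string dictionary the API expects.
enum SchemaType {
    case string, integer, number, boolean, array, object
    case other(String)  // escape hatch for type strings not modeled above

    var rawValue: String {
        switch self {
        case .string: return "string"
        case .integer: return "integer"
        case .number: return "number"
        case .boolean: return "boolean"
        case .array: return "array"
        case .object: return "object"
        case .other(let value): return value
        }
    }
}

struct ParameterSchema {
    var type: SchemaType
    var description: String?

    /// Maps the typed representation to the raw dictionary.
    func toDictionary() -> [String: any Sendable] {
        var result: [String: any Sendable] = ["type": type.rawValue]
        if let description { result["description"] = description }
        return result
    }
}
```

A call like `ParameterSchema(type: .other("integer"), description: nil)` would still cover anything the enum doesn't model, which is the escape hatch both reviewers mention.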

Contributor Author:

Just to be clear, and because I'm a little clueless about how to implement this easily: the only way I see here is to re-implement these two types from the MacPaw OpenAI package that SpeziLLM was using previously, and then revert the changes around the initialisers accordingly.

/// Alias of the OpenAI `JSONSchema/Property` type, describing properties within an object schema.
public typealias LLMFunctionParameterPropertySchema = ChatQuery.ChatCompletionToolParam.FunctionDefinition.FunctionParameters.Property
/// Alias of the OpenAI `JSONSchema/Item` type, describing array items within an array schema.
public typealias LLMFunctionParameterItemSchema = ChatQuery.ChatCompletionToolParam.FunctionDefinition.FunctionParameters.Property.Items

I was not able to find definitive documentation for the fields that the OpenAI API accepts here, including the ones that are currently supported by SpeziLLM, e.g., minItems, maxItems, uniqueItems.

The function calling documentation mentions none of them: https://platform.openai.com/docs/guides/function-calling

What do you think?

/// "lastName": [
/// "type": "string",
/// "description": "The last name of the person"
/// ]
/// ]
/// )
///
@@ -47,5 +54,5 @@ import Foundation
/// }
/// ```
public protocol LLMFunctionParameterArrayElement: Decodable {
static var itemSchema: LLMFunctionParameterItemSchema { get }
static var itemSchema: Components.Schemas.FunctionParameters { get }
}
@@ -13,7 +13,7 @@ import OpenAPIRuntime
///
/// Conformance of ``LLMFunction/Parameter`` to `LLMFunctionParameterSchemaCollector` can be found in the declaration of the ``LLMFunction/Parameter``.
protocol LLMFunctionParameterSchemaCollector {
var schema: LLMFunctionParameterPropertySchema { get }
var schema: Components.Schemas.FunctionParameters { get }
}


@@ -24,7 +24,7 @@ extension LLMFunction {
}

/// Aggregates the individual parameter schemas of all ``LLMFunction/Parameter``s and combines them into the complete parameter schema of the ``LLMFunction``.
var schema: LLMFunctionParameterSchema {
var schema: Components.Schemas.FunctionParameters {
let requiredPropertyNames = Array(
parameterValueCollectors
.filter {
@@ -35,7 +35,7 @@ extension LLMFunction {

let properties = schemaValueCollectors.compactMapValues { $0.schema }

var ret: LLMFunctionParameterSchema = .init()
var ret: Components.Schemas.FunctionParameters = .init()
do {
ret.additionalProperties = try .init(unvalidatedValue: [
"type": "object",
@@ -33,39 +33,26 @@ extension _LLMFunctionParameterWrapper where T: AnyArray, T.Element: BinaryInteg
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description)
]
var itemsNoOpt: [String: any Sendable] = [
"type": "integer"
]
if let const = const.map({ String($0) }) {
itemsNoOpt["const"] = const
}
if let multipleOf {
itemsNoOpt["multipleOf"] = multipleOf
}
if let minimum {
itemsNoOpt["minimum"] = Double(minimum)
}
if let maximum {
itemsNoOpt["maximum"] = Double(maximum)
}
if itemsNoOpt.count > 1 {
addProp["items"] = itemsNoOpt
}
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"items": [
"type": "integer",
"const": const.map { String($0) } as Any?,
"multipleOf": multipleOf as Any?,
"minimum": minimum.map { Double($0) } as Any?,
"maximum": maximum.map { Double($0) } as Any?
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems as Any?
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("LLMFunctionParameterWrapper+ArrayTypes")
Member:

Probably a more descriptive error message would be beneficial here.
In general, I think we shouldn't just ignore this error! (How does the current procedure play out down the road when OpenAI function-calling parameters are decoded and injected into the Swift DSL?)
The downside is a potentially throwing init, which we definitely don't want here.

Contributor Author:

Once we've worked out #64 (comment), we can probably apply the same solution here, I hope.

self.init(description: "")
@@ -94,36 +81,25 @@ extension _LLMFunctionParameterWrapper where T: AnyArray, T.Element: BinaryFloat
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
"type": "array",
"description": String(description)
]
var itemsNoOpt: [String: any Sendable] = [
"type": "number"
]
if let const = const.map({ String($0) }) {
itemsNoOpt["const"] = const
}
if let minimum {
itemsNoOpt["minimum"] = Double(minimum)
}
if let maximum {
itemsNoOpt["maximum"] = Double(maximum)
}
if itemsNoOpt.count > 1 {
addProp["items"] = itemsNoOpt
}
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description),
"items": [
"type": "number",
"const": const.map { String($0) } as Any?,
"minimum": minimum.map { Double($0) } as Any?,
"maximum": maximum.map { Double($0) } as Any?
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems as Any?
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("SpeziLLMOpenAI - initialization error - LMMFunctionParameter+ArrayTypes")
Member:

Same feedback applies everywhere

self.init(description: "")
@@ -148,30 +124,23 @@ extension _LLMFunctionParameterWrapper where T: AnyArray, T.Element == Bool {
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description)
]
var itemsNoOpt: [String: any Sendable] = [
"type": "boolean"
]
if let const = const.map({ String($0) }) {
itemsNoOpt["const"] = const
}
if itemsNoOpt.count > 1 {
addProp["items"] = itemsNoOpt
}
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"items": [
"type": "boolean",
"const": const.map { String($0) } as Any?
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems as Any?
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+ArrayTypes")
self.init(description: "")
@@ -200,36 +169,25 @@ extension _LLMFunctionParameterWrapper where T: AnyArray, T.Element: StringProto
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description)
]
var itemsNoOpt: [String: any Sendable] = [
"type": "string"
]
if let pattern = pattern.map({ String($0) }) {
itemsNoOpt["pattern"] = pattern
}
if let const = const.map({ String($0) }) {
itemsNoOpt["const"] = const
}
if let `enum` = `enum`.map({ $0.map { String($0) } }) {
itemsNoOpt["const"] = `enum`
}
if itemsNoOpt.count > 1 {
addProp["items"] = itemsNoOpt
}
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"items": [
"type": "string",
"pattern": pattern.map { String($0) } as Any?,
"const": const.map { String($0) } as Any?,
"enum": `enum`.map { $0.map { String($0) } } as Any?
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems as Any?
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+ArrayTypes")
self.init(description: "")
@@ -26,144 +26,39 @@ extension _LLMFunctionParameterWrapper where T: AnyArray, T.Element: LLMFunction
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
let itemSchema = T.Element.itemSchema.additionalProperties.value
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description)
]
var itemNonOpt: [String: any Sendable] = [
"type": T.Element.itemSchema.type.rawValue
]
if let properties = T.Element.itemSchema.properties?.mapValues({ $0.toDict() }) {
itemNonOpt["properties"] = properties
}
if let pattern = T.Element.itemSchema.pattern {
itemNonOpt["pattern"] = pattern
}
if let const = T.Element.itemSchema.const {
itemNonOpt["const"] = const
}
if let `enum` = T.Element.itemSchema.enum {
itemNonOpt["enum"] = `enum`
}
if let multipleOf = T.Element.itemSchema.multipleOf {
itemNonOpt["multipleOf"] = multipleOf
}
if let minimum = T.Element.itemSchema.minimum {
itemNonOpt["minimum"] = minimum
}
if let maximum = T.Element.itemSchema.maximum {
itemNonOpt["maximum"] = maximum
}
addProp["items"] = itemNonOpt
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"items": [
"type": itemSchema["type"],
"properties": itemSchema["properties"],
"pattern": itemSchema["pattern"],
"const": itemSchema["const"],
"enum": itemSchema["enum"],
"multipleOf": itemSchema["multipleOf"],
"minimum": itemSchema["minimum"],
"maximum": itemSchema["maximum"]
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems as Any?
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("Couldn't create FunctionParameterWrapper+CustomType \(error)")
self.init(description: "")
}
}
}

// FIXME: This should probably be made redundant as part of bigger simplification for initialising the wrappers
extension ChatQuery.ChatCompletionToolParam.FunctionDefinition.FunctionParameters.Property {
public func toDict() -> [String: any Sendable] {
var res: [String: any Sendable] = [
"type": Self.JSONType.string.rawValue
]
if let description {
res["description"] = description
}
if let format {
res["format"] = format
}
if let items {
res["items"] = items.toDict()
}
if let required {
res["required"] = required
}
if let pattern {
res["pattern"] = pattern
}
if let const {
res["const"] = const
}
if let `enum` {
res["enum"] = `enum`
}
if let multipleOf {
res["multipleOf"] = multipleOf
}
if let minimum {
res["minimum"] = minimum
}
if let maximum {
res["maximum"] = maximum
}
if let minItems {
res["minItems"] = minItems
}
if let maxItems {
res["maxItems"] = maxItems
}
if let uniqueItems {
res["uniqueItems"] = uniqueItems
}
return res
}
}

// FIXME: This should probably be made redundant as part of bigger simplification for initialising the wrappers
extension ChatQuery.ChatCompletionToolParam.FunctionDefinition.FunctionParameters.Property.Items {
public func toDict() -> [String: any Sendable] {
var res: [String: any Sendable] = [
"type": Self.JSONType.string.rawValue
]
if let properties = properties?.mapValues({ $0.toDict() }) {
res["properties"] = properties
}
if let pattern {
res["pattern"] = pattern
}
if let const {
res["const"] = const
}
if let `enum` {
res["enum"] = `enum`
}
if let multipleOf {
res["multipleOf"] = multipleOf
}
if let minimum {
res["minimum"] = minimum
}
if let maximum {
res["maximum"] = maximum
}
if let minItems {
res["minItems"] = minItems
}
if let maxItems {
res["maxItems"] = maxItems
}
if let uniqueItems {
res["uniqueItems"] = uniqueItems
}
return res
}
}

extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: AnyArray,
T.Wrapped.Element: LLMFunctionParameterArrayElement {
T.Wrapped.Element: LLMFunctionParameterArrayElement {
/// Declares an optional ``LLMFunctionParameterArrayElement``-based (custom type) ``LLMFunction/Parameter`` `array`.
///
/// - Parameters:
@@ -178,47 +73,30 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: AnyArray
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
let itemSchema = T.Wrapped.Element.itemSchema.additionalProperties.value
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description)
]
var itemNonOpt: [String: any Sendable] = [
"type": T.Wrapped.Element.itemSchema.type.rawValue
]

if let properties = T.Wrapped.Element.itemSchema.properties?.mapValues({ $0.toDict() }) {
itemNonOpt["properties"] = properties
}
if let pattern = T.Wrapped.Element.itemSchema.pattern {
itemNonOpt["pattern"] = pattern
}
if let const = T.Wrapped.Element.itemSchema.const {
itemNonOpt["const"] = const
}
if let `enum` = T.Wrapped.Element.itemSchema.enum {
itemNonOpt["enum"] = `enum`
}
if let multipleOf = T.Wrapped.Element.itemSchema.multipleOf {
itemNonOpt["multipleOf"] = multipleOf
}
if let minimum = T.Wrapped.Element.itemSchema.minimum {
itemNonOpt["minimum"] = minimum
}
if let maximum = T.Wrapped.Element.itemSchema.maximum {
itemNonOpt["maximum"] = maximum
}
addProp["items"] = itemNonOpt
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"items": [
"type": itemSchema["type"],
"properties": itemSchema["properties"],
"pattern": itemSchema["pattern"],
"const": itemSchema["const"],
"enum": itemSchema["enum"],
"multipleOf": itemSchema["multipleOf"],
"minimum": itemSchema["minimum"],
"maximum": itemSchema["maximum"]
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems as Any?
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("Couldn't create LLMFunctionParameterWrapper+CustomTypes")
self.init(description: "")
@@ -11,7 +11,8 @@ import SpeziFoundation
// swiftlint:disable discouraged_optional_boolean

extension _LLMFunctionParameterWrapper where T: LLMFunctionParameterEnum, T.RawValue: StringProtocol {
/// Declares an `enum`-based ``LLMFunction/Parameter`` defining all options of a text-based parameter of the ``LLMFunction``.
/// Declares an `enum`-based ``LLMFunction/Parameter`` defining all options of a text-based parameter of the
/// ``LLMFunction``.
///
/// - Parameters:
/// - description: Describes the purpose of the parameter, used by the LLM to grasp the purpose of the parameter.
@@ -21,26 +22,26 @@ extension _LLMFunctionParameterWrapper where T: LLMFunctionParameterEnum, T.RawV
const: (any StringProtocol)? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "string",
"description": String(description)
]
if let const {
addProp["const"] = String(const)
}
addProp["enum"] = T.allCases.map { String($0.rawValue) }
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"const": const.map { String($0) } as Any?,
"enum": T.allCases.map { String($0.rawValue) }
].compactMapValues { $0 })))
} catch {
logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum")
logger
.error(
"SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum \(error.localizedDescription)"
)
self.init(description: "")
}
}
}

extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: LLMFunctionParameterEnum,
T.Wrapped.RawValue: StringProtocol {
/// Declares an optional `enum`-based ``LLMFunction/Parameter`` defining all options of a text-based parameter of the ``LLMFunction``.
/// Declares an optional `enum`-based ``LLMFunction/Parameter`` defining all options of a text-based parameter of
/// the ``LLMFunction``.
///
/// - Parameters:
/// - description: Describes the purpose of the parameter, used by the LLM to grasp the purpose of the parameter.
@@ -50,26 +51,26 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: LLMFunct
const: (any StringProtocol)? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "string",
"description": String(description)
]
if let const {
addProp["const"] = String(const)
}
addProp["enum"] = T.Wrapped.allCases.map { String($0.rawValue) }
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"const": const.map { String($0) } as Any?,
"enum": T.Wrapped.allCases.map { String($0.rawValue) }
].compactMapValues { $0 })))
} catch {
logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum")
logger
.error(
"SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum \(error.localizedDescription)"
)
self.init(description: "")
}
}
}

extension _LLMFunctionParameterWrapper where T: AnyArray, T.Element: LLMFunctionParameterEnum,
T.Element.RawValue: StringProtocol {
/// Declares an `enum`-based ``LLMFunction/Parameter`` `array`. An individual `array` element defines all options of a text-based parameter of the ``LLMFunction``.
/// Declares an `enum`-based ``LLMFunction/Parameter`` `array`. An individual `array` element defines all options of
/// a text-based parameter of the ``LLMFunction``.
///
/// - Parameters:
/// - description: Describes the purpose of the parameter, used by the LLM to grasp the purpose of the parameter.
@@ -85,41 +86,40 @@ extension _LLMFunctionParameterWrapper where T: AnyArray, T.Element: LLMFunction
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description)
]
var itemNonOpt: [String: any Sendable] = [
"type": "string"
]
if let const {
itemNonOpt["const"] = String(const)
}
itemNonOpt["enum"] = T.Element.allCases.map { String($0.rawValue) }
addProp["items"] = itemNonOpt
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"items": [
"type": "string",
"const": const.map { String($0) } as Any?,
"enum": T.Element.allCases.map { String($0.rawValue) }
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum")
logger
.error(
"SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum \(error.localizedDescription)"
)
self.init(description: "")
}
}
}

extension _LLMFunctionParameterWrapper where T: AnyOptional,
T.Wrapped: AnyArray,
T.Wrapped.Element: LLMFunctionParameterEnum,
T.Wrapped.Element.RawValue: StringProtocol {
/// Declares an optional `enum`-based ``LLMFunction/Parameter`` `array`. An individual `array` element defines all options of a text-based parameter of the ``LLMFunction``.
T.Wrapped: AnyArray,
T.Wrapped.Element: LLMFunctionParameterEnum,
T.Wrapped.Element.RawValue: StringProtocol {
/// Declares an optional `enum`-based ``LLMFunction/Parameter`` `array`. An individual `array` element defines all
/// options of a text-based parameter of the ``LLMFunction``.
///
/// - Parameters:
/// - description: Describes the purpose of the parameter, used by the LLM to grasp the purpose of the parameter.
@@ -135,31 +135,29 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional,
uniqueItems: Bool? = nil
) {
do {
// FIXME: How can this be simplified?
var addProp: [String: any Sendable] = [
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
"type": "array",
"description": String(description)
]
var itemNonOpt: [String: any Sendable] = [
"type": "string"
]
if let const {
itemNonOpt["const"] = String(const)
}
itemNonOpt["enum"] = T.Wrapped.Element.allCases.map { String($0.rawValue) }
addProp["items"] = itemNonOpt
if let minItems {
addProp["minItems"] = minItems
}
if let maxItems {
addProp["maxItems"] = maxItems
}
if let uniqueItems {
addProp["uniqueItems"] = uniqueItems
}
try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
"description": String(description),
"items": [
"type": "string",
"const": const.map { String($0) } as Any?,
"enum": T.Wrapped.Element.allCases.map { String($0.rawValue) }
].compactMapValues { $0 },
"minItems": minItems as Any?,
"maxItems": maxItems as Any?,
"uniqueItems": uniqueItems as Any?
].compactMapValues { $0 }
.filter { _, value in if let dict = value as? [String: Any] {
dict.count > 1
} else {
true
}
})))
} catch {
logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum")
logger
.error(
"SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+Enum \(error.localizedDescription)"
)
self.init(description: "")
}
}
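For readers unfamiliar with SpeziLLM's function-calling DSL, here is a short sketch of how a wrapper initializer like the one above is typically exercised through the `@Parameter` property wrapper. The function, enum, and values below are illustrative assumptions, not part of this PR:

```swift
import SpeziLLMOpenAI

// Hypothetical example function; names and values are illustrative only.
struct FavoriteColorFunction: LLMFunction {
    enum Color: String, LLMFunctionParameterEnum {
        case red, green, blue
    }

    static let name = "favorite_colors"
    static let description = "Records the user's favorite colors."

    // Optional enum-based array parameter: the initializer above serializes
    // this into a JSON Schema `array` of `string` with an `enum` constraint.
    @Parameter(description: "The favorite colors, if known.", uniqueItems: true)
    var colors: [Color]?

    func execute() async throws -> String? {
        "Noted: \(colors?.map(\.rawValue).joined(separator: ", ") ?? "none")"
    }
}
```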
@@ -27,24 +27,14 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: BinaryIn
         maximum: T.Wrapped? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "integer",
-                "description": String(description)
-            ]
-            if let const {
-                addProp["const"] = String(const)
-            }
-            if let multipleOf {
-                addProp["multipleOf "] = multipleOf
-            }
-            if let minimum {
-                addProp["minimum"] = Double(minimum)
-            }
-            if let maximum {
-                addProp["maximum"] = Double(maximum)
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "integer",
+                "description": String(description),
+                "const": const.map { String($0) } as Any?,
+                "multipleOf": multipleOf as Any?,
+                "minimum": minimum.map { Double($0) } as Any?,
+                "maximum": maximum.map { Double($0) } as Any?
+            ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionparaemter+OptionalType")
             self.init(description: "")
@@ -67,21 +57,13 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: BinaryFl
         maximum: T.Wrapped? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "number",
-                "description": String(description)
-            ]
-            if let const {
-                addProp["const"] = String(const)
-            }
-            if let minimum {
-                addProp["minimum"] = Double(minimum)
-            }
-            if let maximum {
-                addProp["maximum"] = Double(maximum)
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "number",
+                "description": String(description),
+                "const": const.map { String($0) } as Any?,
+                "minimum": minimum.map { Double($0) } as Any?,
+                "maximum": maximum.map { Double($0) } as Any?
+            ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+OptionalType")
             self.init(description: "")
@@ -100,15 +82,11 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped == Bool {
         const: (any StringProtocol)? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "boolean",
-                "description": String(description)
-            ]
-            if let const {
-                addProp["const"] = String(const)
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "boolean",
+                "description": String(description),
+                "const": const.map { String($0) } as Any?
+            ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionalParameterWrapper+OptionalTypes")
             self.init(description: "")
@@ -133,24 +111,14 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: StringPr
         enum: [any StringProtocol]? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "string",
-                "description": String(description)
-            ]
-            if let format {
-                addProp["format"] = format.rawValue
-            }
-            if let pattern {
-                addProp["pattern"] = String(pattern)
-            }
-            if let const {
-                addProp["const"] = String(const)
-            }
-            if let `enum` {
-                addProp["enum"] = `enum`.map { $0.map { String($0) } }
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "string",
+                "description": String(description),
+                "format": format?.rawValue as Any?,
+                "pattern": pattern.map { String($0) } as Any?,
+                "const": const.map { String($0) } as Any?,
+                "enum": `enum`.map { $0.map { String($0) as Any? } }
+            ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+OptionalTypes")
             self.init(description: "")
@@ -179,39 +147,29 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: AnyArray
         maximum: T.Wrapped.Element? = nil,
         minItems: Int? = nil,
         maxItems: Int? = nil,
-        uniqueItems _: Bool? = nil
+        uniqueItems: Bool? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "array",
-                "description": String(description)
-            ]
-            var itemNonOpt: [String: any Sendable] = [
-                "type": "integer"
-            ]
-            if let const {
-                itemNonOpt["const"] = String(const)
-            }
-            if let multipleOf {
-                itemNonOpt["multipleOf"] = String(multipleOf)
-            }
-            if let minimum {
-                itemNonOpt["minimum"] = Double(minimum)
-            }
-            if let maximum {
-                itemNonOpt["maximum"] = Double(maximum)
-            }
-            if itemNonOpt.count > 1 {
-                addProp["items"] = itemNonOpt
-            }
-            if let minItems {
-                addProp["minItems"] = minItems
-            }
-            if let maxItems {
-                addProp["maxItems"] = maxItems
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "array",
+                "description": String(description),
+                "items": [
+                    "type": "integer",
+                    "const": const.map { String($0) } as Any?,
+                    "multipleOf": multipleOf.map { Int($0) } as Any?,
+                    "minimum": minimum.map { Double($0) } as Any?,
+                    "maximum": maximum.map { Double($0) } as Any?
+                ].compactMapValues { $0 },
+                "minItems": minItems as Any?,
+                "maxItems": maxItems as Any?,
+                "uniqueItems": uniqueItems as Any?
+            ].compactMapValues { $0 }
+            .filter { _, value in if let dict = value as? [String: Any] {
+                dict.count > 1
+            } else {
+                true
+            }
+            })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionPropertyWrapper+OptionalType")
             self.init(description: "")
@@ -241,36 +199,25 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: AnyArray
         uniqueItems: Bool? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "array",
-                "description": String(description)
-            ]
-            var itemNonOpt: [String: any Sendable] = [
-                "type": "number"
-            ]
-            if let const {
-                itemNonOpt["const"] = String(const)
-            }
-            if let minimum {
-                itemNonOpt["minimum"] = Double(minimum)
-            }
-            if let maximum {
-                itemNonOpt["maximum"] = Double(maximum)
-            }
-            if itemNonOpt.count > 1 {
-                addProp["items"] = itemNonOpt
-            }
-            if let minItems {
-                addProp["minItems"] = minItems
-            }
-            if let maxItems {
-                addProp["maxItems"] = maxItems
-            }
-            if let uniqueItems {
-                addProp["uniqueItems"] = uniqueItems
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "array",
+                "description": String(description),
+                "items": [
+                    "type": "number",
+                    "const": const.map { String($0) } as Any?,
+                    "minimum": minimum.map { Double($0) } as Any?,
+                    "maximum": maximum.map { Double($0) } as Any?
+                ].compactMapValues { $0 },
+                "minItems": minItems as Any?,
+                "maxItems": maxItems as Any?,
+                "uniqueItems": uniqueItems as Any?
+            ].compactMapValues { $0 }
+            .filter { _, value in if let dict = value as? [String: Any] {
+                dict.count > 1
+            } else {
+                true
+            }
+            })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+OptionalTypes")
             self.init(description: "")
@@ -295,30 +242,23 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: AnyArray
         uniqueItems: Bool? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "array",
-                "description": String(description)
-            ]
-            var itemNonOpt: [String: any Sendable] = [
-                "type": "boolean"
-            ]
-            if let const {
-                itemNonOpt["const"] = String(const)
-            }
-            if itemNonOpt.count > 1 {
-                addProp["items"] = itemNonOpt
-            }
-            if let minItems {
-                addProp["minItems"] = minItems
-            }
-            if let maxItems {
-                addProp["maxItems"] = maxItems
-            }
-            if let uniqueItems {
-                addProp["uniqueItems"] = uniqueItems
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "array",
+                "description": String(description),
+                "items": [
+                    "type": "boolean",
+                    "const": const.map { String($0) } as Any?
+                ].compactMapValues { $0 },
+                "minItems": minItems as Any?,
+                "maxItems": maxItems as Any?,
+                "uniqueItems": uniqueItems as Any?
+            ].compactMapValues { $0 }
+            .filter { _, value in if let dict = value as? [String: Any] {
+                dict.count > 1
+            } else {
+                true
+            }
+            })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+OptionalTypes.swift")
             self.init(description: "")
@@ -348,36 +288,25 @@ extension _LLMFunctionParameterWrapper where T: AnyOptional, T.Wrapped: AnyArray
         uniqueItems: Bool? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "array",
-                "description": String(description)
-            ]
-            var itemNonOpt: [String: any Sendable] = [
-                "type": "string"
-            ]
-            if let pattern {
-                itemNonOpt["pattern"] = String(pattern)
-            }
-            if let const {
-                itemNonOpt["const"] = String(const)
-            }
-            if let `enum` {
-                itemNonOpt["enum"] = `enum`.map { $0.map { String($0) } }
-            }
-            if itemNonOpt.count > 1 {
-                addProp["items"] = itemNonOpt
-            }
-            if let minItems {
-                addProp["minItems"] = minItems
-            }
-            if let maxItems {
-                addProp["maxItems"] = maxItems
-            }
-            if let uniqueItems {
-                addProp["uniqueItems"] = uniqueItems
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "array",
+                "description": String(description),
+                "items": [
+                    "type": "string",
+                    "pattern": pattern.map { String($0) } as Any?,
+                    "const": const.map { String($0) } as Any?,
+                    "enum": `enum`.map { $0.map { String($0) } } as Any?
+                ].compactMapValues { $0 },
+                "minItems": minItems as Any?,
+                "maxItems": maxItems as Any?,
+                "uniqueItems": uniqueItems as Any?
+            ].compactMapValues { $0 }
+            .filter { _, value in if let dict = value as? [String: Any] {
+                dict.count > 1
+            } else {
+                true
+            }
+            })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+OptionalType")
             self.init(description: "")
@@ -25,34 +25,25 @@ extension _LLMFunctionParameterWrapper where T: BinaryInteger {
         maximum: T? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "integer",
-                "description": String(description)
-            ]
-            if let const {
-                addProp["const"] = const.map { String($0) }
-            }
-            if let multipleOf {
-                addProp["multipleOf"] = multipleOf
-            }
-            if let minimum {
-                addProp["minimum"] = Double(minimum)
-            }
-            if let maximum {
-                addProp["maximum"] = Double(maximum)
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties:
+                .init(unvalidatedValue: [
+                    "type": "integer",
+                    "description": String(description),
+                    "const": const.map { String($0) } as Any?,
+                    "multipleOf": multipleOf as Any?,
+                    "minimum": minimum.map { Double($0) } as Any?,
+                    "maximum": maximum.map { Double($0) } as Any?
+                ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameter+PrimitveTypes")
             self.init(description: "")
}
}
}


extension _LLMFunctionParameterWrapper where T: BinaryFloatingPoint {
-    /// Declares an ``LLMFunction/Parameter`` of the type `Float` or `Double` (`BinaryFloatingPoint`) defining a floating-point parameter of the ``LLMFunction``.
+    /// Declares an ``LLMFunction/Parameter`` of the type `Float` or `Double` (`BinaryFloatingPoint`) defining a
+    /// floating-point parameter of the ``LLMFunction``.
///
/// - Parameters:
/// - description: Describes the purpose of the parameter, used by the LLM to grasp the purpose of the parameter.
@@ -66,21 +57,13 @@ extension _LLMFunctionParameterWrapper where T: BinaryFloatingPoint {
         maximum: T? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "number",
-                "description": String(description)
-            ]
-            if let const {
-                addProp["const"] = String(const)
-            }
-            if let minimum {
-                addProp["minimum"] = Double(minimum)
-            }
-            if let maximum {
-                addProp["maximum"] = Double(maximum)
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "number",
+                "description": String(description),
+                "const": const.map { String($0) } as Any?,
+                "minimum": minimum.map { Double($0) } as Any?,
+                "maximum": maximum.map { Double($0) } as Any?
+            ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+PrimitveTypes")
             self.init(description: "")
@@ -99,15 +82,12 @@ extension _LLMFunctionParameterWrapper where T == Bool {
         const: (any StringProtocol)? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "boolean",
-                "description": String(description)
-            ]
-            if let const {
-                addProp["const"] = String(const)
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue:
+                [
+                    "type": "boolean",
+                    "description": String(description),
+                    "const": const.map { String($0) } as Any?
+                ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+PrimiteveTypes")
             self.init(description: "")
@@ -116,11 +96,13 @@ extension _LLMFunctionParameterWrapper where T == Bool {
}

extension _LLMFunctionParameterWrapper where T: StringProtocol {
-    /// Declares an ``LLMFunction/Parameter`` of the type `String` defining a text-based parameter of the ``LLMFunction``.
+    /// Declares an ``LLMFunction/Parameter`` of the type `String` defining a text-based parameter of the
+    /// ``LLMFunction``.
     ///
     /// - Parameters:
     ///    - description: Describes the purpose of the parameter, used by the LLM to grasp the purpose of the parameter.
-    ///    - format: Defines a required format of the parameter, allowing interoperable semantic validation of the value.
+    ///    - format: Defines a required format of the parameter, allowing interoperable semantic validation of the
+    ///    value.
/// - pattern: A Regular Expression that the parameter needs to conform to.
/// - const: Specifies the constant `String`-based value of a certain parameter.
/// - enum: Defines all cases of the `String` parameter.
@@ -132,24 +114,14 @@ extension _LLMFunctionParameterWrapper where T: StringProtocol {
         enum: [any StringProtocol]? = nil
     ) {
         do {
-            // FIXME: How can this be simplified?
-            var addProp: [String: any Sendable] = [
-                "type": "string",
-                "description": String(description)
-            ]
-            if let const {
-                addProp["const"] = String(const)
-            }
-            if let format {
-                addProp["format"] = format.rawValue
-            }
-            if let pattern {
-                addProp["pattern"] = String(pattern)
-            }
-            if let `enum` {
-                addProp["enum"] = `enum`.map { $0.map { String($0) } }
-            }
-            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: addProp)))
+            try self.init(schema: .init(additionalProperties: .init(unvalidatedValue: [
+                "type": "string",
+                "description": String(description),
+                "format": format?.rawValue as Any?,
+                "pattern": pattern.map { String($0) } as Any?,
+                "const": const.map { String($0) } as Any?,
+                "enum": `enum`.map { $0.map { String($0) } } as Any?
+            ].compactMapValues { $0 })))
         } catch {
             logger.error("SpeziLLMOpenAI - initialization error - LLMFunctionParameterWrapper+PrimitiveTypes")
             self.init(description: "")
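As a usage sketch, the non-optional `String` initializer above is reached through the `@Parameter` property wrapper. The function below is hypothetical and not part of this diff; names and the `enum` values are illustrative assumptions:

```swift
import SpeziLLMOpenAI

// Hypothetical declaration, for illustration only.
struct GreetFunction: LLMFunction {
    static let name = "greet"
    static let description = "Greets a person in a given language."

    // Serialized by the initializer above into a schema along the lines of
    // {"type": "string", "description": "...", "enum": ["en", "de", "fr"]}.
    @Parameter(description: "The language to greet in.", enum: ["en", "de", "fr"])
    var language: String

    func execute() async throws -> String? {
        "Greeting in \(language)"
    }
}
```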
@@ -9,23 +9,16 @@
import OpenAI
import OpenAPIRuntime

-/// Alias of the OpenAI `JSONSchema/Property` type, describing properties within an object schema.
+public typealias LLMFunctionParameterPropertySchema = Components.Schemas.FunctionParameters
+// FIXME: LLMFunctionParameterItemSchema does not use a generated type yet
 
 /// Alias of the OpenAI `JSONSchema/Item` type, describing array items within an array schema.
 public typealias LLMFunctionParameterItemSchema = ChatQuery.ChatCompletionToolParam.FunctionDefinition
     .FunctionParameters.Property.Items

// swiftlint:disable type_name
/// Refer to the documentation of ``LLMFunction/Parameter`` for information on how to use the `@Parameter` property wrapper.
@propertyWrapper
public class _LLMFunctionParameterWrapper<T: Decodable>: LLMFunctionParameterSchemaCollector {
// swiftlint:enable type_name
private var injectedValue: T?
-    var schema: LLMFunctionParameterPropertySchema
+    var schema: Components.Schemas.FunctionParameters
public var wrappedValue: T {
// If the unwrapped injectedValue is not nil, return the non-nil value
if let value = injectedValue {
@@ -58,7 +51,7 @@ public class _LLMFunctionParameterWrapper<T: Decodable>: LLMFunctionParameterSch
self.init(schema: T.schema)
}

-    init(schema: LLMFunctionParameterPropertySchema) {
+    init(schema: Components.Schemas.FunctionParameters) {
self.schema = schema
}

@@ -97,7 +90,3 @@ extension LLMFunction {
public typealias Parameter<WrappedValue> =
_LLMFunctionParameterWrapper<WrappedValue> where WrappedValue: Decodable
}


-/// Ensuring `Sendable` conformances of ``LLMFunctionParameterPropertySchema`` and ``LLMFunctionParameterItemSchema``
-extension LLMFunctionParameterItemSchema: @unchecked Sendable {}
@@ -6,20 +6,32 @@
// SPDX-License-Identifier: MIT
//

+import OpenAPIRuntime
 import SpeziLLMOpenAI
 
 
 struct LLMOpenAIFunctionPerson: LLMFunction {
     struct CustomArrayItemType: LLMFunctionParameterArrayElement {
-        static let itemSchema: LLMFunctionParameterItemSchema = .init(
-            type: .object,
-            properties: [
-                "firstName": .init(type: .string, description: "The first name of the person"),
-                "lastName": .init(type: .string, description: "The last name of the person")
-            ]
-        )
+        static let itemSchema: Components.Schemas.FunctionParameters = {
+            do {
+                return try Components.Schemas.FunctionParameters(additionalProperties: .init(unvalidatedValue: [
+                    "type": "object",
+                    "properties": [
+                        "firstName": [
+                            "type": "string",
+                            "description": "The first name of the person"
+                        ],
+                        "lastName": [
+                            "type": "string",
+                            "description": "The last name of the person"
+                        ]
+                    ]
+                ]))
+            } catch {
+                fatalError("Couldn't create function parameters for testing")
+            }
+        }()

let firstName: String
let lastName: String
}
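To round out the picture, here is a hedged sketch of how a custom array element type like `CustomArrayItemType` above is typically consumed: its `itemSchema` becomes the `items` schema of an array parameter. The parameter name and description below are illustrative, not taken from this PR:

```swift
// Illustrative only: an array parameter whose element type supplies the
// per-item JSON Schema via LLMFunctionParameterArrayElement.itemSchema.
@Parameter(description: "The persons to look up.")
var persons: [CustomArrayItemType]
```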