@guardrails-ai/core / Exports / History / CallInputs
# Class: History.CallInputs

## Hierarchy

- Inputs

  ↳ **CallInputs**
## Table of contents

### Properties

- _args
- _fullSchemaReask
- _instructions
- _kwargs
- _llmOutput
- _metadata
- _msgHistory
- _numReasks
- _prompt
- _promptParams

### Accessors

- args
- fullSchemaReask
- instructions
- kwargs
- llmOutput
- metadata
- msgHistory
- numReasks
- prompt
- promptParams
## Constructors

### constructor

• new CallInputs(llmOutput?, instructions?, prompt?, msgHistory?, promptParams?, numReasks?, metadata?, fullSchemaReask?, args?, kwargs?): CallInputs
#### Parameters

Name | Type |
---|---|
llmOutput? | string |
instructions? | string |
prompt? | string |
msgHistory? | Dictionary[] |
promptParams? | Dictionary |
numReasks? | number |
metadata? | Dictionary |
fullSchemaReask? | boolean |
args? | any[] |
kwargs? | Dictionary |
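Since every constructor parameter is optional and positional, calls typically pass `undefined` for skipped positions. The sketch below mirrors the documented constructor shape with a local stand-in class (not the real `@guardrails-ai/core` export), purely for illustration:

```typescript
// Illustrative stand-in mirroring the documented CallInputs constructor
// shape; the real class lives in @guardrails-ai/core.
type Dictionary = Record<string, any>;

class CallInputsSketch {
  constructor(
    public llmOutput?: string,
    public instructions?: string,
    public prompt?: string,
    public msgHistory?: Dictionary[],
    public promptParams?: Dictionary,
    public numReasks?: number,
    public metadata?: Dictionary,
    public fullSchemaReask?: boolean,
    public args?: any[],
    public kwargs?: Dictionary
  ) {}
}

// All parameters are optional; pass undefined for positions you skip.
const inputs = new CallInputsSketch(
  undefined,                          // llmOutput
  "You are a helpful assistant.",     // instructions
  "Summarize: ${document}",           // prompt
  undefined,                          // msgHistory
  { document: "..." },                // promptParams
  2,                                  // numReasks
  { requestId: "abc-123" },           // metadata
  true                                // fullSchemaReask
);

console.log(inputs.numReasks); // 2
```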
## Properties

### _args

• Private Optional _args: any[]

Additional arguments for the LLM as provided by the user.

### _fullSchemaReask

• Protected Optional _fullSchemaReask: boolean

Whether to perform reasks across the entire schema or at the field level.

### _instructions

• Protected Optional _instructions: string

The instructions for chat model calls.

### _kwargs

• Private Optional _kwargs: Dictionary

Additional keyword arguments for the LLM as provided by the user.

### _llmOutput

• Protected Optional _llmOutput: string

The string output from an external LLM call provided by the user via Guard.parse.

### _metadata

• Protected Optional _metadata: Dictionary

The metadata provided by the user to be used during validation.

### _msgHistory

• Protected Optional _msgHistory: Dictionary[]

The message history provided by the user for chat model calls.

### _numReasks

• Protected Optional _numReasks: number

The total number of reasks allowed; user provided or defaulted.

### _prompt

• Protected Optional _prompt: string

The prompt.

### _promptParams

• Protected Optional _promptParams: Dictionary

The parameters provided by the user that will be formatted into the final LLM prompt.
## Accessors

### args

• get args(): undefined | any[]

#### Returns

undefined | any[]

### fullSchemaReask

• get fullSchemaReask(): undefined | boolean

#### Returns

undefined | boolean

Inherited from Inputs.fullSchemaReask

### instructions

• get instructions(): undefined | string

The instructions string as provided by the user.

#### Returns

undefined | string

Inherited from Inputs.instructions

### kwargs

• get kwargs(): undefined | Dictionary

#### Returns

undefined | Dictionary

### llmOutput

• get llmOutput(): undefined | string

#### Returns

undefined | string

Inherited from Inputs.llmOutput

### metadata

• get metadata(): undefined | Dictionary

#### Returns

undefined | Dictionary

Inherited from Inputs.metadata

### msgHistory

• get msgHistory(): undefined | Dictionary[]

#### Returns

undefined | Dictionary[]

Inherited from Inputs.msgHistory

### numReasks

• get numReasks(): undefined | number

#### Returns

undefined | number

Inherited from Inputs.numReasks

### prompt

• get prompt(): undefined | string

The prompt string as provided by the user.

#### Returns

undefined | string

Inherited from Inputs.prompt

### promptParams

• get promptParams(): undefined | Dictionary

#### Returns

undefined | Dictionary

Inherited from Inputs.promptParams
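The accessor pattern above is uniform: each getter exposes the corresponding optional private field, returning `undefined` whenever that input was never supplied. A small self-contained sketch of the pattern (a stand-in class, not the real `CallInputs`):

```typescript
// Illustrative stand-in: each getter exposes an optional private field,
// mirroring the documented CallInputs accessors.
class AccessorSketch {
  constructor(
    private _prompt?: string,
    private _numReasks?: number
  ) {}

  get prompt(): undefined | string {
    return this._prompt;
  }

  get numReasks(): undefined | number {
    return this._numReasks;
  }
}

// Only the prompt is supplied; every other accessor yields undefined.
const onlyPrompt = new AccessorSketch("Summarize the document.");
console.log(onlyPrompt.prompt);    // "Summarize the document."
console.log(onlyPrompt.numReasks); // undefined
```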
## Methods

### fromPyCallInputs

▸ fromPyCallInputs(pyCallInputs): Promise<CallInputs>

#### Parameters

Name | Type |
---|---|
pyCallInputs | any |

#### Returns

Promise<CallInputs>

### fromPyInputs

▸ fromPyInputs(pyInputs): Promise<Inputs>

#### Parameters

Name | Type |
---|---|
pyInputs | any |

#### Returns

Promise<Inputs>
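Both factories are asynchronous and accept an untyped Python-side object. The sketch below illustrates the general shape of such a conversion only; the stand-in interface replaces the real class, and the snake_case source field names are assumptions, not the library's actual bridge logic:

```typescript
type Dictionary = Record<string, any>;

// Stand-in for the documented CallInputs shape (illustrative only).
interface CallInputsLike {
  llmOutput?: string;
  prompt?: string;
  numReasks?: number;
  kwargs?: Dictionary;
}

// Hypothetical converter: the real fromPyCallInputs bridges a Python
// object; the snake_case fields read here are assumptions.
async function fromPyCallInputsSketch(py: any): Promise<CallInputsLike> {
  return {
    llmOutput: py.llm_output,
    prompt: py.prompt,
    numReasks: py.num_reasks,
    kwargs: py.kwargs,
  };
}

fromPyCallInputsSketch({ prompt: "Hello", num_reasks: 1 }).then((ci) => {
  console.log(ci.prompt, ci.numReasks); // Hello 1
});
```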