[editor] Model Selector #662
```diff
@@ -1,7 +1,7 @@
 import PromptContainer from "./prompt/PromptContainer";
 import { Container, Group, Button, createStyles, Stack } from "@mantine/core";
 import { showNotification } from "@mantine/notifications";
-import { AIConfig, Prompt, PromptInput } from "aiconfig";
+import { AIConfig, ModelMetadata, Prompt, PromptInput } from "aiconfig";
 import { useCallback, useMemo, useReducer, useRef, useState } from "react";
 import aiconfigReducer, { AIConfigReducerAction } from "./aiconfigReducer";
 import {
```
```diff
@@ -28,6 +28,10 @@ export type AIConfigCallbacks = {
   getModels: (search: string) => Promise<string[]>;
   runPrompt: (promptName: string) => Promise<void>;
   save: (aiconfig: AIConfig) => Promise<void>;
+  updateModel: (
+    promptName?: string,
+    modelData?: string | ModelMetadata
+  ) => Promise<void /*{ aiconfig: AIConfig }*/>;
   updatePrompt: (
     promptName: string,
     promptData: Prompt
```

**Reviewer** (on the `updateModel` return type): Why not return aiconfig?

**Author:** It will once the endpoint is implemented -- right now there's no updated aiconfig to return.
```diff
@@ -150,18 +154,63 @@ export default function EditorContainer({
     [dispatch]
   );

+  const debouncedUpdateModel = useMemo(
+    () =>
+      debounce(
+        (promptName?: string, modelMetadata?: string | ModelMetadata) =>
+          callbacks.updateModel(promptName, modelMetadata),
+        250
+      ),
+    [callbacks.updateModel]
+  );
```

**Reviewer** (on the `250` debounce delay): Should we define this as a const somewhere? I see it's also 250 for

**Author:** I can pull it out to a const for now in a separate PR, but it could technically be dependent on where it's used.
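The review thread above suggests pulling the repeated 250ms delay into a shared constant. A minimal sketch of that refactor, assuming a hand-rolled trailing-edge debounce (the PR's actual `debounce` import is not shown in this diff, and the name `DEBOUNCE_MS` is an invention):

```typescript
// Hypothetical shared constant for all debounced editor callbacks.
const DEBOUNCE_MS = 250;

// Minimal trailing-edge debounce, typed like the usage in the diff:
// repeated calls within `delayMs` collapse into one trailing call.
function debounce<T extends unknown[]>(
  fn: (...args: T) => void,
  delayMs: number = DEBOUNCE_MS
): (...args: T) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    if (timer !== undefined) {
      clearTimeout(timer);
    }
    timer = setTimeout(() => fn(...args), delayMs);
  };
}
```

Centralizing the delay keeps `debouncedUpdateModel` and `debouncedUpdatePrompt` in sync, while still allowing a per-call-site override via the second argument, which addresses the author's point that the value could depend on where it's used.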
```diff
   const onUpdatePromptModelSettings = useCallback(
     async (promptIndex: number, newModelSettings: any) => {
       dispatch({
         type: "UPDATE_PROMPT_MODEL_SETTINGS",
         index: promptIndex,
         modelSettings: newModelSettings,
       });
-      // TODO: Call server-side endpoint to update model settings
+      // TODO: Call server-side endpoint to update model
     },
     [dispatch]
   );
```

**Author** (on the TODO): Not setting this up just yet (unlike model name below) since there is a higher potential for bugs in managing the settings object & don't want to add friction to whomever is adding the server-side endpoint. I'll complete this once the endpoint is implemented.
```diff
+  const onUpdatePromptModel = useCallback(
+    async (promptIndex: number, newModel?: string) => {
+      dispatch({
+        type: "UPDATE_PROMPT_MODEL",
+        index: promptIndex,
+        modelName: newModel,
+      });
+
+      try {
+        const prompt = clientPromptToAIConfigPrompt(
+          aiconfigState.prompts[promptIndex]
+        );
+        const currentModel = prompt.metadata?.model;
+        let modelData: string | ModelMetadata | undefined = newModel;
+        if (newModel && currentModel && typeof currentModel !== "string") {
+          modelData = {
+            ...currentModel,
+            name: newModel,
+          };
+        }
+
+        await debouncedUpdateModel(prompt.name, modelData);
+
+        // TODO: Consolidate
+      } catch (err: any) {
+        showNotification({
+          title: "Error updating prompt model",
+          message: err.message,
+          color: "red",
+        });
+      }
+    },
+    [dispatch, debouncedUpdateModel]
+  );
```

**Reviewer** (on the `typeof currentModel !== "string"` check): I think with the new SDK you won't have to do individual checks because we're passing

**Author:** Ah, ok - for the updated config it seems like we can't support the following scenario: On top of that, if we change the model and it did already have settings set, we would ideally want to maintain only the settings that are supported by the new model (e.g. say temperature, topK are used by both models) as defined in the prompt schema that is associated with the model provider, and drop anything not supported. This might need to wait until post-MVP, though. For now we can either clear the settings or keep them entirely... I was doing the former here, but open to opinions.

**Reviewer** (on the `showNotification` error handling): Would it be possible to read the http message that we return and display it? Can be done in a future PR.

**Reviewer:** Oh I guess it's done here?
```diff
   const onUpdatePromptParameters = useCallback(
     async (promptIndex: number, newParameters: any) => {
       dispatch({
```
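The review thread above floats a post-MVP idea: when the model changes, keep only the settings that the new model's prompt schema supports and drop the rest. A hedged sketch of that consolidation step, where `supportedKeys` stands in for the schema lookup (not an actual aiconfig API):

```typescript
type JSONObject = Record<string, unknown>;

// Keep only the settings keys the new model's schema supports;
// drop everything else (e.g. a provider-specific option).
function consolidateSettings(
  settings: JSONObject,
  supportedKeys: Set<string>
): JSONObject {
  const kept: JSONObject = {};
  for (const [key, value] of Object.entries(settings)) {
    if (supportedKeys.has(key)) {
      kept[key] = value;
    }
  }
  return kept;
}
```

With this, switching between two models that both support `temperature` and `topK` would preserve those values while dropping anything unsupported — a middle ground between the "clear everything" and "keep everything" options discussed in the thread.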
```diff
@@ -241,8 +290,6 @@ export default function EditorContainer({

   const { classes } = useStyles();

-  // TODO: Implement editor context for callbacks, readonly state, etc.
-
   return (
     <>
       <Container maw="80rem">
```
@@ -262,9 +309,11 @@ export default function EditorContainer({ | |
<PromptContainer | ||
index={i} | ||
prompt={prompt} | ||
getModels={callbacks.getModels} | ||
onChangePromptInput={onChangePromptInput} | ||
onChangePromptName={onChangePromptName} | ||
onRunPrompt={onRunPrompt} | ||
onUpdateModel={onUpdatePromptModel} | ||
onUpdateModelSettings={onUpdatePromptModelSettings} | ||
Comment on lines
+316
to
317
There was a problem hiding this comment. Choose a reason for hiding this commentThe reason will be displayed to describe this comment to others. Learn more. Oh ok cool, so we are keeping them as separate actions? There was a problem hiding this comment. Choose a reason for hiding this commentThe reason will be displayed to describe this comment to others. Learn more. Ya, we should keep them separate since there's different logic for them:
|
||
onUpdateParameters={onUpdatePromptParameters} | ||
defaultConfigModelName={aiconfigState.metadata.default_model} | ||
|
```diff
@@ -10,6 +10,7 @@ export type MutateAIConfigAction =
   | AddPromptAction
   | UpdatePromptInputAction
   | UpdatePromptNameAction
+  | UpdatePromptModelAction
   | UpdatePromptModelSettingsAction
   | UpdatePromptParametersAction;
```
```diff
@@ -37,6 +38,12 @@ export type UpdatePromptNameAction = {
   name: string;
 };

+export type UpdatePromptModelAction = {
+  type: "UPDATE_PROMPT_MODEL";
+  index: number;
+  modelName?: string;
+};
+
 export type UpdatePromptModelSettingsAction = {
   type: "UPDATE_PROMPT_MODEL_SETTINGS";
   index: number;
```

**Reviewer** (on `UpdatePromptModelAction`): Is this being used anywhere in this PR? Does the dispatch type being set to

**Author:** When we dispatch the "UPDATE_PROMPT_MODEL" action, that action is implicitly typed as this

**Reviewer:** Nice, thanks for the explanation!
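The author's explanation above — that the dispatched object is implicitly typed by its `type` tag — is standard TypeScript discriminated-union narrowing. A self-contained sketch of the mechanism (the two action shapes mirror the diff; the `describe` helper is illustrative only, not part of the PR):

```typescript
type UpdatePromptModelAction = {
  type: "UPDATE_PROMPT_MODEL";
  index: number;
  modelName?: string;
};

type UpdatePromptNameAction = {
  type: "UPDATE_PROMPT_NAME";
  index: number;
  name: string;
};

type MutateAIConfigAction = UpdatePromptModelAction | UpdatePromptNameAction;

// Inside each case, TypeScript narrows `action` to the matching member,
// so `modelName` is only accessible in the UPDATE_PROMPT_MODEL branch
// and `name` only in the UPDATE_PROMPT_NAME branch.
function describe(action: MutateAIConfigAction): string {
  switch (action.type) {
    case "UPDATE_PROMPT_MODEL":
      return `model -> ${action.modelName ?? "(cleared)"}`;
    case "UPDATE_PROMPT_NAME":
      return `name -> ${action.name}`;
  }
}
```

This is why `dispatch({ type: "UPDATE_PROMPT_MODEL", ... })` needs no explicit cast: the literal `type` field selects the union member, and the compiler checks the remaining properties against it.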
```diff
@@ -134,6 +141,20 @@ export default function aiconfigReducer(
         name: action.name,
       }));
     }
+    case "UPDATE_PROMPT_MODEL": {
+      return reduceReplacePrompt(state, action.index, (prompt) => ({
+        ...prompt,
+        metadata: {
+          ...prompt.metadata,
+          model: action.modelName
+            ? {
+                name: action.modelName,
+                // TODO: Consolidate settings based on schema union
+              }
+            : undefined,
+        },
+      }));
+    }
     case "UPDATE_PROMPT_MODEL_SETTINGS": {
       return reduceReplacePrompt(state, action.index, (prompt) => ({
         ...prompt,
```

**Reviewer** (on lines +149 to +154): Hopefully don't need to do this logic with updated SDK.

**Author:** I'll revisit this once the endpoint is implemented. We still need some UI state change here (this happens before the request is sent and before the response is returned, to ensure responsive UI). Probably what will need to be done is:
```diff
@@ -1,7 +1,7 @@
 import { ActionIcon, Menu, TextInput } from "@mantine/core";
 import { IconPlus, IconSearch, IconTextCaption } from "@tabler/icons-react";
-import { memo, useCallback, useEffect, useState } from "react";
-import { showNotification } from "@mantine/notifications";
+import { memo, useCallback, useState } from "react";
+import useLoadModels from "../../hooks/useLoadModels";

 type Props = {
   addPrompt: (prompt: string) => void;
```

**Reviewer** (on the `useLoadModels` import): Nice, much cleaner.
```diff
@@ -41,30 +41,14 @@ function ModelMenuItems({

 export default memo(function AddPromptButton({ addPrompt, getModels }: Props) {
   const [modelSearch, setModelSearch] = useState("");
-  const [models, setModels] = useState<string[]>([]);
   const [isOpen, setIsOpen] = useState(false);

-  const loadModels = useCallback(async (modelSearch: string) => {
-    try {
-      const models = await getModels(modelSearch);
-      setModels(models);
-    } catch (err: any) {
-      showNotification({
-        title: "Error loading models",
-        message: err?.message,
-        color: "red",
-      });
-    }
-  }, []);
-
   const onAddPrompt = useCallback((model: string) => {
     addPrompt(model);
     setIsOpen(false);
   }, []);

-  useEffect(() => {
-    loadModels(modelSearch);
-  }, [loadModels, modelSearch]);
+  const models = useLoadModels(modelSearch, getModels);

   return (
     <Menu
```

**Reviewer** (on `modelSearch`): Can you clarify what

**Author:** There's a search input and
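The extracted `useLoadModels` hook itself is not part of this diff. Its core behavior — fetch model ids for a search string and surface failures without letting them escape to the component — can be sketched as a plain async helper (the name `fetchModels` and the `onError` callback are assumptions for illustration; the real hook wires this into React state and a Mantine notification):

```typescript
// Fetch models matching `search`; report failures through `onError`
// and fall back to an empty list so callers always get an array.
async function fetchModels(
  search: string,
  getModels: (search: string) => Promise<string[]>,
  onError: (message: string) => void
): Promise<string[]> {
  try {
    return await getModels(search);
  } catch (err: unknown) {
    onError(err instanceof Error ? err.message : String(err));
    return [];
  }
}
```

Centralizing this in one hook is what makes both `AddPromptButton` and `ModelSelector` "much cleaner", as the reviewer notes: each component gets a `string[]` and never deals with the error path itself.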
```diff
@@ -0,0 +1,82 @@
+import { Autocomplete, AutocompleteItem, Button } from "@mantine/core";
+import { memo, useState } from "react";
+import { getPromptModelName } from "../../utils/promptUtils";
+import { Prompt } from "aiconfig";
+import useLoadModels from "../../hooks/useLoadModels";
+import { IconX } from "@tabler/icons-react";
+
+type Props = {
+  prompt: Prompt;
+  getModels: (search: string) => Promise<string[]>;
+  onSetModel: (model?: string) => void;
+  defaultConfigModelName?: string;
+};
+
+export default memo(function ModelSelector({
+  prompt,
+  getModels,
+  onSetModel,
+  defaultConfigModelName,
+}: Props) {
+  const [selectedModel, setSelectedModel] = useState<string | undefined>(
+    getPromptModelName(prompt, defaultConfigModelName)
+  );
+  const [showAll, setShowAll] = useState(true);
+  const [autocompleteSearch, setAutocompleteSearch] = useState(
+    getPromptModelName(prompt, defaultConfigModelName)
+  );
+
+  const models = useLoadModels(showAll ? "" : autocompleteSearch, getModels);
+
+  const onSelectModel = (model?: string) => {
+    setSelectedModel(model);
+    onSetModel(model);
+  };
+
+  return (
+    <Autocomplete
+      placeholder="Select model"
+      limit={100}
+      maxDropdownHeight={200}
+      rightSection={
+        selectedModel ? (
+          <Button
+            size="xs"
+            variant="subtle"
+            mr={10}
+            onClick={() => {
+              onSelectModel(undefined);
+              setShowAll(true);
+              setAutocompleteSearch("");
+            }}
+          >
+            <IconX size={10} />
+          </Button>
+        ) : null
+      }
+      filter={(searchValue: string, item: AutocompleteItem) => {
+        if (showAll) {
+          return true;
+        }
+
+        const modelName: string = item.value;
+        return modelName
+          .toLocaleLowerCase()
+          .includes(searchValue.toLocaleLowerCase().trim());
+      }}
+      data={models}
+      value={autocompleteSearch}
+      onChange={(value: string) => {
+        setAutocompleteSearch(value);
+        setShowAll(false);
+        onSelectModel(value);
+        models.some((model) => {
+          if (model === value) {
+            setShowAll(true);
+            return true;
+          }
+        });
+      }}
+    />
+  );
+});
```

**Reviewer** (on lines +57 to +65, the `filter` prop): Alright so what's the difference between this vs. the functionality in aiconfig/python/src/aiconfig/editor/client/src/Editor.tsx, lines 31 to 40 in 9ac3b2e?

**Author:** This is specific to this Autocomplete UI component because it needs to handle the search/filtering of the provided data on its own.

**Reviewer** (on the `setShowAll(true)` call in `onChange`): Ok just to clarify, this

**Author:** This is mainly copied from our model selector in Workbooks and is sort of a workaround to match our desired behaviour. By default, the component will only ever show results in the dropdown that match the text in the input box. So, if the aiconfig has model set to gpt-4 when you load it, the dropdown would only ever show models with the gpt-4 substring.

**Reviewer:** Oh, I generally assumed that the user would just click the x button on the right to see everything, hmmmm. I don't think it's a blocker but feels a bit strange to me that auto-completions don't remain filtered after selected. We can do this as P2 someday.
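The interplay between `showAll` and the `filter` prop boils down to a pure predicate, extracted here for clarity (the component inlines this logic rather than defining a named function):

```typescript
// Mirrors the Autocomplete filter in the diff: while showAll is set,
// every item passes (so the full model list is visible); otherwise do a
// case-insensitive substring match against the trimmed search text.
function matchesModel(
  searchValue: string,
  modelName: string,
  showAll: boolean
): boolean {
  if (showAll) {
    return true;
  }
  return modelName
    .toLocaleLowerCase()
    .includes(searchValue.toLocaleLowerCase().trim());
}
```

Typing sets `showAll` to false (narrowing the dropdown to matches), while clearing the selection or typing an exact model name flips it back to true, which is the workaround the review thread describes for showing the full list after a selection.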
**Reviewer** (on the `Promise<void /*{ aiconfig: AIConfig }*/>` return type): Are you commenting this out to add it later once the update_model() api is built on backend?

**Author:** Yep.