# [editor] HuggingFaceTextGenerationParserPromptSchema

Adding the PromptSchema for HuggingFaceTextGenerationParser, covering the supported properties from `refine_chat_completion_params` in the parser implementation. Types are taken from https://github.com/huggingface/huggingface_hub/blob/a331e82aad1bc63038194611236db28fa013814c/src/huggingface_hub/inference/_client.py#L1206, and defaults from https://huggingface.co/docs/api-inference/detailed_parameters where they are listed.
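For illustration, the allowlist behavior that the schema mirrors can be sketched in TypeScript (a hedged sketch only: the real `refine_chat_completion_params` is a Python function in the parser, and `refineCompletionParams` below is a hypothetical name; the key list matches the schema properties in this diff):

```typescript
// Illustrative sketch only: keep just the settings the text-generation
// client accepts, dropping everything else from the prompt's model_settings.
const SUPPORTED_KEYS = new Set([
  "model", "temperature", "top_k", "top_p", "details", "stream",
  "do_sample", "max_new_tokens", "best_of", "repetition_penalty",
  "return_full_text", "seed", "stop_sequences", "truncate",
  "typical_p", "watermark",
]);

function refineCompletionParams(
  settings: Record<string, unknown>
): Record<string, unknown> {
  const refined: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(settings)) {
    if (SUPPORTED_KEYS.has(key) && value !== undefined) {
      refined[key] = value;
    }
  }
  return refined;
}

// Unsupported keys (e.g. "frequency_penalty") are dropped.
console.log(refineCompletionParams({ temperature: 0.5, frequency_penalty: 1 }));
// { temperature: 0.5 }
```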

<img width="1069" alt="Screenshot 2024-01-04 at 2 58 26 PM" src="https://github.com/lastmile-ai/aiconfig/assets/5060851/c049d5cb-36a5-4aea-af90-df77692519c6">

Note: this is for the default/core parser, which uses the Inference API. We'll still need to add prompt schemas for the gradio extension models.

Will update the HF prompt UX to support the actual 'model' setting in a subsequent diff.
Ryan Holinshead committed Jan 4, 2024
1 parent 86890cd commit cffb9f2
Showing 2 changed files with 75 additions and 1 deletion.
@@ -0,0 +1,73 @@
import { PromptSchema } from "../../utils/promptUtils";

export const HuggingFaceTextGenerationParserPromptSchema: PromptSchema = {
  // See https://github.com/huggingface/huggingface_hub/blob/a331e82aad1bc63038194611236db28fa013814c/src/huggingface_hub/inference/_client.py#L1206
  // for settings and https://huggingface.co/docs/api-inference/detailed_parameters for defaults.
  // The settings below are the supported settings specified in the
  // HuggingFaceTextGenerationParser refine_chat_completion_params implementation.
  input: {
    type: "string",
  },
  model_settings: {
    type: "object",
    properties: {
      model: {
        type: "string",
      },
      temperature: {
        type: "number",
        minimum: 0,
        maximum: 1,
      },
      top_k: {
        type: "integer",
      },
      top_p: {
        type: "number",
        minimum: 0,
        maximum: 1,
      },
      details: {
        type: "boolean",
      },
      stream: {
        type: "boolean",
      },
      do_sample: {
        type: "boolean",
      },
      max_new_tokens: {
        type: "integer",
      },
      best_of: {
        type: "integer",
      },
      repetition_penalty: {
        type: "number",
        minimum: 0,
        maximum: 1,
      },
      return_full_text: {
        type: "boolean",
      },
      seed: {
        type: "integer",
      },
      stop_sequences: {
        type: "array",
        items: {
          type: "string",
        },
      },
      truncate: {
        type: "integer",
      },
      typical_p: {
        type: "number",
      },
      watermark: {
        type: "boolean",
      },
    },
  },
};
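As a usage sketch (assumed, not part of this diff): a `model_settings` object that satisfies the bounds declared above, with a small range check mirroring the schema's `minimum`/`maximum` fields. The model name and the `validateSettings` helper are hypothetical, for illustration only:

```typescript
// Hypothetical settings shape covering a subset of the schema's properties.
interface TextGenerationSettings {
  model?: string;
  temperature?: number;
  top_p?: number;
  max_new_tokens?: number;
  repetition_penalty?: number;
  stop_sequences?: string[];
}

// A value is valid if absent or within the schema's [min, max] bounds.
function inRange(value: number | undefined, min: number, max: number): boolean {
  return value === undefined || (value >= min && value <= max);
}

// Minimal range check mirroring the minimum/maximum fields declared above.
function validateSettings(s: TextGenerationSettings): boolean {
  return (
    inRange(s.temperature, 0, 1) &&
    inRange(s.top_p, 0, 1) &&
    inRange(s.repetition_penalty, 0, 1)
  );
}

// Example model name is a placeholder, not taken from the diff.
const settings: TextGenerationSettings = {
  model: "some-org/some-text-generation-model",
  temperature: 0.7,
  top_p: 0.9,
  max_new_tokens: 250,
  stop_sequences: ["\n\n"],
};

console.log(validateSettings(settings)); // true
```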
3 changes: 2 additions & 1 deletion python/src/aiconfig/editor/client/src/utils/promptUtils.ts
@@ -4,6 +4,7 @@ import { OpenAIChatVisionModelParserPromptSchema } from "../shared/prompt_schema
import { DalleImageGenerationParserPromptSchema } from "../shared/prompt_schemas/DalleImageGenerationParserPromptSchema";
import { PaLMTextParserPromptSchema } from "../shared/prompt_schemas/PaLMTextParserPromptSchema";
import { PaLMChatParserPromptSchema } from "../shared/prompt_schemas/PaLMChatParserPromptSchema";
import { HuggingFaceTextGenerationParserPromptSchema } from "../shared/prompt_schemas/HuggingFaceTextGenerationParserPromptSchema";

/**
* Get the name of the model for the specified prompt. The name will either be specified in the prompt's
@@ -68,7 +69,7 @@ export const PROMPT_SCHEMAS: Record<string, PromptSchema> = {
"dall-e-3": DalleImageGenerationParserPromptSchema,

// HuggingFaceTextGenerationParser
// "HuggingFaceTextGenerationParser":
HuggingFaceTextGenerationParser: HuggingFaceTextGenerationParserPromptSchema,

// PaLMTextParser
"models/text-bison-001": PaLMTextParserPromptSchema,
