[typescript] Use OutputDataWithValue type for function calls #610
[typescript] Save output.data with text content instead of response data (#603)

This comes after Sarmad's schema updates in #589. To keep diffs small and easier to review, this simply converts from model-specific outputs to pure text. I have a diff in #610 which converts from pure text to the `OutputData` format. We only needed to update `hf.py` and `openai.py`, because `palm.py` already returns output as a `string | null` type.

Ran the yarn automated tests, but there aren't any specifically for openai. I also ran the typescript demos to make sure they still work. Run these commands from the `aiconfig` top-level dir:

```
npx ts-node typescript/demo/function-call-stream.ts
npx ts-node typescript/demo/demo.ts
npx ts-node typescript/demo/test-hf.ts
```

For the extensions, we only have typescript for `hf.ts` (trivial: just changed `response` to `response.generated_text`), while `llama.ts` already outputs text, so no changes needed.

## TODO

I still need to add function call support directly to the `OutputData` format (see #610).
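The `hf.ts` change described above (storing pure text instead of the model-specific response object) can be sketched roughly as follows. The `TextGenerationOutput` and `ExecuteResult` shapes here are simplified assumptions for illustration, not the exact generated schema types:

```typescript
// Simplified sketch: store pure text in output.data instead of the
// model-specific response object. Both interfaces are assumptions.
interface TextGenerationOutput {
  generated_text: string;
}

interface ExecuteResult {
  output_type: "execute_result";
  data: string; // pure text, per this diff
}

function buildOutput(response: TextGenerationOutput): ExecuteResult {
  return {
    output_type: "execute_result",
    // Before: data held the whole model-specific response object.
    // After: just the generated text string.
    data: response.generated_text,
  };
}
```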
See the comment in #610 (comment). Now we're just going to directly check for the type being a pure string, not embedded within the `OutputData` object. I also renamed `OutputData` to `OutputDataWithValue` to make this more explicit and easier to understand. Rebuilt the schema using `yarn gen-schema` from the `aiconfig/typescript` dir.
Update output schema to not use `{"string": string_value}` as output data (#637)

---

Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/637).
* #610
* __->__ #637
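The "directly check for the type being a pure string" logic can be sketched like this. The `OutputDataWithValue` shape below is a simplified assumption, not the exact schema generated by `yarn gen-schema`:

```typescript
// Simplified sketch: output data may be a plain string or an
// OutputDataWithValue-style object. Check the pure-string case directly
// instead of wrapping strings as { "string": value }.
type OutputDataWithValue = { kind: string; value: unknown };
type OutputData = string | OutputDataWithValue;

function getOutputText(data: OutputData): string | undefined {
  if (typeof data === "string") {
    // Pure string form is stored as-is, with no wrapper object.
    return data;
  }
  // Non-string forms (e.g. function calls) carry a kind/value pair;
  // only return value when it happens to be text.
  return typeof data.value === "string" ? data.value : undefined;
}
```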
Mostly looks good, but a few subtle bugs, so requesting changes.
Approving to unblock after the comments are addressed
After discussion and based on comments, we are going to keep the pure string form as-is, but we still need to make sure non-pure-string types are supported, specifically images (which we don't have on ts) and function calls. I didn't need to update the hugging face files because those already output pure text.

## Test plan

Pass the automated tests by running these commands from the `aiconfig` top-level dir:

```
npx ts-node typescript/demo/function-call-stream.ts
npx ts-node typescript/demo/demo.ts
npx ts-node typescript/demo/test-hf.ts
```

Also run `yarn test` from the typescript dir.
Wow, ok, finally done addressing feedback and ready to land.
[python] Use OutputDataWithValue type for function calls

This extends #605, where I converted from model-specific response types to pure strings. Now I'm also going to support converting function calls to `OutputData` types. Same as the last diff (#610), except now for the python files instead of typescript. I don't need to do this for the image outputs since I already did that in #608.

As with typescript, the only model parser we need to support for function calling is `openai.py`.
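Although this diff targets the Python `openai.py`, the idea mirrors the typescript side: wrap a function call from the model response in an `OutputDataWithValue`-style object instead of flattening it to text. A hypothetical sketch (the `kind` name and both interfaces are assumptions, not the real generated types):

```typescript
// Hypothetical sketch of wrapping an OpenAI-style function call in an
// OutputDataWithValue-like object. Shapes and the "tool_calls" kind
// are illustrative assumptions.
interface FunctionCall {
  name: string;
  arguments: string; // JSON-encoded arguments, as OpenAI returns them
}

interface OutputDataWithToolCalls {
  kind: "tool_calls";
  value: FunctionCall[];
}

function wrapFunctionCall(call: FunctionCall): OutputDataWithToolCalls {
  // Keep the structured call intact rather than serializing to text,
  // so consumers can distinguish function calls from plain strings.
  return { kind: "tool_calls", value: [call] };
}
```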