
[typescript] Use OutputDataWithValue type for function calls #610

Merged

merged 1 commit into main from pr610 on Dec 27, 2023

Conversation

rossdanlm
Contributor

@rossdanlm rossdanlm commented Dec 25, 2023

[typescript] Use OutputDataWithValue type for function calls

After discussion and based on review comments, we are keeping the pure-string output form as is, but we still need to support non-pure-string output types: specifically images (which we don't yet have in TypeScript) and function calls.

I didn't need to update the Hugging Face files because those already return pure text.
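As a rough sketch of the idea (the field names below are illustrative assumptions for this sketch, not necessarily the exact schema this PR lands): a pure string stays a bare string, while a function call gets wrapped in an `OutputDataWithValue`-style object.

```typescript
// Hypothetical sketch only: field names like `kind` and `value` are
// assumptions, not the confirmed aiconfig schema.
type FunctionCallData = {
  name: string; // function the model wants to call
  arguments: string; // JSON-encoded arguments string from the model
};

type OutputDataWithValue = {
  kind: "function_call";
  value: FunctionCallData;
};

// Pure text stays a bare string (unchanged behavior);
// non-text outputs such as function calls get wrapped.
type Output = string | OutputDataWithValue;

function renderOutput(output: Output): string {
  if (typeof output === "string") {
    return output; // pure string passes through as before
  }
  return `${output.value.name}(${output.value.arguments})`;
}
```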

## Test plan
Pass automated tests, running these commands from the `aiconfig` top-level dir:
```
npx ts-node typescript/demo/function-call-stream.ts
npx ts-node typescript/demo/demo.ts
npx ts-node typescript/demo/test-hf.ts
```

Also run `yarn test` from the `typescript` dir.

rossdanlm pushed a commit that referenced this pull request Dec 25, 2023
This comes after Sarmad's schema updates in #589. To keep diffs small and easier to review, this simply converts from model-specific outputs --> pure text. I have a diff in #610 which converts from pure text --> `OutputData` format.

We only needed to update `hf.py` and `openai.py`, because `palm.py` already returns output as `string | null`.

Ran the yarn automated tests, but there aren't any specifically for openai. I'm not sure how to run the `demo.ts` file, which would also be a reasonable check that everything there works.

For the extensions, we only have TypeScript for `hf.ts` (trivial: just changed `response` to `response.generated_text`), while `llama.ts` already outputs text, so no changes were needed.
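As a hedged sketch of the `hf.ts` change described above (the response type here is an assumption based on the `generated_text` field the message names), the parser keeps only the text rather than the whole provider response:

```typescript
// Sketch under assumptions: the exact response type in hf.ts may differ;
// `generated_text` is the field named in the commit message above.
interface TextGenerationResponse {
  generated_text: string;
}

function toOutputText(response: TextGenerationResponse): string {
  // before: output data held the model-specific `response` object
  // after:  output data holds only the pure text
  return response.generated_text;
}
```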
@rossdanlm rossdanlm marked this pull request as ready for review December 25, 2023 23:02
rossdanlm added a commit that referenced this pull request Dec 27, 2023
[typescript] Save output.data with text content instead of response data (#603)

This comes after Sarmad's schema updates in #589. To keep diffs small and easier to review, this simply converts from model-specific outputs --> pure text. I have a diff in #610 which converts from pure text --> `OutputData` format.

We only needed to update `hf.py` and `openai.py`, because `palm.py` already returns output as `string | null`.

Ran the yarn automated tests, but there aren't any specifically for openai. I also ran the TypeScript demos to make sure that they still work. Run these commands from the `aiconfig` top-level dir:
```
npx ts-node typescript/demo/function-call-stream.ts
npx ts-node typescript/demo/demo.ts
npx ts-node typescript/demo/test-hf.ts
```

For the extensions, we only have TypeScript for `hf.ts` (trivial: just changed `response` to `response.generated_text`), while `llama.ts` already outputs text, so no changes were needed.

## TODO
I still need to add function call support directly to the `OutputData` format. See
@rossdanlm rossdanlm marked this pull request as draft December 27, 2023 06:41
@rossdanlm rossdanlm force-pushed the pr610 branch 3 times, most recently from 9b6fa5a to e8223f3 on December 27, 2023 15:13
@rossdanlm rossdanlm force-pushed the pr610 branch 2 times, most recently from 4eb3e9d to 90f4c1d on December 27, 2023 15:50
@rossdanlm rossdanlm changed the title from "[typescript] Save output.data with OutputData type instead of pure string" to "[typescript] Use OutputDataWithValue type for function calls" on Dec 27, 2023
@rossdanlm rossdanlm force-pushed the pr610 branch 2 times, most recently from e5849e5 to 2aea134 on December 27, 2023 15:55
@rossdanlm rossdanlm marked this pull request as ready for review December 27, 2023 15:56
rossdanlm added a commit that referenced this pull request Dec 27, 2023
Update output schema to not use {"string": string_value} as output data (#637)

See comment in #610 (comment)

Now we're just going to directly check for the type being a pure string rather than embedding it within the `OutputData` object. I also renamed `OutputData` --> `OutputDataWithValue` to make this more explicit and easier to understand.

Rebuilt the schema using `yarn gen-schema` from the `aiconfig/typescript` dir.

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/637).
* #610
* __->__ #637
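A minimal sketch of the check this describes, assuming a simplified shape for the renamed type (the real `OutputDataWithValue` in the schema may carry more variants):

```typescript
// Simplified assumption of the renamed type; the actual
// OutputDataWithValue in the schema may have more variants.
type OutputDataWithValue = { kind: string; value: unknown };

function getTextOutput(data: string | OutputDataWithValue): string | null {
  // Directly check for the pure-string case first, instead of
  // expecting strings to be embedded in an OutputData object.
  if (typeof data === "string") {
    return data;
  }
  return typeof data.value === "string" ? data.value : null;
}
```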
Contributor

@saqadri saqadri left a comment

A few subtle bugs, so requesting changes. Mostly looks good.

Contributor

@saqadri saqadri left a comment

Approving to unblock after the comments are addressed.

@rossdanlm
Contributor Author

Wow, ok, finally done addressing feedback and ready to land.

@rossdanlm rossdanlm merged commit f35d6a4 into main Dec 27, 2023
3 checks passed
@rossdanlm rossdanlm deleted the pr610 branch December 27, 2023 20:48
rossdanlm added a commit that referenced this pull request Dec 28, 2023
[python] Use OutputDataWithValue type for function calls

This is extending #605 where I converted from model-specific response types --> pure strings. Now I'm going to also support function calls --> OutputData types.

Same as the last diff (#610), except now for Python files instead of TypeScript.

Don't need to do this for the image outputs since I already did that in #608.

Actually, like TypeScript, the only model parser we need to support for function calling is `openai.py`.
Victor-Su-Ortiz pushed a commit to Victor-Su-Ortiz/aiconfig that referenced this pull request Jan 4, 2024
Victor-Su-Ortiz pushed a commit to Victor-Su-Ortiz/aiconfig that referenced this pull request Jan 4, 2024
Victor-Su-Ortiz pushed a commit to Victor-Su-Ortiz/aiconfig that referenced this pull request Jan 4, 2024