
Move should_stream logic into a helper function #861

Open
rossdanlm opened this issue Jan 10, 2024 · 2 comments
rossdanlm commented Jan 10, 2024

Comment from #851:

Make this into a separate helper function so that it's clear what the two checks are:

  1. Check that the inference options passed to config.run() have streaming enabled, i.e.: config.run(prompt_name, options=InferenceOptions(stream=True, stream_callback=my_stream_callback))
  2. Check that the stream option in the completion params (which come from the prompt settings) is not explicitly set to false
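The two checks above could be sketched as a standalone helper, for example (the helper's name/signature and the shape of InferenceOptions here are assumptions for illustration, not aiconfig's actual API):

```python
from typing import Any, Dict, Optional


class InferenceOptions:
    """Minimal stand-in for aiconfig's InferenceOptions (assumed shape)."""

    def __init__(self, stream: bool = False, stream_callback=None):
        self.stream = stream
        self.stream_callback = stream_callback


def should_stream(
    options: Optional[InferenceOptions],
    completion_params: Dict[str, Any],
) -> bool:
    # Check 1: streaming was requested via InferenceOptions on config.run().
    stream_requested = options is not None and bool(getattr(options, "stream", False))
    # Check 2: the completion params (from the prompt settings) do not
    # explicitly set stream to false; absent or true both pass.
    not_disabled = completion_params.get("stream") is not False
    return stream_requested and not_disabled
```

Note that check 2 distinguishes "explicitly false" from "absent": only `stream: false` in the completion params disables streaming once the caller has opted in.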

Testing

You will first need to pip install from this local directory: go to the /aiconfig repo and run pip install -e ., then run pip list | grep aiconfig and confirm that python-aiconfig links to your local path.
Please follow instructions similar to #851 to test.

@rossdanlm rossdanlm changed the title Make should_stream logic into a helper function Move should_stream logic into a helper function Jan 10, 2024
@rossdanlm rossdanlm added type: enhancement New feature or request good first issue Good for newcomers pri: low effort: mid labels Jan 10, 2024
@deepanshu-byte

Hi @rossdanlm, I'm interested in tackling this issue. I have a couple of questions:

  1. Where would you recommend defining the new function, and what should be its scope within the project?
  2. Are there specific test cases expected for the new function I'll be creating?


rossdanlm commented Feb 24, 2024

Hey @deepanshu-byte, I'm super sorry for the late reply; I just missed this and am going through all the tasks now. Please feel free to message me ("Rossdan Craig" on Messenger) if I miss any messages from you.

  1. Where would you recommend defining the new function, and what should be its scope within the project?

Great question! It's acceptable to simply make a utils file next to the parser, but personally I would prefer having this defined on the "ModelParser" base class itself (put it around here: https://github.com/lastmile-ai/aiconfig/blob/main/python/src/aiconfig/model_parser.py#L198), and then we can simply call self.should_stream(options, completion_params)
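To illustrate that placement, here is a hedged sketch of what the base-class method and a call site might look like (the method body, signature, and return values are assumptions; the real base class lives in python/src/aiconfig/model_parser.py):

```python
from typing import Any, Dict, Optional


class ModelParser:
    """Illustrative skeleton only; not the real aiconfig ModelParser."""

    def should_stream(
        self, options: Optional[Any], completion_params: Dict[str, Any]
    ) -> bool:
        # Streaming needs an explicit opt-in via InferenceOptions...
        requested = options is not None and bool(getattr(options, "stream", False))
        # ...and must not be explicitly disabled in the prompt's completion params.
        return requested and completion_params.get("stream") is not False

    def run_inference(
        self,
        options: Optional[Any] = None,
        completion_params: Optional[Dict[str, Any]] = None,
    ) -> str:
        # With the helper on the base class, each subclass call site
        # collapses to a single shared check:
        if self.should_stream(options, completion_params or {}):
            return "streaming"
        return "non-streaming"
```

Putting it on the base class means every model parser subclass inherits one consistent check instead of re-deriving the two conditions inline.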

  2. Are there specific test cases expected for the new function I'll be creating?

I think simply manually testing the flow that I outlined in #851 should be sufficient! Please note that you will need to run pip3 install -r <filename>, where filename is the path to this file: https://github.com/lastmile-ai/aiconfig/blob/main/extensions/HuggingFace/python/requirements.txt. Also, we've made a few modifications to the editor since then, so you should follow the test plan from #1245 instead (make sure to run yarn && yarn build from the aiconfig/python/src/editor/client directory first!)

One final note: I'd like us to update every relevant call site that checks for this! There are a lot, so we can do that in a follow-up PR after the first refactoring is complete.

Once again, I'm super sorry for the late reply. If you've become busy since then, no worries; please feel free to message me on Messenger if you have any other questions!
