Make this into a separate helper function that makes the two checks explicit:

1. The inference options passed to `config.run()` have streaming enabled, i.e. `config.run(prompt_name, options=InferenceOptions(stream=True, stream_callback=my_stream_callback))`
2. The `stream` option in the completion params (which come from the prompt settings) is not explicitly set to `false`
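A minimal sketch of what such a helper might look like (the name `should_stream`, the stand-in `InferenceOptions` dataclass, and the dict shape of `completion_params` are illustrative assumptions, not the actual aiconfig API):

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class InferenceOptions:
    # Stand-in for aiconfig's InferenceOptions (illustrative only)
    stream: bool = False
    stream_callback: Optional[Callable] = None


def should_stream(options: Optional[InferenceOptions], completion_params: dict) -> bool:
    """Return True only if both streaming checks pass."""
    # Check 1: the caller opted into streaming via InferenceOptions
    caller_wants_stream = options is not None and options.stream
    # Check 2: the prompt's completion params do not explicitly disable streaming
    not_disabled = completion_params.get("stream") is not False
    return caller_wants_stream and not_disabled


# Example usage
opts = InferenceOptions(stream=True, stream_callback=print)
print(should_stream(opts, {}))                 # True
print(should_stream(opts, {"stream": False}))  # False
print(should_stream(None, {}))                 # False
```

Keeping both checks in one named function means every call site reads as a single intent (`if should_stream(...)`) instead of repeating the two conditions inline.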
Testing
You will first need to install this repo locally in editable mode: from the /aiconfig repo root, run `pip install -e .`. Then run `pip list | grep aiconfig` and make sure that `python-aiconfig` links to your local path.
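To double-check that the editable install resolved to your local checkout, a quick stdlib-only sketch (assumes the installed package imports as `aiconfig`):

```python
import importlib.util

# For an editable install, the module's origin should point into your
# local clone, not into site-packages.
spec = importlib.util.find_spec("aiconfig")
if spec is None:
    print("aiconfig is not installed")
else:
    print(spec.origin)
```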
Please follow instructions similar to #851 to test
rossdanlm changed the title from "Make should_stream logic into a helper function" to "Move should_stream logic into a helper function" on Jan 10, 2024
Hey @deepanshu-byte, I'm super sorry for the late reply; I just missed this and am going through all the tasks now. Please feel free to message me ("Rossdan Craig" on Messenger) if I miss any messages from you.
Where would you recommend defining the new function, and what should be its scope within the project?
Are there specific test cases expected for the new function I'll be creating?
I think simply manually testing the flow that I outlined in #851 should be sufficient! However, please note that you will need to run `pip3 install -r <filename>`, where `<filename>` is the path to this file: https://github.com/lastmile-ai/aiconfig/blob/main/extensions/HuggingFace/python/requirements.txt. Also, we've made a few modifications to the editor since then, so you should follow the test plan from #1245 instead (make sure to run `yarn && yarn build` from the aiconfig/python/src/editor/client directory first!).
One final note: I'd like us to update EVERY relevant call site that performs this check! There are a lot, so we can do that in a follow-up PR after the first refactoring is complete!
Once again, I'm super sorry for the late reply. If you've become busy since then, no worries; please feel free to message me on Messenger if you have any other questions!