does not work out of the box #2
Comments
I have the same issue... I have manually added all the needed libraries but I get the same error.

Update: I have done a little research. In llama-index, ServiceContext has to be migrated to Settings, see: https://docs.llamaindex.ai/en/stable/module_guides/supporting_modules/service_context_migration/ This is because llama-index is now at version 0.10. Maybe the developer can put version numbers in the requirements file so we install the correct versions.
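For anyone hitting the same ServiceContext error, here is a minimal sketch of what that migration looks like, assuming app.py currently builds a ServiceContext from an LLM object and the default local embedding model (the names below are illustrative, not the repo's actual code):

    # Old style (llama-index < 0.10), now deprecated:
    #   from llama_index import ServiceContext
    #   service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

    # New style (llama-index >= 0.10): configure the global Settings object once,
    # then drop the service_context=... argument from index and query calls.
    from llama_index.core import Settings

    Settings.llm = llm              # whatever LLM object app.py already constructs
    Settings.embed_model = "local"  # or an explicit embedding object

Alternatively, pinning llama-index to a pre-0.10 release in requirements.txt would let the original ServiceContext code run unchanged.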
Hi, I also have problems running it according to the Readme instructions; I'd appreciate any help.
pip install -r requirements.txt fails with:

ERROR: Could not find a version that satisfies the requirement pywin32==306 (from versions: none)

I commented out that line in requirements.txt. Now I get:

During handling of the above exception, another exception occurred:
Traceback (most recent call last):
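The pywin32 failure is expected on macOS/Linux, since that package only ships Windows wheels. Rather than deleting the line, one option is a PEP 508 environment marker in requirements.txt so the pin only applies on Windows (a suggestion, not something the repo currently does):

    pywin32==306; sys_platform == "win32"

With the marker, pip simply skips the requirement on other platforms instead of erroring out.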
Hi, firstly, your video instructions and explanation were easy to follow, well done on that note. I too ran the 'out of the box' package install with no errors encountered. But when I ran app.py, a whole bunch of errors were presented. Any help to resolve them would be most appreciated. The errors are as follows:

pygame 2.5.2 (SDL 2.28.3, Python 3.12.2)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
PS C:\Users\kashm\voice_assistant_llm>
The code and library are fiddly and do not work as expected:
File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\deprecated\classic.py", line 285, in wrapper_function
return wrapped_(*args_, **kwargs_)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\core\service_context.py", line 200, in from_defaults
embed_model = resolve_embed_model(embed_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\core\embeddings\utils.py", line 110, in resolve_embed_model
embed_model = HuggingFaceEmbedding(
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\embeddings\huggingface\base.py", line 84, in init
raise ValueError("The
model_name
argument must be provided.")ValueError: The
model_name
argument must be provided.a lot of errors from python library
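The ValueError at the bottom of this traceback comes from resolve_embed_model constructing a HuggingFaceEmbedding without a model name, which likely points to a mismatch between llama-index core and the llama-index-embeddings-huggingface package. One way to sidestep it on llama-index >= 0.10 is to build the embedding explicitly and register it on Settings; a sketch, where the BAAI model name is only an example and not necessarily what this repo intends:

    # assumes: pip install llama-index llama-index-embeddings-huggingface
    from llama_index.core import Settings
    from llama_index.embeddings.huggingface import HuggingFaceEmbedding

    # Passing model_name explicitly avoids the
    # "The `model_name` argument must be provided." ValueError.
    Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")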