does not work out of the box #2

Open
micec24 opened this issue Apr 8, 2024 · 4 comments

Comments

@micec24

micec24 commented Apr 8, 2024

The code and library are fiddly and do not work as expected.

                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\deprecated\classic.py", line 285, in wrapper_function
return wrapped_(*args_, **kwargs_)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\core\service_context.py", line 200, in from_defaults
embed_model = resolve_embed_model(embed_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\core\embeddings\utils.py", line 110, in resolve_embed_model
embed_model = HuggingFaceEmbedding(
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Micec24\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\embeddings\huggingface\base.py", line 84, in init
raise ValueError("The model_name argument must be provided.")
ValueError: The model_name argument must be provided.

A lot of errors from the Python library.

@DENightOne

DENightOne commented Apr 14, 2024

I have the same issue. I manually added all the needed libraries, but I get the same error.

Update: I have done a little research. In llama-index, ServiceContext has to be migrated to Settings; see https://docs.llamaindex.ai/en/stable/module_guides/supporting_modules/service_context_migration/...

This is because llama-index is now version 0.10. Maybe the developer can put version numbers in the requirements file so we install the correct versions.
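For anyone hitting this before the requirements get pinned, here is a minimal sketch of that migration, assuming llama-index >= 0.10 with the llama-index-embeddings-huggingface package installed. The embedding model name below is only an example, not necessarily what this repo uses:

```python
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Old, deprecated style that triggers the model_name error with current packages:
# service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

# New style: configure the global Settings object once, then drop the
# service_context argument from the index/query engine constructors.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
```

Settings.llm can be set the same way for the LLM side.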

@sjimnar

sjimnar commented Apr 24, 2024

Hi, I also have problems running it according to the README instructions; I'd appreciate any help.

python app.py
pygame 2.5.2 (SDL 2.28.3, Python 3.11.6)
Hello from the pygame community. https://www.pygame.org/contribute.html
Traceback (most recent call last):
  File "/Users/irraz/voice_assistant_llm/app.py", line 14, in <module>
    ai_assistant = AIVoiceAssistant()
                   ^^^^^^^^^^^^^^^^^^
  File "/Users/irraz/voice_assistant_llm/rag/AIVoiceAssistant.py", line 17, in __init__
    self._service_context = ServiceContext.from_defaults(llm=self._llm, embed_model="local")
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/irraz/voice_assistant_llm/venv/lib/python3.11/site-packages/deprecated/classic.py", line 285, in wrapper_function
    return wrapped_(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/irraz/voice_assistant_llm/venv/lib/python3.11/site-packages/llama_index/core/service_context.py", line 200, in from_defaults
    embed_model = resolve_embed_model(embed_model)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/irraz/voice_assistant_llm/venv/lib/python3.11/site-packages/llama_index/core/embeddings/utils.py", line 110, in resolve_embed_model
    embed_model = HuggingFaceEmbedding(
                  ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/irraz/voice_assistant_llm/venv/lib/python3.11/site-packages/llama_index/embeddings/huggingface/base.py", line 84, in __init__
    raise ValueError("The model_name argument must be provided.")
ValueError: The model_name argument must be provided.

@sjimnar

sjimnar commented Apr 24, 2024

pip install -r requirements.txt

ERROR: Could not find a version that satisfies the requirement pywin32==306 (from versions: none)
ERROR: No matching distribution found for pywin32==306

I commented out this line in requirements.txt:
#pywin32==306
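As an aside, if the requirements ever get pinned, pywin32 could also carry an environment marker so pip skips it on macOS/Linux. A sketch of the requirements.txt line, assuming pywin32 is only needed on Windows:

```
pywin32==306; sys_platform == "win32"
```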

Now I obtain

pygame 2.5.2 (SDL 2.28.3, Python 3.11.6)
Hello from the pygame community. https://www.pygame.org/contribute.html
Knowledgebase created successfully!
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
- Avoid using `tokenizers` before the fork if possible
- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
_
Customer: Hello, good morning.
AI Assistant: Hello and welcome to Bangalore Kitchen! I'd be happy to help you place your order today. May I have your name, please? And what contact number shall we associate with your order? Once I have that information, could you please let me know what you would like to order from our menu? We have an Indian and English menu to choose from. Lastly, is there anything to drink with your meal? Thank you for choosing Bangalore Kitchen.
_
Traceback (most recent call last):
File "/Users/irraz/voice_assistant_llm/app.py", line 71, in main
if not record_audio_chunk(audio, stream):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/irraz/voice_assistant_llm/app.py", line 26, in record_audio_chunk
data = stream.read(1024)
^^^^^^^^^^^^^^^^^
File "/Users/irraz/voice_assistant_llm/env/lib/python3.11/site-packages/pyaudio/init.py", line 570, in read
return pa.read_stream(self._stream, num_frames,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: [Errno -9981] Input overflowed

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/Users/irraz/voice_assistant_llm/app.py", line 98, in
main()
File "/Users/irraz/voice_assistant_llm/app.py", line 93, in main
stream.stop_stream()
File "/Users/irraz/voice_assistant_llm/env/lib/python3.11/site-packages/pyaudio/init.py", line 500, in stop_stream
pa.stop_stream(self._stream)
OSError: Stream not open
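The last part looks like a PyAudio input overflow rather than a packaging problem. A minimal sketch of the usual workaround, assuming record_audio_chunk in app.py reads the stream directly as shown in the traceback; exception_on_overflow and TOKENIZERS_PARALLELISM are standard PyAudio/tokenizers options, not anything specific to this repo:

```python
import os

# Silence the huggingface/tokenizers fork warning printed above.
os.environ["TOKENIZERS_PARALLELISM"] = "false"

# Inside record_audio_chunk: don't raise OSError -9981 when the input
# buffer overflows; drop the overflowed frames and keep recording instead.
data = stream.read(1024, exception_on_overflow=False)
```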

@swan01

swan01 commented May 19, 2024

Hi, firstly, your video instructions and explanation were easy to follow; well done on that note.

I too ran the 'out of the box' package install with no errors encountered, but when I ran app.py a whole bunch of errors were presented. Any help to resolve them would be most appreciated.

Errors are as follows:

pygame 2.5.2 (SDL 2.28.3, Python 3.12.2)
Hello from the pygame community. https://www.pygame.org/contribute.html
Knowledgebase created successfully!
An error occured while synchronizing the model Systran/faster-whisper-medium.en from the Hugging Face Hub:
An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
Trying to load the model directly from the local cache, if it exists.
Traceback (most recent call last):
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 491, in _make_request
raise new_e
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 467, in _make_request
self._validate_conn(conn)
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 1099, in _validate_conn
conn.connect()
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connection.py", line 653, in connect
sock_and_verified = _ssl_wrap_socket_and_match_hostname(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connection.py", line 806, in ssl_wrap_socket_and_match_hostname
ssl_sock = ssl_wrap_socket(
^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\util\ssl
.py", line 465, in ssl_wrap_socket
ssl_sock = ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\util\ssl
.py", line 509, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 455, in wrap_socket
return self.sslsocket_class._create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1042, in _create
self.do_handshake()
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1320, in do_handshake
self._sslobj.do_handshake()
ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\kashm.venv\Lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 847, in urlopen
retries = retries.increment(
^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\util\retry.py", line 470, in increment
raise reraise(type(error), error, _stacktrace)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\util\util.py", line 38, in reraise
raise value.with_traceback(tb)
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 491, in _make_request
raise new_e
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 467, in _make_request
self._validate_conn(conn)
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connectionpool.py", line 1099, in _validate_conn
conn.connect()
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connection.py", line 653, in connect
sock_and_verified = _ssl_wrap_socket_and_match_hostname(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\connection.py", line 806, in ssl_wrap_socket_and_match_hostname
ssl_sock = ssl_wrap_socket(
^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\util\ssl
.py", line 465, in ssl_wrap_socket
ssl_sock = ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\urllib3\util\ssl
.py", line 509, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 455, in wrap_socket
return self.sslsocket_class._create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1042, in _create
self.do_handshake()
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1320, in do_handshake
self._sslobj.do_handshake()
urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\file_download.py", line 1261, in hf_hub_download
metadata = get_hf_file_metadata(
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\utils_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\file_download.py", line 1667, in get_hf_file_metadata
r = _request_wrapper(
^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\file_download.py", line 385, in _request_wrapper
response = _request_wrapper(
^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\file_download.py", line 408, in _request_wrapper
response = get_session().request(method=method, url=url, **params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\utils_http.py", line 67, in send
return super().send(request, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\requests\adapters.py", line 501, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: (ProtocolError('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None)), '(Request ID: f186fe38-9a3f-4d5a-aa72-b2f94caca2a3)')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "C:\Users\kashm.venv\Lib\site-packages\faster_whisper\utils.py", line 103, in download_model
return huggingface_hub.snapshot_download(repo_id, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\utils_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub_snapshot_download.py", line 308, in snapshot_download
thread_map(
File "C:\Users\kashm.venv\Lib\site-packages\tqdm\contrib\concurrent.py", line 69, in thread_map
return _executor_map(ThreadPoolExecutor, fn, *iterables, **tqdm_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\tqdm\contrib\concurrent.py", line 51, in _executor_map
return list(tqdm_class(ex.map(fn, *iterables, chunksize=chunksize), **kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\tqdm\std.py", line 1169, in iter
for obj in iterable:
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\concurrent\futures_base.py", line 619, in result_iterator
yield _result_or_cancel(fs.pop())
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\concurrent\futures_base.py", line 317, in _result_or_cancel
return fut.result(timeout)
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\concurrent\futures_base.py", line 456, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\concurrent\futures_base.py", line 401, in __get_result
raise self._exception
File "C:\Users\kashm\AppData\Local\Programs\Python\Python312\Lib\concurrent\futures\thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub_snapshot_download.py", line 283, in _inner_hf_hub_download
return hf_hub_download(
^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\utils_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\file_download.py", line 1406, in hf_hub_download
raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "c:\Users\kashm\voice_assistant_llm\app.py", line 100, in
main()
File "c:\Users\kashm\voice_assistant_llm\app.py", line 59, in main
model = WhisperModel(model_size, device="cpu", compute_type="float16", num_workers=10)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\faster_whisper\transcribe.py", line 127, in init
model_path = download_model(
^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\faster_whisper\utils.py", line 119, in download_model
return huggingface_hub.snapshot_download(repo_id, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub\utils_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "C:\Users\kashm.venv\Lib\site-packages\huggingface_hub_snapshot_download.py", line 235, in snapshot_download
raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: Cannot find an appropriate cached snapshot folder for the specified revision on the local disk and outgoing traffic has been disabled. To enable repo look-ups and downloads online, pass 'local_files_only=False' as input.

PS C:\Users\kashm\voice_assistant_llm>
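The root cause here seems to be the Hugging Face Hub download being cut off, so faster-whisper never gets a cached copy of Systran/faster-whisper-medium.en. A minimal sketch of a workaround, assuming the connection problem is transient: pre-download the snapshot once, then point WhisperModel at the local folder. The compute_type below is an assumption (float16 is generally not supported on CPU, so int8 is used); the exact settings in app.py may differ:

```python
from huggingface_hub import snapshot_download
from faster_whisper import WhisperModel

# Download (or reuse) the model files in the local HF cache and get their path.
model_dir = snapshot_download("Systran/faster-whisper-medium.en")

# Load from the local folder so no Hub access is needed at run time.
model = WhisperModel(model_dir, device="cpu", compute_type="int8")
```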
