I followed the instructions in the README file here. This error didn't occur with either Llama 3-8b-chat or Llama 3.2 3b chat. The binaries were generated for both of those versions, but not for the Llama 3.1 8b chat quantized version.
After entering the following command, I got the error pasted below:
Traceback (most recent call last):
  File "/home/apps/miniconda3/envs/llm_on_genie/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/apps/miniconda3/envs/llm_on_genie/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/apps/miniconda3/envs/llm_on_genie/lib/python3.10/site-packages/qai_hub_models/models/llama_v3_1_8b_chat_quantized/export.py", line 57, in <module>
    main()
  File "/home/apps/miniconda3/envs/llm_on_genie/lib/python3.10/site-packages/qai_hub_models/models/llama_v3_1_8b_chat_quantized/export.py", line 34, in main
    parser = export_parser(
TypeError: export_parser() got an unexpected keyword argument 'supports_precompiled_qnn_onnx'
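A TypeError like this typically means the export script is passing a keyword that the installed version of export_parser does not accept, which would suggest a version mismatch within the installed qai_hub_models package. One quick diagnostic is to check the function's signature with Python's standard inspect module. The sketch below shows the general technique; accepts_kwarg is a hypothetical helper name, and stdlib functions stand in for export_parser since the exact environment isn't reproduced here.

```python
import inspect


def accepts_kwarg(func, name):
    """Return True if `func` can be called with keyword argument `name`."""
    params = inspect.signature(func).parameters
    # Accepted if the parameter is named explicitly, or if the function
    # takes arbitrary keyword arguments (**kwargs).
    return name in params or any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )


# Stand-in demonstration with stdlib functions; in the actual environment
# you would import export_parser from qai_hub_models and check
# accepts_kwarg(export_parser, "supports_precompiled_qnn_onnx").
print(accepts_kwarg(sorted, "key"))      # True: sorted() accepts key=
print(accepts_kwarg(len, "reverse"))     # False: len() takes no such keyword
```

If the check comes back False in your environment, upgrading qai_hub_models (or reinstalling it so the model package and the core library come from the same release) would be the likely fix.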
Let me know if you need additional information.