[BUG] llm_on_genie install instruction #24
Please also add instructions to first log in to Hugging Face and provide an access token on the command line, and then paste a token that you have created on Hugging Face when asked.
System used: Python v3.10 (AMD64); swap can be expanded to 80 GB (to ensure everything fits in memory).
cp ai-hub-apps/tutorials/llm_on_genie/configs/htp/htp_backend_ext_config.json.template genie_bundle/htp_backend_ext_config.json is not a Windows CMD command (and the path separators are different), so please extend the instructions for Snapdragon X Elite users.
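For Snapdragon X Elite users, a sketch of the equivalent copy step; the POSIX form is the tutorial's, and the CMD and PowerShell variants are shown as comments (paths assume the tutorial's checkout layout — adjust to your own):

```shell
# POSIX / Git Bash (as in the tutorial):
mkdir -p genie_bundle
cp ai-hub-apps/tutorials/llm_on_genie/configs/htp/htp_backend_ext_config.json.template \
   genie_bundle/htp_backend_ext_config.json
# Windows CMD equivalent (note the backslashes):
#   copy ai-hub-apps\tutorials\llm_on_genie\configs\htp\htp_backend_ext_config.json.template genie_bundle\htp_backend_ext_config.json
# PowerShell equivalent:
#   Copy-Item ai-hub-apps\tutorials\llm_on_genie\configs\htp\htp_backend_ext_config.json.template genie_bundle\htp_backend_ext_config.json
```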
Why do I need the tokenizer from https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct/tree/main? How does it differ from the one at https://huggingface.co/meta-llama/Llama-3.2-3B/tree/main for llama-v3-2-3b-chat-quantized?
For some reason $QNN_SDK_ROOT is not automatically defined, also not in PowerShell. I just learned that you assume PowerShell rather than CMD ;) So it might be nice to mention that C:\Qualcomm\AIStack\QAIRT\2.28.2.241116\ is the root folder, since it is not under the default Program Files or Microsoft SDK folders.
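A sketch of setting QNN_SDK_ROOT by hand; the version folder is taken from the comment above, so substitute your installed QAIRT release:

```shell
# Git Bash / WSL:
export QNN_SDK_ROOT="/c/Qualcomm/AIStack/QAIRT/2.28.2.241116"
# PowerShell:
#   $env:QNN_SDK_ROOT = "C:\Qualcomm\AIStack\QAIRT\2.28.2.241116"
# CMD:
#   set QNN_SDK_ROOT=C:\Qualcomm\AIStack\QAIRT\2.28.2.241116
echo "QNN_SDK_ROOT=$QNN_SDK_ROOT"
```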
When compiling the ChatApp, I get the following error (Visual Studio Error List: Severity / Code / Description / Project / File / Line / Suppression State / Details): … While … gave the correct response and utilized the NPU, using 3.2 GB of memory for the 3B model of Llama 3.2. It produces the same answer when you start with the same prompt.
Thanks for the feedback @BrickDesignerNL on #24 (comment). Possibly the path is not set correctly, leading to the Genie header files not being found. Could you please check whether the following file exists?
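One quick way to check is listing the Genie headers shipped with the SDK. This assumes the headers live under the SDK's include/Genie directory, which may vary by QAIRT release:

```shell
# GenieCommon.h should appear in this listing if QNN_SDK_ROOT points at the SDK root
ls "$QNN_SDK_ROOT/include/Genie/"
```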
@bhushan23 Thank you! I have followed the instructions on … There it says
Indeed, GenieCommon.h exists. I've (now) copied the headers to the main folder of ChatApp, as I assume that is what should be done. I now get:
This file can be found in … Putting that one into the root of ChatApp doesn't solve this.
@bhushan23 / @mestrona-3 Do you have a tip on how to fix the last issue?
That one is easy: if you have configured QNN_SDK_ROOT, update the VC++ Directories setting.
https://github.com/quic/ai-hub-apps/tree/main/tutorials/llm_on_genie
After:
python3.10 -m venv llm_on_genie_venv
it says
source llm_on_genie_venv/bin/activate
but on Windows it's typically
llm_on_genie_venv\Scripts\activate
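Putting both variants together as a sketch (python3.10 as in the tutorial — substitute your interpreter):

```shell
python3.10 -m venv llm_on_genie_venv
# Linux/macOS:
source llm_on_genie_venv/bin/activate
# Windows (CMD or PowerShell):
#   llm_on_genie_venv\Scripts\activate
python -c "import sys; print(sys.prefix)"   # should print a path inside llm_on_genie_venv
```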