When I run the example script in the README, the following error occurs (see the attached screenshot of the error message):
I have been running into the same issue (Python 3.10.6 x64)
Someone gave me a suggestion. Would you like to give it a try? I can only test it myself after I get back home, LOL.
Can we just convert the model to QNN format locally? It's practically impossible to upload an FP32 8B Llama model to the server without hitting a network error.