
[ISSUE]: libcudnn.so error - fixed by installing cudnn package (Linux) #197

Open
4 of 6 tasks
jamnox opened this issue Feb 7, 2025 · 0 comments
jamnox commented Feb 7, 2025

Voice Changer Version

(GIT PULL)

Operating System

Linux (Arch Linux AND Alpine Linux)

GPU

NVIDIA GeForce RTX 3050 12GB

Read carefully and check the options

  • I've tried to Clear Settings
  • Sample/Default Models are working
  • I've tried to change the Chunk Size
  • GUI was successfully launched
  • I've read the tutorial
  • I've tried to extract to another folder (or re-extract) the .zip file

Model Type

MMVC, RVC

Issue Description

Hi,

Running the voice changer on Linux without first installing the cuDNN package (the package name depends on the distribution) causes the error below. Installing the CUDA toolkit alone was not enough for me; the error persisted until cuDNN was installed as well, since CUDA and cuDNN are shipped as separate packages. I opened this issue in case anyone else hits the same error after installing only the CUDA toolkit, to save them some time.
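For anyone verifying the fix, a minimal check of the onnxruntime side might look like the sketch below. This is not part of the voice changer; it only reports which execution providers the installed onnxruntime build ships, and the CUDA provider can still fail to load at session creation if cuDNN is missing (see the logs below).

```python
# Sketch: ask the installed onnxruntime build which execution providers it ships.
# This only reflects how the wheel was built; the CUDA provider can still fail
# to load later if cuDNN is not on the dynamic loader path.
import onnxruntime as ort

print("onnxruntime version:", ort.__version__)
print("Providers in this build:", ort.get_available_providers())
```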

Application Screenshot

No response

Logs on console

2025-02-07 13:51:11.817903475 [E:onnxruntime:Default, provider_bridge_ort.cc:1862 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1539 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcudnn_adv.so.9: cannot open shared object file: No such file or directory

2025-02-07 13:51:11.817924805 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:993 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Require cuDNN 9.* and CUDA 12.*. Please install all dependencies as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

2025-02-07 13:51:11,930 INFO [PitchExtractorManager] Loading pitch extractor rmvpe_onnx

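A quick way to check whether the library named in the error is now resolvable is sketched below with ctypes. It is just a loader test, not anything the voice changer itself runs.

```python
# Sketch: try to dlopen the exact library named in the error above.
# ctypes raises OSError if the dynamic loader still cannot find it,
# which usually means the distribution's cuDNN 9 package is not
# installed or its directory is not on the library search path.
import ctypes

try:
    ctypes.CDLL("libcudnn_adv.so.9")
    print("libcudnn_adv.so.9 found - cuDNN 9 is on the loader path.")
except OSError as exc:
    print("Still missing:", exc)
```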