Is your feature request related to a problem? Please describe.
In one of our processing scripts, we currently perform inference using a MONAI model, and set `torch.device` to `'cuda'` to run inference on the GPU. We use `device` both when loading the model's weights and for the `sw_device` argument of the `sliding_window_inference` function.
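For context, the relevant part of the script looks roughly like the sketch below. The network, input volume, and checkpoint path are placeholders rather than our actual code, and in the real script `device` is set to `"cuda"` unconditionally:

```python
import torch
from monai.inferers import sliding_window_inference
from monai.networks.nets import UNet  # stand-in network; our real model differs

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = UNet(
    spatial_dims=3, in_channels=1, out_channels=2,
    channels=(16, 32, 64), strides=(2, 2),
).to(device)
# In the real script the weights come from a checkpoint, loaded onto the same device, e.g.:
# model.load_state_dict(torch.load("model.pt", map_location=device))
model.eval()

image_tensor = torch.rand(1, 1, 128, 128, 128)  # stand-in input volume

with torch.no_grad():
    output = sliding_window_inference(
        inputs=image_tensor,
        roi_size=(96, 96, 96),
        sw_batch_size=4,
        predictor=model,
        sw_device=device,  # each window is sent to this device for prediction
        device=device,     # device used for the stitched output
    )
```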
Initially, GPU usage was working OK, I think. But later, it seems we had forgotten to install the necessary optional dependencies, because we ran into the following error:
Naturally, the first thought is to `pip install cupy`, but this fails because the actual package is called `cupy-cuda12x`.
Looking at the list of potential "extras" options, it's not entirely clear how to pick the correct extra to solve this issue. We started with `monai[all]` just to troubleshoot, and it does fix the problem, but it seems like overkill, installing dependencies we do not actually need. With some guesswork, we learned that `monai[cucim]` will install `cupy`. But even then, it is not 100% clear from the documentation alone whether this installs all of the dependencies needed to perform inference without error.
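As an aside, one quick way to check whether an extra actually pulled in `cupy` is MONAI's own `optional_import` helper. The snippet below is just a sketch of that check, not something taken from the docs:

```python
# Sketch: check whether the optional cupy dependency resolved,
# e.g. after `pip install "monai[cucim]"`.
from monai.utils import optional_import

cupy, has_cupy = optional_import("cupy")
print("cupy importable:", has_cupy)
```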
Describe the solution you'd like
I think it would make sense if the installation section had an "installing MONAI for GPU usage" subsection that explains that, when `is_cuda` is true, MONAI will try to use CuPy, and thus [insert blank] dependencies are necessary (`monai[cucim]`)?
It might also help to have an extra specifically called `monai[cuda]` to make it more discoverable/intuitive. But if this turns out to just be an alias for `monai[cucim]`, then maybe it's not necessary.
Thank you!
Ah, I see now that there is a small section about CUDA that reads:
The installation commands below usually end up installing the CPU variant of PyTorch. To install GPU-enabled PyTorch:
1. Install the latest NVIDIA driver.
2. Check the PyTorch Official Guide for the recommended CUDA versions. For the pip package, the user needs to download CUDA manually, install it on the system, and ensure CUDA_PATH is set properly.
3. Continue to follow the guide and install PyTorch.
4. Install MONAI using one of the ways described below.
In that case I would suggest giving this checklist its own header, and potentially mentioning cupy here if necessary; a short verification snippet like the sketch below could accompany it.
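Something along these lines (a sketch, not an official check) would confirm both that the installed PyTorch build is CUDA-enabled and that cupy is importable:

```python
# Sketch: verify a GPU-enabled PyTorch build and the optional cupy dependency.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("built against CUDA:", torch.version.cuda)

try:
    import cupy
    print("cupy version:", cupy.__version__)
except ImportError:
    print('cupy is not installed (e.g. via `pip install "monai[cucim]"`)')
```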
Also, for reference, the relevant requirement specifier with its environment marker appears to be:

`cucim-cu12; platform_system == "Linux" and python_version >= "3.9" and python_version <= "3.10"`

It turns out that we are using Python 3.9 on Linux, so maybe this is only a corner case for our specific purposes, and Python 3.11 and above do not require this?
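For what it's worth, the marker can be evaluated for a given interpreter with the `packaging` library; the sketch below (not MONAI code) shows why Python 3.11 would not receive `cucim-cu12` from this line:

```python
# Sketch: evaluate the environment marker that gates the cucim-cu12 requirement.
from packaging.markers import Marker

marker = Marker(
    'platform_system == "Linux" and python_version >= "3.9" and python_version <= "3.10"'
)

# True on Linux with Python 3.9 or 3.10; False on e.g. Python 3.11,
# so pip skips the requirement there.
print(marker.evaluate())
```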