This repository has been archived by the owner on Jan 28, 2025. It is now read-only.
I've noticed that regardless of the batch size I set or any other changes I make, my GPU memory usage stays fixed at 2064 MB. I'm using an RTX 4090, so this shouldn't be a hardware limitation. Could a specific setting or limitation in the model or framework be causing this? Any insights would be appreciated!
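One way to sanity-check this is to compare the memory you *expect* a batch to occupy against what the driver actually reports (e.g. via `nvidia-smi`, or `torch.cuda.memory_allocated()` if you are on PyTorch). The sketch below is a rough estimator under assumed conditions: float32 inputs with a hypothetical shape of `(3, 224, 224)`; substitute your real input shape and dtype. If the reported usage stays flat while this estimate grows with batch size, the batches are likely never being moved to the GPU, or a fixed allocation (e.g. a framework's context/workspace) is all that is being measured.

```python
def batch_memory_mb(batch_size, shape=(3, 224, 224), dtype_bytes=4):
    """Estimate the memory one input batch occupies, in MiB.

    shape and dtype_bytes are illustrative assumptions: a 3x224x224
    float32 image tensor (4 bytes per element). Replace with your
    model's actual input shape and dtype.
    """
    elements = batch_size
    for dim in shape:
        elements *= dim
    return elements * dtype_bytes / 1024**2

# Expected input memory should scale linearly with batch size;
# if nvidia-smi shows no change, the data may not be on the GPU.
for bs in (1, 8, 32):
    print(f"batch_size={bs}: ~{batch_memory_mb(bs):.1f} MiB of inputs")
```

Note that this only accounts for the input tensors; weights, activations, and gradients add more, but the key signal is whether usage moves at all when the batch size changes.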
Kaye-Sum changed the title from "why does my GPU memory usage always stay at 2064MB?" to "Why does my GPU memory usage always stay at 2064MB?" on Nov 6, 2024.