I'm curious what the peak memory consumption for an input file is. If a BOLD file is 1 GB, will the peak memory be 2 GB, 5 GB, or 10 GB? Are there large variables being held in memory longer than they need to be?
This is a great question, and I would consider it an upstream issue in nilearn.
See this relevant issue on memory profiling of the masker: nilearn/nilearn#3399
The memory consumption will depend on:

- the size of the input file
- the atlas type (discrete segmentation or probabilistic)
- whether the masker object performs resampling
I consider this an upstream issue, as the current software depends on the nilearn masker object.
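For anyone who wants to measure this concretely, below is a minimal sketch (not part of this repository) of how peak memory of a masker call could be profiled with the `memory_profiler` package. The BOLD file path is a placeholder, and the Schaefer atlas is just one example of a deterministic atlas; swap in whatever atlas and masker configuration your pipeline actually uses.

```python
from memory_profiler import memory_usage  # assumes memory_profiler is installed
from nilearn.datasets import fetch_atlas_schaefer_2018
from nilearn.maskers import NiftiLabelsMasker

bold_file = "sub-01_task-rest_desc-preproc_bold.nii.gz"  # placeholder path
atlas = fetch_atlas_schaefer_2018(n_rois=400)  # example deterministic atlas

def extract_timeseries():
    # resampling_target="data" resamples the atlas to the BOLD image;
    # resampling is one of the factors listed above that changes the footprint
    masker = NiftiLabelsMasker(labels_img=atlas.maps, resampling_target="data")
    return masker.fit_transform(bold_file)

# memory_usage runs the callable and samples process RSS; with max_usage=True
# it reports the peak (in MiB; older versions return a one-element list)
peak_mib = memory_usage((extract_timeseries, (), {}), max_usage=True, interval=0.1)
print(f"Peak memory while extracting time series: {peak_mib} MiB")
```

Comparing this peak against the on-disk size of the BOLD file would give a rough answer to the 2x/5x/10x question for a given atlas and resampling setting.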