Start a container using the command below:
```bash
docker run -ti \
    -p 8888:8888 \
    elodiegermani/open_pipeline
```
To this command, you need to add volumes that link the container with your local files (the original dataset and the Git repository). If you stored the original dataset in `data/original` inside the repository, a single volume with the `narps_open_pipelines` directory is enough:
```bash
docker run -ti \
    -p 8888:8888 \
    -v /Users/egermani/Documents/narps_open_pipelines:/home/ \
    elodiegermani/open_pipeline
```
If the dataset is stored elsewhere, add a second volume with the path to your dataset directory:
```bash
docker run -ti \
    -p 8888:8888 \
    -v /Users/egermani/Documents/narps_open_pipelines:/home/ \
    -v /Users/egermani/Documents/data/NARPS/:/data/ \
    elodiegermani/open_pipeline
```
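Each `-v host_path:container_path` flag maps a directory on your machine to a path inside the container. As a sketch (the paths below are just the examples from above; substitute your own), the full command can be assembled from those host-to-container pairs:

```python
# Sketch: building the `docker run` command from host -> container mount
# pairs. The paths are examples only; replace them with your own.
mounts = {
    "/Users/egermani/Documents/narps_open_pipelines": "/home/",
    "/Users/egermani/Documents/data/NARPS/": "/data/",
}

volume_flags = [f"-v {host}:{container}" for host, container in mounts.items()]
command = " ".join(
    ["docker run -ti", "-p 8888:8888", *volume_flags, "elodiegermani/open_pipeline"]
)
print(command)
```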
After that, your container is up and running!

To restart an existing container:
```bash
docker start [name_of_the_container]
```
To list running containers (and find the name of yours):
```bash
docker ps
```
To open a shell inside the running container:
```bash
docker exec -ti [name_of_the_container] bash
```
or attach directly to its main process:
```bash
docker attach [name_of_the_container]
```
Once inside the container, activate the `neuro` conda environment and launch Jupyter:
```bash
source activate neuro
jupyter notebook --port=8888 --no-browser --ip=0.0.0.0
```
If you have not used the container for a while, verify that it still exists:
```bash
docker ps -l
```
If your container appears in the list, start it with:
```bash
docker start [name_of_the_container]
```
Otherwise, relaunch it from the image:
```bash
docker run -ti \
    -p 8888:8888 \
    -v /home/egermani:/home \
    [name_of_the_image]
```
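If you script this check, `docker ps -a --format '{{.Names}}'` prints one container name per line; a small hypothetical helper (not part of the original instructions) can then decide between `docker start` and `docker run`:

```python
def container_exists(name, ps_output):
    """Return True if `name` appears in the captured output of
    `docker ps -a --format '{{.Names}}'` (one container name per line)."""
    return name in ps_output.splitlines()

# Example with canned output, as if captured from `docker ps -a`:
sample = "open_pipeline\nsome_other_container\n"
print(container_exists("open_pipeline", sample))  # True -> `docker start`
print(container_exists("old_container", sample))  # False -> `docker run`
```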
To use SPM12 (standalone, running on the MATLAB Compiler Runtime) inside the container, add the following at the beginning of your Python scripts:

```python
from nipype.interfaces import spm

# Point Nipype at the standalone SPM12 launcher and the MATLAB Compiler Runtime
matlab_cmd = '/opt/spm12-r7771/run_spm12.sh /opt/matlabmcr-2010a/v713/ script'
spm.SPMCommand.set_mlab_paths(matlab_cmd=matlab_cmd, use_mcr=True)
```
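For reference, the `matlab_cmd` string is just the standalone SPM12 launcher script, followed by the MCR root directory and the literal word `script`. Assembling it from its parts (the versions are those shipped in the `elodiegermani/open_pipeline` image) makes the pieces explicit:

```python
# Sketch: the matlab_cmd string decomposed into its parts. The paths and
# versions are those baked into the elodiegermani/open_pipeline image.
spm_launcher = "/opt/spm12-r7771/run_spm12.sh"  # standalone SPM12 entry point
mcr_root = "/opt/matlabmcr-2010a/v713/"         # MATLAB Compiler Runtime root
matlab_cmd = f"{spm_launcher} {mcr_root} script"
print(matlab_cmd)
```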