Using prfanalyze-base to implement new analysis toolbox #70
Hi Niklas, I had a similar problem when upgrading to Python 3. We are working on the upgrade right now and will go on from there. I am updating the base Linux machine, Matlab to 2020b, and Python to 3; once I have that I will ask you to try again and we can troubleshoot from there.
Regarding the base container: if your implementation is Python based, you could duplicate the popeye gear, substitute your code there, and keep the rest.
https://github.com/vistalab/PRFmodel/tree/master/gear/prfanalyze/popeye
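Concretely, that duplication step could look something like the sketch below. The `new_gear` helper and the name "mytool" are placeholders for illustration, not part of PRFmodel:

```shell
# Sketch: start a new gear by copying the popeye gear and keeping
# everything except the solver. "new_gear" is a hypothetical helper.
new_gear() {
  local gears_dir="$1"   # e.g. PRFmodel/gear/prfanalyze
  local name="$2"        # name of the new toolbox, e.g. "mytool"
  cp -r "$gears_dir/popeye" "$gears_dir/$name"
  echo "created $gears_dir/$name"
}

# After copying, edit the new directory's Dockerfile and replace
# solve.sh so that it calls your own analysis code; the rest of the
# base machinery (run.py, run.sh) stays as is.
```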
I will answer back here once the update is done,
thanks!
Gari
On Mon, Jan 10, 2022 at 3:41 PM Niklas Mueller wrote:
Hello! Your pRF validation framework is a great step towards being able to compare multiple toolboxes designed for the same or similar purposes, and it enables us to validate the underlying implementations. For exactly this purpose I am trying to integrate our own pRF analysis toolbox into your validation framework, and I am running into a couple of problems. The following steps explain the present scenario:
1. A synthetic dataset has been created with the prfsynth docker image, using the default settings.
2. I duplicated the structure of the other prfanalyze toolboxes in order to create my own integration into the validation framework.
3. I adjusted the Dockerfile to include the necessary scripts and to create a conda environment with Python 3 so that our toolbox can run. This was initially also a problem because Python 2 is installed in the base image; however, we managed to circumvent it.
4. I implemented the calling of our analysis script the same way as already done for the other toolboxes (i.e. using solve.sh, which in turn calls our analysis Python script) and validated that the script runs and does what it is supposed to do (mainly, going from a BIDS data structure to a BIDS data structure, creating a range of output files).
5. By downloading the actual code base for the prfmodel <https://github.com/vistalab/prfmodel> and using the following command I managed to start the analysis. However, this does not achieve the direct integration into the validation framework:
   ./PRFmodel/gear/prfanalyze/run_prfanalyze.sh prfpy $basedir prfanalyze-prfpy/default_config.json
Given the above scenario, the following problems arise:
1. Using the proposed call from the wiki to create the default config file,
   docker run --rm -it \
     -v $basedir/empty:/flywheel/v0/input:ro \
     -v $basedir:/flywheel/v0/output \
     garikoitz/prfanalyze-prfpy:latest
   the following output is produced:
   [garikoitz/prfanalyze] No config file found. Writing default JSON file and exiting.
   cp: cannot create regular file '/flywheel/v0/input/config.json': Read-only file system
   That is, the config file that is indeed contained in the built docker image (which can be verified by starting the image in debug mode) is being copied into the mounted input directory, which is read-only, when it should instead be copied into the output directory. This problem does not arise for the other toolboxes, and looking at their Dockerfiles there are no extra steps needed to make the process work. Can you verify that this is indeed the case?
2. From the above arises the next problem. While trying to understand the scripts responsible for setting up the analysis (namely the run.py and run.sh scripts from the prfanalyze-base image), I wanted to debug this by changing the code. However, when trying to build the prfanalyze-base image on my machine so that I can run it with local changes, I am running into version conflicts (build_output.txt <https://github.com/vistalab/PRFmodel/files/7839539/build_output.txt>).
3. Lastly and unfortunately, I was not able to find comprehensive documentation about how the integration of new toolboxes into the validation framework is supposed to work. As this possibility is stated in your paper showing the usefulness and results of the framework, it would be a great help to have some guidance on how to do this.
The repository for our code base can be found here <https://github.com/niklas-mueller/prfanalyze-prfpy>.
Thank you in advance; I am very much looking forward to hearing back from you.
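For what it is worth, I would expect the failing copy step to fall back to the output directory when the input mount is not writable. The function below is only an illustration of that expected behavior, not the actual prfanalyze-base run.sh code:

```shell
# Illustration (not the actual prfanalyze-base script): copy the
# default config into the input directory if it is writable, and
# fall back to the (writable) output mount otherwise.
copy_default_config() {
  local default_cfg="$1" input_dir="$2" output_dir="$3"
  if [ -d "$input_dir" ] && [ -w "$input_dir" ]; then
    cp "$default_cfg" "$input_dir/config.json"
  else
    cp "$default_cfg" "$output_dir/config.json"
  fi
}
```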
Thanks for the quick reply! The upgrade sounds good; looking forward to that.
This is exactly what I did, and why I was expecting the base functionality to work. However, I then run into the described problems when mounting the directories.
Ah! OK, we will look into that too.
Hi Niklas,
I updated prfanalyze-base (to 3.0.1) and prfanalyze-vista (to 2.1.1), and pushed the containers to Docker Hub. Both now have Matlab 2020b and Python 3 via conda. I tested with a new dataset in both the old working containers and in the new one, and it is working fine.
The PRFmodel repository on GitHub has been merged with the latest code and tagged 3.0.0 (it is the latest version in master right now, so a git pull from what you have is enough).
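For an existing clone, picking up the update should amount to something like the following (assuming the clone lives in ./PRFmodel; the tag name is taken from the message above):

```shell
# Update an existing PRFmodel checkout to the newly tagged release.
git -C PRFmodel fetch --tags     # make sure tag 3.0.0 is known locally
git -C PRFmodel pull             # master currently points at the release
git -C PRFmodel describe --tags  # should report 3.0.0
```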
Do you want to start with the same tests you had and see where we are now?
I will be able to help from there I think,
Thanks for your patience!!
Gari