In order to allow running more experiments of the same model safely in parallel, we need to implement lockfiles. These files would carry all the fields that check_params needs in order to detect and block concurrent experiments with the same parameters. One consideration to keep in mind: the start times of the jobs in the array should be somewhat randomized, since there is potential for collision when several jobs try to create the same lockfile at once; however, I have yet to find a simple option for this in the SLURM documentation.
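A minimal sketch of what such a lockfile acquisition could look like, assuming the parameters are a JSON-serializable dict and that the shared filesystem honors exclusive file creation (the function and file names here are illustrative, not from the codebase):

```python
import json
import os
import random
import time


def try_acquire_lock(lock_path, params):
    """Atomically create a lockfile holding the experiment parameters.

    Returns True if the lock was acquired, False if another job already
    holds it. O_CREAT | O_EXCL guarantees that exactly one process can
    create the file, which is what makes the duplicate check safe across
    concurrently starting array jobs.
    """
    try:
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False
    with os.fdopen(fd, "w") as f:
        # Persist the fields check_params would read to detect duplicates.
        json.dump(params, f)
    return True


def stagger_start(max_delay=5.0):
    """Sleep a random amount before the first lock attempt.

    This is the randomized-start idea from above: it shrinks the window
    in which many array tasks race on the same lockfile at once.
    """
    time.sleep(random.uniform(0.0, max_delay))
```

Since the exclusive creation is already atomic, the stagger is not needed for correctness, only to spread out filesystem load and reduce the number of jobs that start, lose the race, and exit immediately.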
The lockfiles can either be deleted on failure or kept. Both cases are easily supported in evaluation, which throws out samples without scores. The choice is probably method-dependent, as some methods are more deterministic than others.
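The two cleanup policies described above could be captured in a small release helper; this is a sketch under the same assumptions as before (hypothetical names, lockfile is a plain file on a shared filesystem):

```python
import os


def release_lock(lock_path, run_succeeded, keep_on_failure=True):
    """Delete or keep the lockfile after a run finishes.

    On success the lockfile is always removed so the parameters can be
    rerun later. On failure, keep_on_failure=True leaves the lockfile in
    place, marking the parameters as attempted (evaluation then drops the
    sample for lack of a score); keep_on_failure=False deletes it so a
    retry is possible, which suits less deterministic methods.
    """
    if run_succeeded or not keep_on_failure:
        os.remove(lock_path)
```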