How does one make sure that the Python submission script is run from the right place with Slurm?

Posted 2019-07-10 03:15

I have a Python submission script that I run with sbatch under Slurm:

sbatch batch.py

When I do this, things do not work properly because, I assume, the batch.py process does not inherit the right environment variables. Thus, instead of running batch.py from the directory where the sbatch command was issued, it is run from somewhere else (/, I believe). I have managed to work around this by wrapping the Python script in a bash script:

#!/usr/bin/env bash
cd path/to/scripts
python script.py

This temporary hack sort of works, but it seems to avoid the question altogether rather than address it. Does anyone know how to fix this in a better way?

I know, for example, that Docker has the -w flag (and the WORKDIR instruction) so that the container knows which directory it is supposed to run in. I was wondering whether something like that exists for Slurm.
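For reference, sbatch does appear to offer a comparable option: -D / --chdir on recent Slurm releases (--workdir on older ones) sets the working directory of the batch script before it is executed. A minimal sketch of a Python submission script using it, with the path as a placeholder:

#!/usr/bin/env python
#SBATCH --job-name=workdir_demo
#SBATCH --chdir=/path/to/scripts

# Sketch only: --chdir (also -D) is roughly the Slurm analogue of Docker's
# -w/WORKDIR; /path/to/scripts above is a placeholder, not a real path.
import os

# When the job starts, the working directory should be the --chdir target,
# not the Slurm spool directory that holds the copied script.
print(os.getcwd())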

1 Answer

劫难
2019-07-10 04:02

Slurm is designed to push the user's environment at submit time to the job, except for variables explicitly disabled by the user or the system administrator.
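To illustrate, the propagation can be controlled with sbatch's --export option (option names per the sbatch documentation); a minimal sketch, where MYVAR is a made-up variable name:

#!/usr/bin/env python
#SBATCH --job-name=env_demo
#SBATCH --export=NONE

import os

# With --export=NONE, submit-time variables such as the made-up MYVAR below
# are not pushed into the job; the default --export=ALL would propagate them.
print(os.environ.get("MYVAR", "MYVAR was not propagated"))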

But the script itself is run as follows: it is copied to the master node of the allocation, into a Slurm-specific spool directory, and executed from there, with $PWD set to the directory where the sbatch command was run.

You can see that with a simple script like this one:

$ cat t.sh
#!/bin/bash
#
#SBATCH --job-name=test_ms
#SBATCH --output=res_ms.txt

echo $PWD
dirname $(readlink -f "$0")

$ sbatch t.sh
Submitted batch job 1109631
$ cat res_ms.txt
/home/damienfrancois/
/var/spool/slurm/job1109631

One consequence is that Python scripts submitted this way cannot import modules that sit next to them in the submission directory: Python adds the directory of the script it actually executes (here, the Slurm spool directory) to sys.path, not the current working directory. The workaround is to explicitly add sys.path.append(os.getcwd()) before the failing imports.
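A minimal sketch of what the submitted Python script itself might then look like (mymodule is a hypothetical module sitting next to the script in the submission directory):

#!/usr/bin/env python
#SBATCH --job-name=test_py
#SBATCH --output=res_py.txt

import os
import sys

# The copy of this script executes from the Slurm spool directory
# (e.g. /var/spool/slurm/job<id>), so its own directory is not the submit
# directory; $PWD still is, so put it back on the module search path:
sys.path.append(os.getcwd())

import mymodule  # hypothetical module living next to the submitted script

print(mymodule.__file__)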
