ensemble

Demonstrate running Python code within a conda env, as a Slurm job array.

  • Multiple instances of the same code are run with different parameters.
  • Each run differs only in the value of its "task ID".
  • The task ID can be used to select a predefined parameter set, seed random number generation, or in other ways (see the sketch below).
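For illustration, here is a minimal sketch of how example.py might map a task ID to parameters. The parameter names and values below are invented for the example and are not taken from the actual script.

import argparse
import random

# Hypothetical parameter sets; replace with your own values.
PARAMETER_SETS = {
    1: {"learning_rate": 0.01},
    2: {"learning_rate": 0.05},
    3: {"learning_rate": 0.10},
}

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--task_id", type=int, required=True)
    args = parser.parse_args()

    # Use the task ID to pick a predefined parameter set...
    params = PARAMETER_SETS[args.task_id]

    # ...and/or to seed the random number generator for reproducibility.
    random.seed(args.task_id)

    print(f"Task {args.task_id} running with parameters: {params}")

if __name__ == "__main__":
    main()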

To create the conda environment

conda env create -f environment.yaml
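A minimal environment.yaml could look like the following. The environment name must match the one activated below ("ensemble"); the channel and Python version here are assumptions, and the real file may list additional packages.

name: ensemble
channels:
  - conda-forge
dependencies:
  - python=3.11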

To test a single run in the foreground

source activate ensemble
python example.py --task_id 1

To run as a job array on a Slurm cluster

sbatch job_script.sh
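A sketch of what job_script.sh typically contains follows; the array size, job name, time limit, and output pattern are placeholders, while the "standby" account and the activate/run commands come from this README.

#!/bin/bash
#SBATCH --job-name=ensemble
#SBATCH --account=standby        # Slurm account (queue); change as needed (see Notes)
#SBATCH --array=1-10             # creates sub-jobs with task IDs 1..10
#SBATCH --time=00:10:00
#SBATCH --output=slurm-%A_%a.out

# Load the conda environment by name, then run the Python script,
# passing this sub-job's task ID on the command line.
source activate ensemble
python example.py --task_id "$SLURM_ARRAY_TASK_ID"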

Notes

  1. The job script uses special #SBATCH comments to set up the Slurm job array.
  2. When submitted to Slurm, multiple sub-jobs are created, each with a different value of $SLURM_ARRAY_TASK_ID.
  3. When each sub-job runs, the job script loads our conda environment by name.
  4. The job script then runs our Python script, passing the task ID value as a command line parameter.
  5. The Python code accepts the task ID on the command line and uses it to decide which of its own internal parameter values to use.
  6. The job script sets the Slurm "account" (queue) to "standby". Change this as needed for your cluster.
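After submitting, the state of the sub-jobs can be checked with standard Slurm commands, for example:

squeue -u $USER

By default, Slurm writes each sub-job's output to a file named like slurm-<jobID>_<taskID>.out in the submission directory.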
