@dutta-alankar · Created July 3, 2025 07:37
Slurm script for Freya on MPCDF
#!/bin/bash
# Standard output and error:
#SBATCH -o ./tjob.%x.out.%j
#SBATCH -e ./tjob.%x.err.%j
#SBATCH --job-name="CCtest"
#
# Number of nodes and MPI tasks per node:
#SBATCH --nodes=15
#SBATCH --ntasks-per-node=40
#
#SBATCH --mail-type=BEGIN
#SBATCH --mail-user=alankard@mpa-garching.mpg.de
#
# Partition
#SBATCH --partition=p.24h
# Wall clock limit:
#SBATCH --time=00-23:59:58
echo "Working Directory = $(pwd)"
cd "$SLURM_SUBMIT_DIR"
# Output directory for this run (the name encodes the run parameters)
export OUTPUT_LOC="$SLURM_SUBMIT_DIR/output-c100,m1.996,T4e4,t8.00,r2292.516"
export PROG="./pluto"
# export ARGS="-catalyst 1 AllFieldsCatalyst.py"
# export ARGS="-maxsteps 500"
# Copy the executable and its configuration alongside the output for reproducibility
mkdir -p "$OUTPUT_LOC/Log_Files"
cp "$SLURM_SUBMIT_DIR/definitions.h" "$OUTPUT_LOC"
cp "$SLURM_SUBMIT_DIR/pluto.ini" "$OUTPUT_LOC"
cp "$SLURM_SUBMIT_DIR/pluto" "$OUTPUT_LOC"
# Load compiler and MPI modules with explicit version specifications,
# consistent with the versions used to build the executable.
module purge
module load gcc/10 openmpi/4 hdf5-mpi/1.12.0
export LD_LIBRARY_PATH="/mpcdf/soft/SLE_15/packages/skylake/openmpi/gcc_10-10.3.0/4.0.7/lib:/mpcdf/soft/SLE_15/packages/skylake/gsl/gcc_10-10.3.0/2.4/lib:/mpcdf/soft/SLE_15/packages/skylake/fftw/gcc_10-10.3.0-openmpi_4-4.0.7/3.3.10/lib:/mpcdf/soft/SLE_15/packages/skylake/hdf5/gcc_10-10.3.0-openmpi_4-4.0.7/1.12.0/lib:$LD_LIBRARY_PATH"
# export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/mpcdf/soft/SLE_15/packages/x86_64/paraview/5.10.1/lib/catalyst"
# export VTK_SILENCE_GET_VOID_POINTER_WARNINGS=1
# Launch PLUTO across all allocated MPI tasks (15 nodes x 40 tasks = 600 ranks)
srun $PROG $ARGS
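
For reference, a typical submission workflow on a Slurm system looks like the sketch below. The filename job.slurm is an assumption (save the script under any name); the output filename follows from the #SBATCH -o pattern above (%x = job name, %j = job ID).

# Submit from the directory containing pluto, pluto.ini, and definitions.h,
# since the script copies them from $SLURM_SUBMIT_DIR:
sbatch job.slurm
# Check queue status and the assigned job ID:
squeue -u $USER
# Follow standard output once the job starts (<jobid> is the ID printed by sbatch):
tail -f tjob.CCtest.out.<jobid>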