This should also be useful for other clusters where you want to use
components (e.g. MPI, compilers) from the module system.
Start a session for building
si -N 1 -n 16 -c 1 -t 0-02:00:00 # on iris: -C broadwell or -C skylake
Clone and set up Spack in $HOME, which has much better performance for
small files than $SCRATCH
cd $HOME
git clone --depth=2 https://github.com/spack/spack.git
cd spack
Set up Spack in the environment
source $HOME/spack/share/spack/setup-env.sh
TODO: Add instructions on how to get spack-packages on develop branch - default branch releases/v2025.07 is broken.
Then create $HOME/.spack/packages.yaml
touch $HOME/.spack/packages.yaml
with the following contents:
packages:
  gcc:
    externals:
    - spec: gcc@13.2.0+binutils languages:='c,c++,fortran'
      modules:
      - compiler/GCC/13.2.0
      extra_attributes:
        compilers:
          c: /opt/apps/easybuild/systems/aion/rhel810-20251006/2023b/epyc/software/GCCcore/13.2.0/bin/gcc
          cxx: /opt/apps/easybuild/systems/aion/rhel810-20251006/2023b/epyc/software/GCCcore/13.2.0/bin/g++
          fortran: /opt/apps/easybuild/systems/aion/rhel810-20251006/2023b/epyc/software/GCCcore/13.2.0/bin/gfortran
    buildable: false
  binutils:
    externals:
    - spec: binutils@2.40
      modules:
      - tools/binutils/2.40-GCCcore-13.2.0
    buildable: false
  libevent:
    externals:
    - spec: libevent@2.1.12
      modules:
      - lib/libevent/2.1.12-GCCcore-13.2.0
    buildable: false
  libfabric:
    externals:
    - spec: libfabric@1.19.0
      modules:
      - lib/libfabric/1.19.0-GCCcore-13.2.0
    buildable: false
  libpciaccess:
    externals:
    - spec: libpciaccess@0.17
      modules:
      - system/libpciaccess/0.17-GCCcore-13.2.0
    buildable: false
  libxml2:
    externals:
    - spec: libxml2@2.11.5
      modules:
      - lib/libxml2/2.11.5-GCCcore-13.2.0
    buildable: false
  hwloc:
    externals:
    - spec: hwloc@2.9.2+libxml2
      modules:
      - system/hwloc/2.9.2-GCCcore-13.2.0
    buildable: false
  mpi:
    buildable: false
  munge:
    externals:
    - spec: munge@0.5.13
      prefix: /usr
    buildable: false
  numactl:
    externals:
    - spec: numactl@2.0.16
      modules:
      - tools/numactl/2.0.16-GCCcore-13.2.0
    buildable: false
  openmpi:
    variants: fabrics=ofi,ucx schedulers=slurm
    externals:
    - spec: openmpi@4.1.6
      modules:
      - mpi/OpenMPI/4.1.6-GCC-13.2.0
    buildable: false
  pmix:
    externals:
    - spec: pmix@4.2.6
      modules:
      - lib/PMIx/4.2.6-GCCcore-13.2.0
    buildable: false
  slurm:
    externals:
    - spec: slurm@23.11.10 sysconfdir=/etc/slurm
      prefix: /usr
    buildable: false
  ucx:
    externals:
    - spec: ucx@1.15.0
      modules:
      - lib/UCX/1.15.0-GCCcore-13.2.0
    buildable: false
  zlib:
    externals:
    - spec: zlib@1.2.13
      modules:
      - lib/zlib/1.2.13-GCCcore-13.2.0
    buildable: false
This tells Spack to use the system-provided GCC, binutils, and OpenMPI with the native fabrics.
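The file can also be written non-interactively with a shell heredoc. The sketch below contains only the gcc entry to keep it short; fill in the remaining packages from the listing above:

```shell
# Write a minimal ~/.spack/packages.yaml (gcc entry only; extend with the
# remaining packages from the full listing above).
mkdir -p $HOME/.spack
cat > $HOME/.spack/packages.yaml <<'EOF'
packages:
  gcc:
    externals:
    - spec: gcc@13.2.0+binutils languages:='c,c++,fortran'
      modules:
      - compiler/GCC/13.2.0
    buildable: false
EOF
grep -q '^packages:' $HOME/.spack/packages.yaml && echo "packages.yaml written"
```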
Create an environment and install FEniCS
cd ~
spack env create -d fenicsx-main-20230126/
spack env activate fenicsx-main-20230126/
spack add py-fenics-dolfinx@main fenics-dolfinx+adios2 adios2+python petsc+mumps
# Change @main to e.g. @0.7.2 in the above if you want a fixed version.
spack concretize
spack install -j16
Alternatively, put the same specs directly in the spack.yaml file in $SPACK_ENV:
spack:
  # add package specs to the `specs` list
  specs:
  - py-fenics-dolfinx@main+petsc4py
  - fenics-dolfinx@main+adios2
  - petsc+mumps
  - adios2+python
  view: true
  concretizer:
    unify: true
The following are also commonly used in FEniCS scripts and may be useful
spack add gmsh+opencascade py-gmsh py-numba py-scipy py-matplotlib
It is possible to build a specific version (git ref) of DOLFINx. Note that the hash must be the full 40-character commit hash, not an abbreviation. It is best to pin appropriate git refs on all components.
# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  # add package specs to the `specs` list
  specs:
  - fenics-dolfinx@git.4f575964c70efd02dca92f2cf10c125071b17e4d=main+adios2
  - py-fenics-dolfinx@git.4f575964c70efd02dca92f2cf10c125071b17e4d=main+petsc4py
  - py-fenics-basix@git.2e2a7048ea5f4255c22af18af3b828036f1c8b50=main
  - fenics-basix@git.2e2a7048ea5f4255c22af18af3b828036f1c8b50=main
  - py-fenics-ufl@git.b15d8d3fdfea5ad6fe78531ec4ce6059cafeaa89=main
  - py-fenics-ffcx@git.7bc8be738997e7ce68ef0f406eab63c00d467092=main
  - fenics-ufcx@git.7bc8be738997e7ce68ef0f406eab63c00d467092=main
  - petsc+mumps
  - adios2+python
  view: true
  concretizer:
    unify: true
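Since Spack rejects abbreviated hashes in the @git.&lt;hash&gt;=&lt;branch&gt; syntax, a quick sanity check that a pinned ref really is a full hash can save a failed concretization (plain Python sketch; the function name is illustrative):

```python
import re

def is_full_git_hash(ref: str) -> bool:
    """Return True if ref is a full 40-character lowercase hex SHA-1."""
    return re.fullmatch(r"[0-9a-f]{40}", ref) is not None

print(is_full_git_hash("4f575964c70efd02dca92f2cf10c125071b17e4d"))  # True
print(is_full_git_hash("4f57596"))  # abbreviated hash: False
```

The full hash for a branch can be obtained with git rev-parse HEAD in a checkout of the repository.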
It is also possible to build only the C++ layer using
spack add fenics-dolfinx@main+adios2+petsc py-fenics-ffcx@main petsc+mumps
To rebuild FEniCSx from main branches inside an existing environment
spack install --overwrite -j16 fenics-basix py-fenics-basix py-fenics-ffcx fenics-ufcx py-fenics-ufl fenics-dolfinx py-fenics-dolfinx
Quickly test the build with
srun python -c "from mpi4py import MPI; import dolfinx"
See the uni.lu documentation for full details. Using the environment in a job script should be as
simple as adding the following, where ... is the name/folder of your environment.
#!/bin/bash -l
source $HOME/spack/share/spack/setup-env.sh
spack env activate ...
module load tools/binutils/2.40-GCCcore-13.2.0 # needed at runtime for cffi linking to work
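Putting the above together, the following writes out a complete batch script as a sketch. The resource flags, the environment folder fenicsx-main-20230126, and my_script.py are assumptions; adjust them for your job:

```shell
# Generate a batch script (sketch; adjust resources, env folder, and script name)
cat > submit_fenicsx.sh <<'EOF'
#!/bin/bash -l
#SBATCH -N 1
#SBATCH -n 16
#SBATCH -c 1
#SBATCH -t 0-01:00:00
source $HOME/spack/share/spack/setup-env.sh
spack env activate $HOME/fenicsx-main-20230126
module load tools/binutils/2.40-GCCcore-13.2.0  # needed at runtime for cffi linking
srun python my_script.py
EOF
chmod +x submit_fenicsx.sh && echo "wrote submit_fenicsx.sh"
```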
Generating module files from the Spack installation is experimental.
Place the following in ~/.spack/modules.yaml:
modules:
  # This maps paths in the package install prefix to environment variables
  # they should be added to. For example, <prefix>/bin should be in PATH.
  prefix_inspections:
    ./bin:
    - PATH
    ./man:
    - MANPATH
    ./share/man:
    - MANPATH
    ./share/aclocal:
    - ACLOCAL_PATH
    ./lib/pkgconfig:
    - PKG_CONFIG_PATH
    ./lib64/pkgconfig:
    - PKG_CONFIG_PATH
    ./share/pkgconfig:
    - PKG_CONFIG_PATH
    ./:
    - CMAKE_PREFIX_PATH
  # These are configurations for the module set named "default"
  default:
    # Where to install modules
    roots:
      lmod: $spack/share/spack/lmod
    # What type of modules to use ("tcl" and/or "lmod")
    enable: ["lmod"]
    # Default configurations if lmod is enabled
    lmod:
      all:
        autoload: direct
      core_compilers:
      - gcc@13.2.0
      hierarchy:
      - mpi
Then run:
spack module lmod refresh
and then update MODULEPATH:
export MODULEPATH=~/spack/share/spack/lmod/linux-rhel8-x86_64:$MODULEPATH
You should now be able to find modules:
module spider py-fenics-dolfinx
Then run module spider again with the full name and version it reports to get detailed load instructions.
For example, on my install, I can load py-fenics-dolfinx with:
module load Core/gcc/13.2.0-vrrp4ic openmpi/4.1.6-kejqrjg py-fenics-dolfinx/0.9.0-m7pjnvk
Note that once this is set up, it is no longer necessary to source Spack into the environment at all.
Known issue: import adios2 fails in Python with a runtime linking error (GLIBCXX_3.4.32 not found).
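To confirm whether the libstdc++ picked up at runtime is too old, a small stdlib-only helper can list the GLIBCXX symbol versions a given shared object provides. The example path is an assumption; locate your runtime library with gcc -print-file-name=libstdc++.so.6:

```python
import re

def glibcxx_versions(path):
    """Return the sorted set of GLIBCXX_* version strings embedded in a shared object."""
    with open(path, "rb") as f:
        data = f.read()
    return sorted({m.decode() for m in re.findall(rb"GLIBCXX_[0-9]+(?:\.[0-9]+)*", data)})

# Example (hypothetical path):
# print(glibcxx_versions("/usr/lib64/libstdc++.so.6"))
```

If GLIBCXX_3.4.32 is missing from the library being loaded, make sure the module providing the newer GCC runtime is loaded before starting Python.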