OpenFOAM: Advanced use of containerised modules and external containers


The following topics cover the most common situations in which users may need to explicitly invoke Singularity (the container engine installed at Pawsey) to use OpenFOAM. If you are not familiar with containers, we recommend having a look at our dedicated documentation: Containers.

Explicit use of the singularity command and the .sif image

Previously, the only way of using a containerised OpenFOAM tool was to invoke it through the singularity command together with the name of the image. This procedure can still be used and is recommended for users who bring their own containerised OpenFOAM installation. So, for example, if a user has a working image named myopenfoam-8.sif, they can load a singularity module with MPI capabilities and then use singularity commands to access/use the containerised solvers. For example, a quick test of the classical pimpleFoam solver:

Terminal 1. Explicit use of the singularity command with user's own container
$ module load singularity/<VERSION>-mpi
$ export SINGULARITY_CONTAINER=/PathToTheSingularityImage/myopenfoam-8.sif
$ singularity exec $SINGULARITY_CONTAINER pimpleFoam -help

Usage: pimpleFoam [OPTIONS]
options:
  -case <dir>       specify alternate case directory, default is the cwd
  -fileHandler <handler>
                    override the fileHandler
  -hostRoots <(((host1 dir1) .. (hostN dirN))>
                    slave root directories (per host) for distributed running
  -libs <(lib1 .. libN)>
                    pre-load libraries
...
  -srcDoc           display source code in browser
  -doc              display application documentation in browser
  -help             print the usage

Using: OpenFOAM-8 (see https://openfoam.org)
Build: 8-30b264cc33cd

(Note that PathToTheSingularityImage is only a placeholder for the real path to the user's image.) Check our documentation about Singularity for further information about the use of containers.


As mentioned before, Pawsey provides OpenFOAM modules that make use of containerised versions of OpenFOAM on Setonix. One advantage of these modules, as explained in the main page of the OpenFOAM documentation, is that the explicit use of the singularity command is no longer needed. Nevertheless, users can still use the singularity command to access the corresponding images if they prefer. The path and name of the corresponding image are defined by default in the variable SINGULARITY_CONTAINER after loading the module. So, a similar example to the one above would be:

Terminal 2. Explicit use of the singularity command with the SINGULARITY_CONTAINER variable
$ module load openfoam-org-container/7

$ echo $SINGULARITY_CONTAINER
/software/setonix/2022.05/containers/sif/quay.io/pawsey/openfoam-org/7/quay.io-pawsey-openfoam-org-7-sha256:3d427b3dec890193bb671185acefdc91fb126363b5f368d147603002b4708afe.sif

$ singularity exec $SINGULARITY_CONTAINER pimpleFoam -help
Usage: pimpleFoam [OPTIONS]
options:
  -case <dir>       specify alternate case directory, default is the cwd
  -fileHandler <handler>
                    override the fileHandler
  -hostRoots <(((host1 dir1) .. (hostN dirN))>
                    slave root directories (per host) for distributed running
  -libs <(lib1 .. libN)>
                    pre-load libraries
...
  -srcDoc           display source code in browser
  -doc              display application documentation in browser
  -help             print the usage

Using: OpenFOAM-7 (see https://openfoam.org)
Build: 7-63349425784a

Again, it is worth remembering that the singularity command syntax of the example above is not necessary when using the containerised-OpenFOAM modules offered at Pawsey. For these modules, the names of the OpenFOAM commands/solvers are in fact wrappers that call the containerised tools as if they were bare-metal installations. Therefore, the simple use of pimpleFoam -help would have been enough in the example above instead of the full singularity syntax, as explained in the main page of the OpenFOAM documentation. Also note that, when using the containerised modules, there is no need to explicitly load singularity, as it is loaded by default together with the OpenFOAM module.

Nevertheless, there are some cases in which users may prefer to explicitly use the singularity command. For example, to query the content of an environment variable defined within the container (like FOAM_ETC), they can use:

Terminal 3. Explicit use of the singularity command and the SINGULARITY_CONTAINER variable to query for an environment variable from the container
$ module load singularity/<VERSION>-mpi
$ export SINGULARITY_CONTAINER=/PathToTheSingularityImage/myopenfoam-8.sif
$ singularity exec $SINGULARITY_CONTAINER printenv | grep FOAM_ETC
FOAM_ETC=/opt/OpenFOAM/OpenFOAM-8/etc
$

(Note that in this example, an image owned by the user is being used.)

Or, if the user wishes to open an interactive session within the container:

Terminal 4. Explicit use of the singularity command and the SINGULARITY_CONTAINER variable to open an interactive session
$ module load openfoam-container/v2012
$ singularity shell $SINGULARITY_CONTAINER
Singularity> echo $FOAM_ETC
/opt/OpenFOAM/OpenFOAM-v2012/etc
Singularity>

(Note that in this example, the image provided by the containerised module is being used.)

And, of course, the singularity command can be used within Slurm batch scripts. So, the execution command in the example script for the solver execution in the OpenFOAM: Example Slurm Batch Scripts page can be modified to explicitly use the singularity command:


Listing 1. Example Slurm batch script to run a solver with 1152 mpi tasks
#!/bin/bash --login
 
#SBATCH --job-name=[name_of_job]
#SBATCH --partition=work
#SBATCH --ntasks=1152
#SBATCH --ntasks-per-node=128
#SBATCH --cpus-per-task=1
#SBATCH --exclusive
#SBATCH --time=[neededTime]

# --- Load modules and define images:
# -Using the containerised container:
module load openfoam-org-container/7
# -Using user's own image:
#module load singularity/<VERSION>-mpi  #Adapt <VERSION> to the currently provided version of singularity
#SINGULARITY_CONTAINER="/PathToTheSingularityImage/myopenfoam-8.sif"  #Adapt path and name to the correct ones

#--- Specific settings for the cluster you are on
#(Check the specific guide of the cluster for additional settings)

# ---
# Set MPI related environment variables. Not all need to be set
# main variables for multi-node jobs (uncomment for multinode jobs)
export MPICH_OFI_STARTUP_CONNECT=1
export MPICH_OFI_VERBOSE=1
#Ask MPI to provide useful runtime information (uncomment if debugging)
#export MPICH_ENV_DISPLAY=1
#export MPICH_MEMORY_REPORT=1


#--- Automating the list of IORANKS for collated fileHandler
echo "Setting the grouping ratio for collated fileHandling"
nProcs=$SLURM_NTASKS #Number of total processors in decomposition for this case
mGroup=32            #Size of the groups for collated fileHandling (32 is the initial recommendation for Setonix)
of_ioRanks="0"
iC=$mGroup
while [ $iC -le $nProcs ]; do
   of_ioRanks="$of_ioRanks $iC"
   ((iC += $mGroup))
done
export FOAM_IORANKS="("${of_ioRanks}")"
echo "FOAM_IORANKS=$FOAM_IORANKS"

#-- Execute the solver:
srun -N $SLURM_JOB_NUM_NODES -n $SLURM_NTASKS -c 1 \
     singularity exec $SINGULARITY_CONTAINER pimpleFoam -parallel

(To use their own image, users should comment out the line that loads the containerised module and uncomment the lines that load the singularity module and define the SINGULARITY_CONTAINER variable with the real path to their own image. Obviously, <VERSION> and the real path should be adapted.)
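The while loop in Listing 1 builds the list of I/O ranks for the collated file handler: one rank at the start of every group of mGroup tasks. As a standalone sketch of the same logic, with fixed example values replacing the Slurm variables:

```shell
#!/bin/bash
# Standalone sketch of the FOAM_IORANKS construction from Listing 1,
# using fixed example values instead of the Slurm-provided variables.
nProcs=128   # stand-in for $SLURM_NTASKS
mGroup=32    # size of the groups for collated fileHandling

of_ioRanks="0"
iC=$mGroup
while [ $iC -le $nProcs ]; do
   of_ioRanks="$of_ioRanks $iC"
   ((iC += mGroup))
done
FOAM_IORANKS="(${of_ioRanks})"
echo "$FOAM_IORANKS"   # (0 32 64 96 128)
```

With 128 tasks and groups of 32, ranks 0, 32, 64, 96 and 128 act as I/O ranks, so each group of 32 consecutive ranks writes through a single rank.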

Wrappers of the shell and exec commands

The installed modules provide two additional wrappers that can be used to avoid the explicit call of the singularity command when the exec or shell sub-commands are needed (as in the last two examples of the section above). These two wrappers are (depending on whether the openfoam flavour is of the -org type or not):

openfoam-exec     or     openfoam-org-exec

and

openfoam-shell     or     openfoam-org-shell

So, the last two examples of the section above can be achieved with the use of these wrappers as:

Terminal 5. Use of the exec wrapper
$ module load openfoam-org-container/8
$ openfoam-org-exec printenv | grep FOAM_ETC
FOAM_ETC=/opt/OpenFOAM/OpenFOAM-8/etc
$

and:

Terminal 6. Use of the shell wrapper
$ module load openfoam-container/v2012
$ openfoam-shell
Singularity> echo $FOAM_ETC
/opt/OpenFOAM/OpenFOAM-v2012/etc
Singularity>

Wrappers of the solvers and tools for the installed modules

As explained in the main OpenFOAM documentation page, the tools and solvers within the installed modules are directly accessible without the need to explicitly call the singularity command. So, for example, after loading the module for openfoam-container/v2012, the following three commands are equivalent:

pimpleFoam -help

or

openfoam-exec pimpleFoam -help

or

singularity exec $SINGULARITY_CONTAINER pimpleFoam -help

(If the flavour of the loaded module was of the -org type, then the second command would be openfoam-org-exec pimpleFoam -help).

So, after loading one of the available containerised modules, the names of the tools/solvers are recognised, but they are in fact wrappers that invoke both the singularity image (through the singularity command) and the real containerised tool that exists within it. The pimpleFoam in the first line is thus a wrapper for the full command written in the third line of the example above.
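As an illustration of the mechanism only (the actual wrappers installed by the Pawsey modules may differ in detail), such a wrapper can be sketched as a small script named after the tool, which forwards its own name and arguments into the container:

```shell
# Hypothetical sketch of a solver wrapper like 'pimpleFoam': a script named
# after the tool that forwards its own name and arguments into the container.
# The real wrappers shipped with the Pawsey modules may differ in detail.
mkdir -p wrappers
cat > wrappers/pimpleFoam <<'EOF'
#!/bin/bash
tool=$(basename "$0")
exec singularity exec "$SINGULARITY_CONTAINER" "$tool" "$@"
EOF
chmod +x wrappers/pimpleFoam
# With wrappers/ prepended to PATH, 'pimpleFoam -help' resolves to this script,
# which in turn runs 'singularity exec $SINGULARITY_CONTAINER pimpleFoam -help'.
```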

Working with tutorials

Pawsey containers have been installed preserving the tutorials provided by the OpenFOAM developers. These tutorials are accessible at the path given by the environment variable FOAM_TUTORIALS, but this variable exists only inside the container. Therefore, its evaluation needs to be interpreted by the container and not by the host. For that, the bash -c command is handy. For example, if the channel395 tutorial is the case a user wants to work with, they can find its path inside the container and then make a copy into their working directory on the host:

Terminal 7. Use of the exec wrapper to find and copy a tutorial
$ module load openfoam-org-container/8
$ openfoam-org-exec bash -c 'find $FOAM_TUTORIALS -iname "*channel*"'
/opt/OpenFOAM/OpenFOAM-8/tutorials/compressible/rhoPimpleFoam/laminar/blockedChannel
/opt/OpenFOAM/OpenFOAM-8/tutorials/incompressible/pimpleFoam/LES/channel395
/opt/OpenFOAM/OpenFOAM-8/tutorials/incompressible/pimpleFoam/LES/channel395/system/postChannelDict
/opt/OpenFOAM/OpenFOAM-8/tutorials/incompressible/pimpleFoam/laminar/blockedChannel
/opt/OpenFOAM/OpenFOAM-8/tutorials/lagrangian/MPPICFoam/injectionChannel
/opt/OpenFOAM/OpenFOAM-8/tutorials/lagrangian/reactingParcelFoam/verticalChannel
/opt/OpenFOAM/OpenFOAM-8/tutorials/lagrangian/reactingParcelFoam/verticalChannelLTS
/opt/OpenFOAM/OpenFOAM-8/tutorials/lagrangian/simpleReactingParcelFoam/verticalChannel
/opt/OpenFOAM/OpenFOAM-8/tutorials/multiphase/interFoam/RAS/waterChannel

$ openfoam-org-exec cp -r /opt/OpenFOAM/OpenFOAM-8/tutorials/incompressible/pimpleFoam/LES/channel395 .
$ ls
channel395
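Note the single quotes around the find command above: they prevent the host shell from expanding $FOAM_TUTORIALS (which is empty on the host) before the command reaches the container. A minimal illustration of the difference, using a plain bash -c as a stand-in for the container:

```shell
# $FOAM_TUTORIALS is defined only inside the container; on the host it is empty.
unset FOAM_TUTORIALS

# Double quotes: the host shell expands the (empty) variable before bash -c
# runs, so the inner shell just receives 'echo '.
host_side=$(bash -c "echo $FOAM_TUTORIALS")

# Single quotes: the literal string '$FOAM_TUTORIALS' is passed through, and
# the inner shell (standing in for the container here) expands its own value.
inner_side=$(FOAM_TUTORIALS=/opt/OpenFOAM/OpenFOAM-8/tutorials \
             bash -c 'echo $FOAM_TUTORIALS')

echo "double quotes -> '$host_side'"    # empty
echo "single quotes -> '$inner_side'"   # /opt/OpenFOAM/OpenFOAM-8/tutorials
```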

Or users can start an interactive session to search and copy the required tutorial:

Terminal 8. Use of the shell wrapper to copy a tutorial interactively
$ module load openfoam-org-container/8
$ openfoam-org-shell

Singularity> HOSTDIR=$PWD
Singularity> cd $FOAM_TUTORIALS
Singularity> cd incompressible/pimpleFoam/LES/
Singularity> ls
channel395
Singularity> cp -r channel395/ $HOSTDIR
Singularity> ls $HOSTDIR
channel395
Singularity> exit

$ ls
channel395

Adapt the tutorial to best practices

Before executing a tutorial on Pawsey systems, always adapt the default dictionaries to comply with the OpenFOAM: Best Practices, so you will need to change the writeFormat, purgeWrite and runTimeModifiable variables, among others. Also note that, by default, all modules provided at Pawsey make use of the collated file handler.
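As a sketch, those controlDict entries can be patched from the host with sed. The target values below are examples only; take the actual recommended values from the OpenFOAM: Best Practices page. A minimal stand-in controlDict is created here so the snippet is self-contained; on a real copied tutorial, skip straight to the sed lines:

```shell
# Create a minimal stand-in controlDict (on a real case this already exists
# in the copied tutorial, e.g. channel395/system/controlDict).
mkdir -p channel395/system
cat > channel395/system/controlDict <<'EOF'
writeFormat     ascii;
purgeWrite      0;
runTimeModifiable true;
EOF

# Patch the entries in place. The target values are illustrative only;
# use the ones recommended in the OpenFOAM: Best Practices page.
dict=channel395/system/controlDict
sed -i 's/^writeFormat .*/writeFormat binary;/' "$dict"
sed -i 's/^purgeWrite .*/purgeWrite 10;/' "$dict"
sed -i 's/^runTimeModifiable .*/runTimeModifiable false;/' "$dict"
grep -E 'writeFormat|purgeWrite|runTimeModifiable' "$dict"
```

Alternatively, in recent OpenFOAM versions the same edits can be made with the foamDictionary utility inside the container, e.g. openfoam-org-exec foamDictionary -entry writeFormat -set binary channel395/system/controlDict.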

Compiling your own tools

OpenFOAM users often need to compile their own solvers/tools. With the use of containers there are two routes to follow: 1) develop and compile additional solvers/tools outside the existing container, or 2) build a new image with the additional tools compiled inside it.

Both routes have their pros and cons, but we recommend the first route for the development phase of the tools/solvers, in order to avoid rebuilding an image at every step of the development. Instead, the additional tools/solvers can be developed on the host and compiled with the OpenFOAM machinery of the container, while keeping the source files and executables on the host file system.

We recommend the second route for additional tools/solvers that are no longer in development and are therefore candidates to exist inside an additional container image.

Developing and compiling outside the container

In a typical OpenFOAM installation, the environment variable that defines the path where the user's own binaries and libraries are to be stored is WM_PROJECT_USER_DIR. But when dealing with the OpenFOAM containers prepared at Pawsey, that variable has already been defined to a path internal to the container, which cannot be modified, as the container's own directories are non-writable. Nevertheless, users can still compile their own tools or solvers and store them in a directory on the host filesystem. For this to work, we recommend binding the path on the host where the compiled tools will be saved to the internal path indicated by WM_PROJECT_USER_DIR. In this way, the container will look for the tools in the path indicated by that variable, but in practice it will be accessing the host directory that has been bound to the internal path.

1-. The first step of this procedure is to find out the value of WM_PROJECT_USER_DIR inside the container. For that, we can do:

Terminal 9. Print the value of WM_PROJECT_USER_DIR
$ module load openfoam-org-container/8
$ singularity exec $SINGULARITY_CONTAINER bash -c 'echo $WM_PROJECT_USER_DIR'
/home/ofuser/OpenFOAM/ofuser-8

(Here, a specific flavour/version of an OpenFOAM module is used as an example, but the procedure applies to any other container. Note that for our containerised modules the singularity image is accessible through the variable SINGULARITY_CONTAINER, but for other images you may need to refer to them explicitly.)


2-. Save the path of the WM_PROJECT_USER_DIR internal variable into an auxiliary variable, which will be used later in the binding step of the procedure:

Terminal 10. Save the value in an auxiliary variable
$ wmpudInside=$(singularity exec $SINGULARITY_CONTAINER bash -c 'echo $WM_PROJECT_USER_DIR')
$ echo $wmpudInside
/home/ofuser/OpenFOAM/ofuser-8


3-. Have a directory on the host where you are going to save/develop your own tools/solvers. Then put your source files into that directory: