...
(Note again that <projectName1234> and <userName> are just placeholders for the expected output of the example. Also note that this example uses openfoam-container/8; the number "8" corresponds to the version used in the example and should be changed to the version in use in the real case.)
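For instance, one way to check which container versions are installed before loading a module (a minimal sketch; the module names and versions shown are only illustrative):

```bash
# List the OpenFOAM container modules available on the system
# (the names/versions below are hypothetical examples):
$ module avail openfoam
openfoam-container/8    openfoam-org-container/8
$ module load openfoam-org-container/8
```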
...
Terminal 14.a Check for the compiled tools in the auxiliary path in the host

```bash
$ ls $wmpudOutside
platforms src
$ ls $wmpudOutside/platforms
linux64GccDPInt32Opt
$ ls $wmpudOutside/platforms/linux64GccDPInt32Opt
bin lib
$ ls $wmpudOutside/platforms/linux64GccDPInt32Opt/bin
yourSolverFoam
```
or

Terminal 14.b Check for the compiled tools using FOAM_USER_APPBIN

```bash
$ singularity exec -B $wmpudOutside:$wmpudInside $SINGULARITY_CONTAINER bash -c 'ls $FOAM_USER_APPBIN'
yourSolverFoam
```
(Note that yourSolverFoam is just an example name.)
7-. To execute their own compiled tool/solver, users must always bind the path in the host (wmpudOutside) to the path in the container (wmpudInside) by adding "-B $wmpudOutside:$wmpudInside" to the usual Singularity+OpenFOAM commands, whether in interactive sessions or within Slurm batch scripts. Naturally, these variables always need to be defined before the singularity command that calls the tool/solver is issued. For example, a simple test of the user's own solver in an interactive session would be:
Terminal 15. To use your own tool you need to bind the path again

```bash
$ module load openfoam-org-container/8
$ wmpudInside=$(singularity exec $SINGULARITY_CONTAINER bash -c 'echo $WM_PROJECT_USER_DIR')
$ wmpudOutside=$MYSOFTWARE/OpenFOAM/$USER-8
$ singularity exec -B $wmpudOutside:$wmpudInside $SINGULARITY_CONTAINER yourSolverFoam -help
```
(Note again that yourSolverFoam is just a placeholder name for the example.)
...
```
Usage: yourSolverFoam [OPTIONS]
options:
  -case <dir>       specify alternate case directory, default is the cwd
  -fileHandler <handler>
                    override the fileHandler
  -hostRoots <((host1 dir1) .. (hostN dirN))>
                    slave root directories (per host) for distributed running
  -libs <(lib1 .. libN)>
                    pre-load libraries
  ...
  -srcDoc           display source code in browser
  -doc              display application documentation in browser
  -help             print the usage
```
(Note that yourSolverFoam is just an example name.)
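The same binding applies when testing the solver in parallel from an interactive session. A minimal sketch, assuming an interactive Slurm allocation is already active, the case has already been decomposed, and the variables from Terminal 15 are defined (the task count is hypothetical):

```bash
# Run the user's own solver in parallel inside an interactive allocation.
# Assumes wmpudInside/wmpudOutside are set as in Terminal 15 and the case
# directory has been decomposed into 4 subdomains (hypothetical count):
srun -n 4 singularity exec -B $wmpudOutside:$wmpudInside \
     $SINGULARITY_CONTAINER yourSolverFoam -parallel
```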
8-. Use within a Slurm batch script needs to follow the same principles. For example, an adaptation of the solver-execution example script from the OpenFOAM: Example Slurm Batch Scripts page would be:
Listing 2. Example Slurm batch script to run user's own solver with 1152 mpi tasks

```bash
#!/bin/bash --login
#SBATCH --job-name=[name_of_job]
#SBATCH --partition=work
#SBATCH --ntasks=1152
#SBATCH --ntasks-per-node=128
#SBATCH --cpus-per-task=1
#SBATCH --exclusive
#SBATCH --time=[neededTime]

module load openfoam-org-container/8

#--- Specific settings for the cluster you are on
#    (Check the specific guide of the cluster for additional settings)

#--- Set MPI related environment variables. (Not all need to be set.)
# Main variables for multi-node jobs (uncomment for multi-node jobs):
export MPICH_OFI_STARTUP_CONNECT=1
export MPICH_OFI_VERBOSE=1
# Ask MPI to provide useful runtime information (uncomment if debugging):
#export MPICH_ENV_DISPLAY=1
#export MPICH_MEMORY_REPORT=1

#--- Automating the list of IORANKS for the collated fileHandler
echo "Setting the grouping ratio for collated fileHandling"
nProcs=$SLURM_NTASKS  # Number of total processors in decomposition for this case
mGroup=32             # Size of the groups for collated fileHandling
                      # (32 is the initial recommendation for Setonix)
of_ioRanks="0"
iC=$mGroup
while [ $iC -le $nProcs ]; do
   of_ioRanks="$of_ioRanks $iC"
   ((iC += mGroup))
done
export FOAM_IORANKS="(${of_ioRanks})"
echo "FOAM_IORANKS=$FOAM_IORANKS"

#--- Defining the binding paths:
wmpudInside=$(singularity exec $SINGULARITY_CONTAINER bash -c 'echo $WM_PROJECT_USER_DIR')
wmpudOutside=$MYSOFTWARE/OpenFOAM/$USER-8

#--- Execute user's own solver (note the binding and the container call):
srun -N $SLURM_JOB_NUM_NODES -n $SLURM_NTASKS -c $SLURM_CPUS_PER_TASK \
     singularity exec -B $wmpudOutside:$wmpudInside $SINGULARITY_CONTAINER \
     yourSolverFoam -parallel
```
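As a quick sanity check of the grouping logic in Listing 2, the loop can be run standalone on a login node with the same values hard-coded; it should print one master rank per group of 32 (a minimal sketch using the same hypothetical job size):

```bash
# Reproduce the FOAM_IORANKS construction from Listing 2 with fixed values:
nProcs=1152   # would be $SLURM_NTASKS inside the batch script
mGroup=32     # group size for collated fileHandling
of_ioRanks="0"
iC=$mGroup
while [ $iC -le $nProcs ]; do
   of_ioRanks="$of_ioRanks $iC"
   ((iC += mGroup))
done
echo "(${of_ioRanks})"
# Expected output: (0 32 64 ... 1120 1152)
```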
Building a new image with additional compiled tools/solvers
...