Note: Work in Progress for Phase-2 Documentation

The content of this section is currently being updated to provide material relevant to Phase 2 of Setonix and the use of GPUs, which is expected to become available soon to Pawsey projects with Setonix GPU allocations.


This page is intended to help users of previous Pawsey GPU supercomputing infrastructure (such as Topaz) to transition to using the Setonix supercomputer.

...

Throughout this guide, links are provided to relevant pages in the general Supercomputing Documentation and to the Setonix User Guide, which provides documentation specifically for using GPUs on the Setonix system. The Setonix GPU Partition Quick Start also provides specific details for using the GPUs in Setonix.

This guide has been updated in preparation for the migration of GPU projects on Topaz to Setonix Phase 2.

...

Setonix Phase 2 GPUs replace Pawsey's previous generation of GPU infrastructure, specifically the Topaz GPU cluster and its associated filesystems. This migration guide has been updated to outline the changes for researchers transitioning from Topaz to Setonix Phase 2.

Significant changes to the GPU compute architecture include:

...

For containers, researchers can continue to use Singularity in a similar way to previous systems. Some system-wide installations (in particular, for bioinformatics) are now provided as container modules using SHPC: these packages are installed as containers, but the user interface is the same as for compiled applications (load the module, then run the executables).
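As a sketch of this workflow, using an SHPC container module looks identical to using a natively compiled application; the module name and version below are illustrative, not a guaranteed installation on Setonix:

```shell
# Search for an available container module (name is illustrative)
module avail samtools

# Load the container module; SHPC exposes the containerised
# executables as ordinary commands on the PATH
module load samtools/1.15

# Run the tool as if it were a native installation; Singularity
# is invoked transparently behind the scenes
samtools --version
```

The point of this design is that users need not know whether a tool was compiled natively or shipped as a container: the `module load` / run-executable interface is the same in both cases.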

There is a library of GPU-enabled containers that support the AMD MI250X GPUs available from the AMD Infinity Hub. Note that these containers may be limited in parallelism to one node, or one GPU, depending on the particular software.
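A minimal sketch of pulling and running one of these GPU-enabled containers with Singularity follows; the image URI is illustrative (check the AMD Infinity Hub for actual image locations), but the `--rocm` flag is the standard Singularity option for exposing AMD GPUs to a container:

```shell
# Pull a GPU-enabled container image from a registry
# (the image name here is illustrative only)
singularity pull docker://amdih/example-app:latest

# --rocm binds the host's ROCm libraries and GPU devices into
# the container so the application can use the AMD MI250X GPUs
singularity exec --rocm example-app_latest.sif example-app --help
```

As noted above, whether such a container can scale beyond one GPU or one node depends on how the particular software inside it was built.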

Key changes to the software environment include:

...

For information specific to Setonix, refer to the Running Jobs section and, for GPU jobs in particular, the Example Slurm Batch Scripts for Setonix on GPU Compute Nodes page of the Setonix User Guide.
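For orientation, a minimal Slurm batch script for a single-GPU job on Setonix might look like the sketch below. The project code, module version, and executable name are placeholders, and the exact resource-request options recommended for the MI250X GPUs should be taken from the Setonix User Guide pages referenced above:

```shell
#!/bin/bash --login
#SBATCH --account=project01-gpu   # project code is illustrative; GPU allocations use a -gpu suffix
#SBATCH --partition=gpu           # Setonix GPU partition
#SBATCH --nodes=1
#SBATCH --gres=gpu:1              # request one GPU (see the User Guide for recommended options)
#SBATCH --time=00:10:00

# Load the ROCm environment (module name/version is illustrative)
module load rocm

# srun launches the executable on the allocated GPU node
srun ./my_gpu_program
```

Submit with `sbatch`, as on previous Pawsey systems; only the partition, account suffix, and GPU request options differ from a Topaz script.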

...