High Performance & Scientific Computing

Available Software on ISAAC-Legacy



Available Software

ISAAC has an extensive library of software available for use. Please see the output of the “module avail” command for a more complete list of software installed on the system. To learn more, or to request that a specific software package be installed, please click on Submit HPSC Service Request in the left-side menu of this page.

Default Software and Environments

  • Git and GitHub
  • Julia
  • Jupyter Notebooks
  • Machine Learning frameworks (TensorFlow, PyTorch, Keras, OpenCV)
  • R and Python
  • SAS and SAS/CONNECT
  • Software environments via Conda
  • MATLAB

How to Load and Configure Modules

All Available Modules

To list all available modules, run:

module avail

Search for modules

You can search for modules or extensions with the spider and avail subcommands. For example, to find and list all Python 3 modules, run:

module avail python/3

To find any module or extension that mentions python in its name or description, use the command:

module spider python

Load and Unload Modules

Load

The module load command modifies your environment so you can use the specified software package(s).

  • This command is case-sensitive to module names.
  • The module load command loads dependencies as needed; you do not need to load them separately.
  • For batch jobs, add module load command(s) to your submission script.

For example, to load Python version 3.8.6 and BLAST+ version 2.11.0, find modules with matching toolchain suffixes and run the command:

module load Python/3.8.6-GCCcore-10.2.0 BLAST+/2.11.0-GCCcore-10.2.0

Lmod will add the python and BLAST+ commands to your environment. Since both of these modules were built with the GCCcore/10.2.0 toolchain module, they will not load conflicting libraries. Recall that you can see all of the modules that were loaded by running module list.
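For example, after the load above, module list might show something like the following. This output is illustrative and abbreviated; the exact dependency modules on ISAAC may differ:

$ module list

Currently Loaded Modules:
  1) StdEnv                       3) Python/3.8.6-GCCcore-10.2.0
  2) GCCcore/10.2.0               4) BLAST+/2.11.0-GCCcore-10.2.0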

Module Defaults

As new versions of software get installed and others are deprecated, the default module version can change over time. It is best practice to note the specific module versions you are using for a project and load those explicitly, e.g. module load Python/3.8.6-GCCcore-10.2.0 not module load Python. This makes your work more reproducible and less likely to change unexpectedly in the future.
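You can check which version Lmod will pick by listing the available versions of a package; Lmod marks the current default with a (D). The versions shown below are placeholders, not a complete or exact list of what is installed on ISAAC:

$ module avail Python

   Python/2.7.18-GCCcore-10.2.0    Python/3.8.6-GCCcore-10.2.0 (D)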

Unload

You can also unload a specific module that you’ve previously loaded:

module unload R

Or unload all modules at once with:

module purge

Purge Lightly

module purge will alert you to a sticky module called StdEnv that is always loaded. Avoid unloading StdEnv unless explicitly told to do so; otherwise you will lose some important setup for the cluster you are on.
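In practice, a bare purge leaves the sticky module in place and prints a notice similar to the following (the exact wording depends on the Lmod version):

$ module purge
The following modules were not unloaded:
  (Use "module --force purge" to unload all):

  1) StdEnv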

Get Module Help

You can get a brief description of a module and the URL of the software’s homepage by running:

module help modulename/version
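For example, using one of the modules loaded earlier (any installed module name works the same way):

module help Python/3.8.6-GCCcore-10.2.0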

Module hierarchy

Libraries built with one compiler need to be linked with applications built with the same compiler version. In High Performance Computing, Message Passing Interface (MPI) libraries provide efficient communication between tasks on a distributed-memory computer with many processors, and parallel libraries and applications must be built with a matching MPI library and compiler. To make this discussion concrete, suppose we have the Intel compiler version 15.0.2 and the GNU Compiler Collection (GCC) version 4.9.2, two MPI libraries (MPICH version 3.1.2 and Open MPI version 1.8.2), and a parallel library, HDF5 version 1.8.13 (phdf5).

Of the many possible ways to lay out modules, a flat layout such as the following is reasonable:

$ module avail

--------------- /opt/apps/modulefiles ----------------------
gcc/4.9.2                        phdf5/gcc-4.9-mpich-3.1-1.8.13
intel/15.0.2                     phdf5/gcc-4.9-openmpi-1.8-1.8.13
mpich/gcc-4.9-3.1.2              phdf5/intel-15.0-mpich-3.1-1.8.13
mpich/intel-15.0-3.1.2           phdf5/intel-15.0-openmpi-1.8-1.8.13
openmpi/gcc-4.9-1.8.2
openmpi/intel-15.0-1.8.2

To get a consistent environment, users must load a matching set of compiler, MPI, and application modules. For example, this combination is correct:

$ module load gcc/4.9.2 openmpi/gcc-4.9-1.8.2 phdf5/gcc-4.9-openmpi-1.8-1.8.13

Using Modules In Job Submissions

There are two ways to use software modules for your job submissions:

  1. Load software modules before submitting jobs, since the job inherits your software environment. This works well for interactive shells, one-offs, and exploratory work. It is the only method available for command-line wrapper scripts, and it may work for some wrappers (e.g., MATLAB) but not others (e.g., R, Python), so test carefully before relying on it.
  2. Load software modules inside your job scripts. This is the preferred method for batch jobs and submission-script workflows and is good research data management practice, though it can be cumbersome. A sketch of this approach appears at the end of this section.

Note that whatever method you choose, once a module is loaded in your session or inside your script, it is available just as though it had always been there.
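As a sketch of the second approach, a minimal batch script might look like the following. It assumes the SLURM scheduler and reuses the modules from the earlier example; the job name, resource requests, account, and workload commands are placeholders to adapt to your own work:

#!/bin/bash
#SBATCH --job-name=blast-example           # placeholder job name
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --time=01:00:00                    # placeholder walltime
#SBATCH --account=<your-project-account>   # replace with your allocation

# Load the software environment inside the script (method 2 above)
module load Python/3.8.6-GCCcore-10.2.0 BLAST+/2.11.0-GCCcore-10.2.0

# Placeholder workload; replace with your own commands
blastn -version
python --version

If the scheduler is SLURM, submit the script with sbatch scriptname.sh.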

Further Reading

You can view documentation while on the cluster using the command:

man module