MPI Jobs

MPI (Message Passing Interface) enables multi-node parallelism: an MPI-enabled program can use multiple CPU-cores across multiple nodes. Here is the C++ code we will be using:

#include <iostream>
#include <mpi.h>

int main(int argc, char** argv) {
  using namespace std;
  
  MPI_Init(&argc, &argv);

  int world_size, world_rank;
  MPI_Comm_size(MPI_COMM_WORLD, &world_size);
  MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

  // Get the name of the processor
  char processor_name[MPI_MAX_PROCESSOR_NAME];
  int name_len;
  MPI_Get_processor_name(processor_name, &name_len);

  // Print off a hello world message
  cout << "Process " << world_rank << " of " << world_size
       << " says hello from " << processor_name << endl;
  
  // Uncomment the next line to keep each CPU-core busy indefinitely
  // while (true) {};

  MPI_Finalize();
  return 0;
}
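The example above only reports each process's rank; the actual message passing that gives MPI its name is not shown. As an illustration (not part of this tutorial's exercise), here is a minimal sketch of point-to-point communication in which rank 0 sends an integer to rank 1 with MPI_Send / MPI_Recv. It compiles with the same mpicxx command used below and needs at least two ranks:

```cpp
#include <iostream>
#include <mpi.h>

int main(int argc, char** argv) {
  MPI_Init(&argc, &argv);

  int world_size, world_rank;
  MPI_Comm_size(MPI_COMM_WORLD, &world_size);
  MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

  if (world_size < 2) {
    if (world_rank == 0)
      std::cerr << "This sketch needs at least 2 ranks" << std::endl;
    MPI_Finalize();
    return 1;
  }

  if (world_rank == 0) {
    int payload = 42;
    // Send one int to rank 1 with message tag 0
    MPI_Send(&payload, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
  } else if (world_rank == 1) {
    int payload = 0;
    // Blocking receive of one int from rank 0, tag 0
    MPI_Recv(&payload, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    std::cout << "Rank 1 received " << payload << " from rank 0" << std::endl;
  }

  MPI_Finalize();
  return 0;
}
```

Note that MPI_Send and MPI_Recv are blocking calls: the receive does not return until a matching message arrives.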

Save the code as mpi.cpp.

Now compile it into a binary named mpi using the MPI compiler wrapper:

module load compilers/mpi/openmpi-slurm
mpicxx -o mpi mpi.cpp
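If you want to see what mpicxx actually invokes, Open MPI's compiler wrappers can print the underlying compiler command and flags (this assumes the loaded module provides Open MPI):

```shell
# Print the real compiler invocation behind the wrapper (Open MPI option)
mpicxx --showme

# Optional local smoke test with a few processes before submitting a job
# (some sites discourage running workloads on login nodes; keep it brief)
mpirun -np 4 ./mpi
```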

We will now run the mpi binary using the following SLURM script. Save it as mpi.sh:

#!/bin/bash
#
#SBATCH --job-name="MPI Demo" 		# name of your job
#SBATCH --partition=peregrine-cpu	# partition to which job should be submitted
#SBATCH --qos=cpu_short				# qos type
#SBATCH --nodes=2                	# node count
#SBATCH --ntasks=16               	# total number of tasks across all nodes
#SBATCH --cpus-per-task=1        	# cpu-cores per task
#SBATCH --mem-per-cpu=1G         	# memory per cpu-core
#SBATCH --time=00:01:00          	# total run time limit (HH:MM:SS)
#
module purge
module load compilers/mpi/openmpi-slurm 

srun ./mpi

Submit the job as

sbatch mpi.sh
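While the job is queued or running, you can monitor it with standard Slurm commands (the job ID 1234 below is a placeholder; use the ID printed by sbatch):

```shell
# List your own pending and running jobs
squeue -u $USER

# Accounting summary for a job, useful after it finishes
sacct -j 1234 --format=JobID,JobName,State,Elapsed,MaxRSS

# Cancel the job if something goes wrong
scancel 1234
```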

The result will be saved in a file named slurm-####.out (where #### is the job ID) and should look like the following. The line order varies from run to run, since each process prints independently:

Process 15 of 16 says hello from peregrine1
Process 1 of 16 says hello from peregrine0
Process 2 of 16 says hello from peregrine0
Process 3 of 16 says hello from peregrine0
Process 4 of 16 says hello from peregrine0
Process 5 of 16 says hello from peregrine0
Process 6 of 16 says hello from peregrine0
Process 7 of 16 says hello from peregrine0
Process 8 of 16 says hello from peregrine0
Process 9 of 16 says hello from peregrine0
Process 10 of 16 says hello from peregrine0
Process 11 of 16 says hello from peregrine0
Process 12 of 16 says hello from peregrine0
Process 13 of 16 says hello from peregrine0
Process 0 of 16 says hello from peregrine0
Process 14 of 16 says hello from peregrine1