Going Parallel with MPI

Task parallelism: the work of a global problem can be divided into a number of independent tasks, which rarely need to synchronize. Monte Carlo simulations or numerical integration are examples of this.
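
A minimal sketch (not from the text) of such a task-parallel computation: each MPI rank draws its own independent Monte Carlo samples to estimate pi, and the partial results are combined only once at the end, which is the single synchronization point. The sample count and seeding scheme are illustrative choices.

   /* task-parallel Monte Carlo estimate of pi */
   #include <stdio.h>
   #include <stdlib.h>
   #include <mpi.h>

   int main(int argc, char *argv[])
   {
     int rank, size;
     long local_n = 1000000, local_hits = 0, total_hits = 0;

     MPI_Init(&argc, &argv);
     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
     MPI_Comm_size(MPI_COMM_WORLD, &size);

     srand(rank + 1);  /* give each independent task its own stream */
     for (long i = 0; i < local_n; i++) {
       double x = (double) rand() / RAND_MAX;
       double y = (double) rand() / RAND_MAX;
       if (x * x + y * y <= 1.0) local_hits++;
     }

     /* the only communication: collect the independent partial results */
     MPI_Reduce(&local_hits, &total_hits, 1, MPI_LONG, MPI_SUM, 0,
                MPI_COMM_WORLD);

     if (rank == 0)
       printf("pi approx %g\n", 4.0 * total_hits / ((double) local_n * size));

     MPI_Finalize();
     return 0;
   }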

MPI is a message-passing library in which every routine has a corresponding C/C++ binding

   MPI_Command_name

and a Fortran binding (routine names are conventionally written in uppercase, but lowercase is also accepted)

   MPI_COMMAND_NAME
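
As a short illustration of the C/C++ naming convention (this "hello world" program is a sketch, not taken from the text), the calls below follow the MPI_Command_name form; in Fortran the same routines would be written MPI_INIT, MPI_COMM_RANK, and so on.

   #include <stdio.h>
   #include <mpi.h>

   int main(int argc, char *argv[])
   {
     int rank, size;
     MPI_Init(&argc, &argv);                /* MPI_Command_name style */
     MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's number  */
     MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes */
     printf("Hello from rank %d of %d\n", rank, size);
     MPI_Finalize();
     return 0;
   }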