Open MPI User's Mailing List Archives

Subject: [OMPI users] PathScale problems persist
From: Rafael Arco Arredondo (rafaarco_at_[hidden])
Date: 2010-09-21 07:31:34


In January I reported a problem with Open MPI 1.4.1 and PathScale 3.2:
a simple Hello World hung on initialization. Open MPI 1.4.2 does not
show that problem.

However, we are now having trouble with 1.4.2, PathScale 3.2, and
the C++ bindings. The following code:

#include <iostream>
#include <mpi.h>

int main(int argc, char* argv[]) {
  int node, size;

  MPI::Init(argc, argv);

  try {
    int rank = MPI::COMM_WORLD.Get_rank();
    int size = MPI::COMM_WORLD.Get_size();

    std::cout << "Hello world from process " << rank << " out of "
      << size << "!" << std::endl;

  }
  catch(MPI::Exception e) {
    std::cerr << "MPI Error: " << e.Get_error_code()
      << " - " << e.Get_error_string() << std::endl;
  }

  MPI::Finalize();

  return 0;
}

generates the following output:

[host1:29934] *** An error occurred in MPI_Comm_set_errhandler
[host1:29934] *** on communicator MPI_COMM_WORLD
[host1:29934] *** MPI_ERR_COMM: invalid communicator
[host1:29934] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
mpirun has exited due to process rank 2 with PID 29934 on
node host1 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
[host1:29931] 3 more processes have sent help message
help-mpi-errors.txt / mpi_errors_are_fatal
[host1:29931] Set MCA parameter "orte_base_help_aggregate" to 0 to see
all help / error messages
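
For completeness, the remaining aggregated messages can be shown by
disabling help aggregation, as the last line suggests, e.g. (binary
name and process count are placeholders):

mpirun --mca orte_base_help_aggregate 0 -np 4 ./hello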

There are no problems when Open MPI 1.4.2 is built with GCC 4.1.2,
and no problems with Open MPI 1.2.6 and PathScale either.
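
For reference, the test case is built with the Open MPI C++ wrapper
compiler and launched with mpirun, roughly as follows (exact file name
and process count may differ):

mpiCC hello.cc -o hello
mpirun -np 4 ./hello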

Best regards,


Rafael Arco Arredondo
Centro de Servicios de Informática y Redes de Comunicaciones
Campus de Fuentenueva - Edificio Mecenas
Universidad de Granada