
Open MPI User's Mailing List Archives


From: Wael Sinno (wael.sinno_at_[hidden])
Date: 2006-09-13 20:20:28


Hi,

I have an application that uses the UnixODBC library
(http://www.unixodbc.org) together with MPI. When I run a program linked
against UnixODBC, it fails immediately, regardless of what the program
actually does: Open MPI aborts during MPI_Init, which is the first call in
the program.

I tried a simple experiment with the following trivial program, which is
enough to demonstrate the bug:

#include <mpi.h>
#include <iostream>

int main(int argc, char* argv[])
{
   std::cerr << "Initializing MPI" << std::endl;
   MPI_Init(&argc, &argv);   // fails here when linked against UnixODBC
   std::cerr << "MPI Initialized" << std::endl;

   int rank;
   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
   std::cerr << "My rank is : " << rank << std::endl;

   std::cerr << "Shutting down MPI" << std::endl;
   MPI_Finalize();
   return 0;
}

If I compile this normally without UnixODBC, everything is fine:

[wsinno_at_cluster openmpi_bug]$ mpic++ main.cpp
[wsinno_at_cluster openmpi_bug]$ mpiexec -n 2 ./a.out
Initializing MPI
Initializing MPI
MPI Initialized
My rank is : 0
Shutting down MPI
MPI Initialized
My rank is : 1
Shutting down MPI

If I compile and link against UnixODBC, I get the following failure:

[wsinno_at_cluster openmpi_bug]$ mpic++ main.cpp -L UnixODBC/lib -lodbc
[wsinno_at_cluster openmpi_bug]$ mpiexec -n 2 ./a.out
Initializing MPI
[cluster.logicblox.local:02272] [NO-NAME] ORTE_ERROR_LOG: Not found in file
runtime/orte_init_stage1.c at line 214
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  orte_sds_base_select failed
  --> Returned value -13 instead of ORTE_SUCCESS

--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: orte_init_stage1 failed
  --> Returned "Not found" (-13) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (goodbye)

I have tried using iodbc (http://www.iodbc.org) instead, and that seems to
work fine. Attached are the config.log and ompi_info output.
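
For comparison, the working iODBC build only swaps the link line; a rough
sketch of what I used (the iODBC/lib path reflects my local install layout,
so adjust as needed; -liodbc is the iODBC driver manager library):

# iODBC/lib is a guess at the local install prefix
[wsinno_at_cluster openmpi_bug]$ mpic++ main.cpp -L iODBC/lib -liodbc
[wsinno_at_cluster openmpi_bug]$ mpiexec -n 2 ./a.out

With that binary, the output matches the working case above.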

Wael.