
Subject: [OMPI users] Problem with gateway between 2 hosts
From: Geoffroy Pignot (geopignot_at_[hidden])
Date: 2008-06-30 11:29:55


Hi,

Has anybody faced problems running Open MPI on two hosts on different
networks (with a gateway to reach the other)?
Let's say compil02's IP address is 172.3.9.10 and r009n001's is 10.160.4.1.

There is no problem with executables that don't call MPI_Init (for example, hostname):

compil02% /tmp/HALMPI/openmpi-1.2.2/bin/mpirun --prefix /tmp/HALMPI/openmpi-1.2.2 \
    -np 1 -host compil02 hostname : -np 1 -host r009n001 hostname
r009n001
compil02

But as soon as I try a simple hello world, it crashes with the error
message below.
Please note that when I run hello between r009n001 (10.160.4.1) and
r009n002 (10.160.4.2), i.e. two hosts on the same network, it works fine.
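For reference, /tmp/hello is essentially the classic MPI hello world,
along the lines of the sketch below (just a typical version; any program
that calls MPI_Init hits the same failure):

/* Minimal MPI hello world -- a sketch of what /tmp/hello amounts to;
 * the failure occurs inside MPI_Init, before any user communication. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, len;
    char name[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);              /* this is where it aborts */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Get_processor_name(name, &len);
    printf("Hello from rank %d of %d on %s\n", rank, size, name);
    MPI_Finalize();
    return 0;
}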

Thanks in advance for your help.
Regards

Geoffroy

PS: same error with Open MPI v1.2.5

compil02% /tmp/HALMPI/openmpi-1.2.2/bin/mpirun --prefix /tmp/HALMPI/openmpi-1.2.2 \
    -np 1 -host compil02 /tmp/hello : -np 1 -host r009n001 /tmp/hello
--------------------------------------------------------------------------
Process 0.1.0 is unable to reach 0.1.1 for MPI communication.
If you specified the use of a BTL component, you may have
forgotten a component (such as "self") in the list of
usable components.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  PML add procs failed
  --> Returned "Unreachable" (-12) instead of "Success" (0)
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Process 0.1.1 is unable to reach 0.1.0 for MPI communication.
If you specified the use of a BTL component, you may have
forgotten a component (such as "self") in the list of
usable components.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  PML add procs failed
  --> Returned "Unreachable" (-12) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (goodbye)
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (goodbye)