Open MPI User's Mailing List Archives

Subject: [OMPI users] openmpi-1.4.1 Debian Live CD
From: piening (piening_at_[hidden])
Date: 2010-03-10 01:37:32


Hi,

I set up a Linux cluster with different distributions (1x Debian Lenny,
4x openSUSE 11.2) and openmpi-1.4.1, and all my test applications ran perfectly.

Now I decided to create a Debian Live system (Lenny) with openmpi-1.4.1
to include some more PCs in our student pool, and I always get the
following errors:

#: mpirun --hostfile my_hostfile -np 4 hello_c

Hello, world, I am 2 of 4 Dell-19 (256)
Hello, world, I am 0 of 4 Dell-19 (256)
Hello, world, I am 3 of 4 Dlive (256)
Hello, world, I am 1 of 4 Dlive (256)
[Dell-19:9199] *** An error occurred in MPI_Barrier
[Dell-19:9199] *** on communicator MPI_COMM_WORLD
[Dell-19:9199] *** MPI_ERR_IN_STATUS: error code in status
[Dell-19:9199] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 9199 on
node Dell-19 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Dell-19:09196] 1 more process has sent help message help-mpi-errors.txt
/ mpi_errors_are_fatal
[Dell-19:09196] Set MCA parameter "orte_base_help_aggregate" to 0 to see
all help / error messages

I've run out of ideas for fixing this.

Thanks in advance

        horst.

-- 
 Horst Piening
 IT-Administrator Fb15
 Universitaet Hamburg
 Bundesstr. 55
 20146 Hamburg
 Email: piening_at_[hidden]
 Tel.: (040) 42838-7015