Open MPI User's Mailing List Archives


From: Kobotov, Alexander V (alexander.v.kobotov_at_[hidden])
Date: 2007-02-20 04:25:08


Hello all,

 

I built BLACS on Itanium using the Intel compilers under Linux (2.6.9-34.EL), but it fails the default BLACS Fortran tests (xFbtest), while the C tests (xCbtest) pass. I've tried different configurations combining OpenMPI-1.1.2 or OpenMPI-1.1.4, ICC 9.1.038 or ICC 8.1.38, and IFORT 9.1.33 or IFORT 8.1.34, but the results were always the same. OpenMPI itself is built with the 9.1 compilers. I also tried the same thing with the em64t compilers on an Intel Xeon, where all tests passed. MPICH2 on IPF also works fine.

 

Is this an OpenMPI bug, or is there a known workaround?

 

Bmake.inc is attached.

Below is the output I got (don't pay attention to the BLACS warnings; they are normal for MPI):

===[ begin of: xFbtest output ]=====================================

-bash-3.00$ mpirun -np 4 xFbtest_MPI-LINUX-0
BLACS WARNING 'No need to set message ID range due to MPI communicator.'
from {-1,-1}, pnum=1, Contxt=-1, on line 18 of file 'blacs_set_.c'.

BLACS WARNING 'No need to set message ID range due to MPI communicator.'
from {-1,-1}, pnum=3, Contxt=-1, on line 18 of file 'blacs_set_.c'.

BLACS WARNING 'No need to set message ID range due to MPI communicator.'
from {-1,-1}, pnum=0, Contxt=-1, on line 18 of file 'blacs_set_.c'.

BLACS WARNING 'No need to set message ID range due to MPI communicator.'
from {-1,-1}, pnum=2, Contxt=-1, on line 18 of file 'blacs_set_.c'.

[comp-pvfs-0-7.local:30119] *** An error occurred in MPI_Comm_group
[comp-pvfs-0-7.local:30118] *** An error occurred in MPI_Comm_group
[comp-pvfs-0-7.local:30118] *** on communicator MPI_COMM_WORLD
[comp-pvfs-0-7.local:30118] *** MPI_ERR_COMM: invalid communicator
[comp-pvfs-0-7.local:30119] *** on communicator MPI_COMM_WORLD
[comp-pvfs-0-7.local:30119] *** MPI_ERR_COMM: invalid communicator
[comp-pvfs-0-7.local:30119] *** MPI_ERRORS_ARE_FATAL (goodbye)
[comp-pvfs-0-7.local:30116] *** An error occurred in MPI_Comm_group
[comp-pvfs-0-7.local:30116] *** on communicator MPI_COMM_WORLD
[comp-pvfs-0-7.local:30118] *** MPI_ERRORS_ARE_FATAL (goodbye)
[comp-pvfs-0-7.local:30116] *** MPI_ERR_COMM: invalid communicator
[comp-pvfs-0-7.local:30116] *** MPI_ERRORS_ARE_FATAL (goodbye)
[comp-pvfs-0-7.local:30117] *** An error occurred in MPI_Comm_group
[comp-pvfs-0-7.local:30117] *** on communicator MPI_COMM_WORLD
[comp-pvfs-0-7.local:30117] *** MPI_ERR_COMM: invalid communicator
[comp-pvfs-0-7.local:30117] *** MPI_ERRORS_ARE_FATAL (goodbye)
forrtl: error (78): process killed (SIGTERM)
forrtl: error (78): process killed (SIGTERM)
forrtl: error (78): process killed (SIGTERM)
forrtl: error (78): process killed (SIGTERM)

===[ end of: xFbtest output ]=====================================
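
If it matters, my understanding is that the TRANSCOMM macro in Bmake.inc controls how BLACS passes Fortran communicator handles into its C internals. With MPICH-style settings BLACS treats an MPI_Comm as a plain integer, whereas in OpenMPI it is a pointer, which could produce exactly this kind of MPI_ERR_COMM failure in MPI_Comm_group. A sketch of the setting I would expect OpenMPI to need (an unverified guess on my part; my actual values are in the attached Bmake.inc):

```make
#  Hypothetical Bmake.inc fragment: have BLACS use the MPI-2
#  MPI_Comm_f2c()/MPI_Comm_c2f() routines to translate Fortran
#  communicator handles instead of assuming MPI_Comm is an integer.
TRANSCOMM = -DUseMpi2
#  With MPICH-style settings (e.g. -DUseF77Mpi or -DCSameF77),
#  BLACS would hand OpenMPI a Fortran integer where a C handle
#  is expected, yielding "MPI_ERR_COMM: invalid communicator".
```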

 

 

W.B.R.,

Kobotov Alexander

  • application/octet-stream attachment: Bmake.inc