Open MPI User's Mailing List Archives

From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2007-03-06 10:12:10


Sorry for the delay in replying -- we've been quite busy trying to
get OMPI v1.2 out the door!

Are you sure that you built BLACS properly with Open MPI? Check this
FAQ item:

    http://www.open-mpi.org/faq/?category=mpi-apps#blacs

In particular, note that there are items in Bmake.inc that you need
to set correctly, or BLACS won't work properly with Open MPI.
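For reference, the communicator-related settings that FAQ item covers
look roughly like the sketch below. The paths are placeholders and
the INTFACE value depends on your Fortran compiler's symbol-naming
convention, so treat this as an illustration rather than a drop-in
file:

    # ---- Bmake.inc fragment (illustrative; adjust for your install) ----
    # Open MPI install prefix; mpif.h is under $(MPIdir)/include.
    MPIdir    = /path/to/openmpi
    MPIINCdir = $(MPIdir)/include
    # These can stay empty when compiling with the mpicc/mpif77
    # wrappers, which add the MPI libraries themselves.
    MPILIBdir =
    MPILIB    =

    # Use the MPI-2 handle conversions (MPI_Comm_f2c / MPI_Comm_c2f)
    # to translate Fortran communicators to C ones.  A TRANSCOMM value
    # meant for a different MPI implementation typically breaks only
    # the Fortran tests, with MPI_ERR_COMM errors much like those below.
    TRANSCOMM = -DUseMpi2

    # Fortran-from-C symbol convention: one of -DAdd_, -DNoChange,
    # -DUpCase, or -Df77IsF2C, depending on your compiler.
    INTFACE   = -DAdd_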

On Feb 20, 2007, at 4:25 AM, Kobotov, Alexander V wrote:

> Hello all,
>
>
>
> I built BLACS on Itanium using Intel compilers under Linux
> (2.6.9-34.EL), but it fails the default BLACS Fortran tests
> (xFbtest); the C tests (xCbtest) are OK. I've tried different
> configurations combining Open MPI 1.1.2 or 1.1.4, ICC 9.1.038 or
> ICC 8.1.38, and IFORT 9.1.33 or IFORT 8.1.34, but all the results
> were the same. Open MPI was built with the 9.1 compilers. I've
> also tried the same thing with the EM64T compiler on an Intel
> Xeon, and all tests passed. MPICH2 on IPF also works fine.
>
>
>
> Is this an Open MPI bug? Is there a workaround?
>
>
>
> Bmake.inc is attached.
>
> Below is the output I got (don't pay attention to the BLACS
> warnings; they are normal for MPI):
>
> ===[ begin of: xFbtest output ]=====================================
>
> -bash-3.00$ mpirun -np 4 xFbtest_MPI-LINUX-0
>
> BLACS WARNING 'No need to set message ID range due to MPI communicator.'
> from {-1,-1}, pnum=1, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>
> BLACS WARNING 'No need to set message ID range due to MPI communicator.'
> from {-1,-1}, pnum=3, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>
> BLACS WARNING 'No need to set message ID range due to MPI communicator.'
> from {-1,-1}, pnum=0, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>
> BLACS WARNING 'No need to set message ID range due to MPI communicator.'
> from {-1,-1}, pnum=2, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>
> [comp-pvfs-0-7.local:30119] *** An error occurred in MPI_Comm_group
> [comp-pvfs-0-7.local:30118] *** An error occurred in MPI_Comm_group
> [comp-pvfs-0-7.local:30118] *** on communicator MPI_COMM_WORLD
> [comp-pvfs-0-7.local:30118] *** MPI_ERR_COMM: invalid communicator
> [comp-pvfs-0-7.local:30119] *** on communicator MPI_COMM_WORLD
> [comp-pvfs-0-7.local:30119] *** MPI_ERR_COMM: invalid communicator
> [comp-pvfs-0-7.local:30119] *** MPI_ERRORS_ARE_FATAL (goodbye)
> [comp-pvfs-0-7.local:30116] *** An error occurred in MPI_Comm_group
> [comp-pvfs-0-7.local:30116] *** on communicator MPI_COMM_WORLD
> [comp-pvfs-0-7.local:30118] *** MPI_ERRORS_ARE_FATAL (goodbye)
> [comp-pvfs-0-7.local:30116] *** MPI_ERR_COMM: invalid communicator
> [comp-pvfs-0-7.local:30116] *** MPI_ERRORS_ARE_FATAL (goodbye)
> [comp-pvfs-0-7.local:30117] *** An error occurred in MPI_Comm_group
> [comp-pvfs-0-7.local:30117] *** on communicator MPI_COMM_WORLD
> [comp-pvfs-0-7.local:30117] *** MPI_ERR_COMM: invalid communicator
> [comp-pvfs-0-7.local:30117] *** MPI_ERRORS_ARE_FATAL (goodbye)
> forrtl: error (78): process killed (SIGTERM)
> forrtl: error (78): process killed (SIGTERM)
> forrtl: error (78): process killed (SIGTERM)
> forrtl: error (78): process killed (SIGTERM)
>
> ===[ end of: xFbtest output ]=====================================
>
> W.B.R.,
> Kobotov Alexander
>
> <Bmake.inc>

-- 
Jeff Squyres
Server Virtualization Business Unit
Cisco Systems