Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] MPI_Allreduce on local machine
From: Gus Correa (gus_at_[hidden])
Date: 2010-07-28 11:48:46


Hi Hugo, Jeff, list

Hugo: I think David Zhang's suggestion was to use MPI_REAL8
(not MPI_REAL) instead of MPI_DOUBLE_PRECISION in your
MPI_Allreduce call.
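
For what it is worth, here is a minimal sketch of what I believe
he meant (untested; it assumes real(kind=8) is your compiler's
8-byte real). The point is that the buffer declaration and the
MPI datatype must agree:

  program test_real8
    use mpi
    implicit none
    integer :: ierr, nproc, myrank
    real(kind=8) :: inside(5), outside(5)
    call mpi_init(ierr)
    call mpi_comm_size(mpi_comm_world, nproc, ierr)
    call mpi_comm_rank(mpi_comm_world, myrank, ierr)
    inside = (/ 1.d0, 2.d0, 3.d0, 4.d0, 5.d0 /)
    ! real(kind=8) buffers paired with MPI_REAL8 (not MPI_REAL)
    call mpi_allreduce(inside, outside, 5, mpi_real8, mpi_sum, &
                       mpi_comm_world, ierr)
    if (myrank == 0) print *, outside
    call mpi_finalize(ierr)
  end program test_real8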

Still, to me it looks like OpenMPI is making double precision
4 bytes long, which is shorter than I expected it to be (8 bytes),
at least judging from the output of ompi_info --all.

I always get a size of 4 for dbl prec on my x86_64 machine
from ompi_info --all.
I confirmed this in six builds of OpenMPI 1.4.2: gcc+gfortran,
gcc+pgf90, gcc+ifort, icc+ifort, pgcc+pgf90, and opencc+openf95.
However, the output of ompi_info never says this is actually the size
of MPI_DOUBLE_PRECISION, just of "dbl prec", which is a bit ambiguous.
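
One way to remove the ambiguity is to ask the library itself at
run time with MPI_TYPE_SIZE; a quick sketch (untested):

  program check_dbl
    use mpi
    implicit none
    integer :: ierr, sz
    call mpi_init(ierr)
    ! reports how many bytes MPI uses for MPI_DOUBLE_PRECISION
    call mpi_type_size(mpi_double_precision, sz, ierr)
    print *, 'MPI_DOUBLE_PRECISION size in bytes:', sz
    call mpi_finalize(ierr)
  end program check_dbl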

FWIW, I include the output below. Note that the alignment for
gcc+ifort is 1, while all the others are 4.

Jeff: Is this correct?

Thanks,
Gus Correa

$ openmpi/1.4.2/open64-4.2.3-0/bin/ompi_info --all | grep -i dbl
       Fort dbl prec size: 4
       Fort dbl cplx size: 4
      Fort dbl prec align: 4
      Fort dbl cplx align: 4
$ openmpi/1.4.2/gnu-4.1.2/bin/ompi_info --all | grep -i dbl
       Fort dbl prec size: 4
       Fort dbl cplx size: 4
      Fort dbl prec align: 4
      Fort dbl cplx align: 4
$ openmpi/1.4.2/gnu-4.1.2-intel-10.1.017/bin/ompi_info --all | grep -i dbl
       Fort dbl prec size: 4
       Fort dbl cplx size: 4
      Fort dbl prec align: 1
      Fort dbl cplx align: 1
$ openmpi/1.4.2/gnu-4.1.2-pgi-8.0-4/bin/ompi_info --all | grep -i dbl
       Fort dbl prec size: 4
       Fort dbl cplx size: 4
      Fort dbl prec align: 4
      Fort dbl cplx align: 4
$ openmpi/1.4.2/pgi-8.0-4/bin/ompi_info --all | grep -i dbl
       Fort dbl prec size: 4
       Fort dbl cplx size: 4
      Fort dbl prec align: 4
      Fort dbl cplx align: 4

Hugo Gagnon wrote:
> And how do I know how big my data buffer is? I ran MPI_TYPE_EXTENT of
> MPI_DOUBLE_PRECISION and the result was 8. So I changed my program to:
>
> 1 program test
> 2
> 3 use mpi
> 4
> 5 implicit none
> 6
> 7 integer :: ierr, nproc, myrank
> 8 !integer, parameter :: dp = kind(1.d0)
> 9 real(kind=8) :: inside(5), outside(5)
> 10
> 11 call mpi_init(ierr)
> 12 call mpi_comm_size(mpi_comm_world, nproc, ierr)
> 13 call mpi_comm_rank(mpi_comm_world, myrank, ierr)
> 14
> 15 inside = (/ 1., 2., 3., 4., 5. /)
> 16 call mpi_allreduce(inside, outside, 5, mpi_real, mpi_sum, mpi_comm_world, ierr)
> 17
> 18 if (myrank == 0) then
> 19 print*, outside
> 20 end if
> 21
> 22 call mpi_finalize(ierr)
> 23
> 24 end program test
>
> but I still get a SIGSEGV fault:
>
> forrtl: severe (174): SIGSEGV, segmentation fault occurred
> Image              PC                Routine   Line     Source
> libmpi.0.dylib     00000001001BB4B7  Unknown   Unknown  Unknown
> libmpi_f77.0.dyli  00000001000AF046  Unknown   Unknown  Unknown
> a.out              0000000100000D87  _MAIN__   16       test.f90
> a.out              0000000100000C9C  Unknown   Unknown  Unknown
> a.out              0000000100000C34  Unknown   Unknown  Unknown
> forrtl: severe (174): SIGSEGV, segmentation fault occurred
> Image              PC                Routine   Line     Source
> libmpi.0.dylib     00000001001BB4B7  Unknown   Unknown  Unknown
> libmpi_f77.0.dyli  00000001000AF046  Unknown   Unknown  Unknown
> a.out              0000000100000D87  _MAIN__   16       test.f90
> a.out              0000000100000C9C  Unknown   Unknown  Unknown
> a.out              0000000100000C34  Unknown   Unknown  Unknown
>
> What is wrong now?