Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] MPI_COMPLEX16
From: David Singleton (David.Singleton_at_[hidden])
Date: 2012-04-26 18:54:00


I should have checked earlier: the same error occurs for MPI_COMPLEX and MPI_COMPLEX8.
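For anyone who wants to see which handles a given build rejects without tripping MPI_ERRORS_ARE_FATAL, here is a minimal probe sketch (my addition, not part of the original report). Errors not tied to a communicator are raised on MPI_COMM_WORLD's error handler, so switching that to MPI_ERRORS_RETURN lets MPI_TYPE_SIZE return an error code instead of aborting. It assumes the MPI_COMPLEX8/MPI_COMPLEX16 handles are defined in mpif.h, as they are in these builds:

program probe_complex_types

   implicit none
   include 'mpif.h'
   integer :: ierr

   call MPI_INIT (ierr)
   ! Datatype errors are raised on MPI_COMM_WORLD's error handler,
   ! so switch it to MPI_ERRORS_RETURN to get error codes back
   ! instead of an abort.
   call MPI_COMM_SET_ERRHANDLER (MPI_COMM_WORLD, MPI_ERRORS_RETURN, ierr)

   call probe (MPI_COMPLEX,        'MPI_COMPLEX')
   call probe (MPI_COMPLEX8,       'MPI_COMPLEX8')
   call probe (MPI_COMPLEX16,      'MPI_COMPLEX16')
   call probe (MPI_DOUBLE_COMPLEX, 'MPI_DOUBLE_COMPLEX')

   call MPI_FINALIZE (ierr)

contains

   subroutine probe (dtype, name)
      integer, intent(in) :: dtype
      character(len=*), intent(in) :: name
      integer :: tsize, ierr2
      ! MPI_TYPE_SIZE fails with MPI_ERR_TYPE on an invalid handle.
      call MPI_TYPE_SIZE (dtype, tsize, ierr2)
      if ( ierr2 == MPI_SUCCESS ) then
         print*, name, ' accepted, size =', tsize
      else
         print*, name, ' rejected, error =', ierr2
      end if
   end subroutine probe

end program probe_complex_types

A fallback along the same lines, selecting MPI_DOUBLE_COMPLEX when MPI_COMPLEX16 is rejected, is sketched after the quoted test program below.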

David

On 04/27/2012 08:43 AM, David Singleton wrote:
>
> Apologies if this has already been covered somewhere. One of our users
> has noticed that MPI_COMPLEX16 is flagged as an invalid type in 1.5.4
> but not in 1.4.3, while MPI_DOUBLE_COMPLEX is accepted by both. This is
> with either gfortran or intel-fc. Superficially, the configure output
> looks the same for 1.4.3 and 1.5.4, e.g.
> % grep COMPLEX16 opal/include/opal_config.h
> #define OMPI_HAVE_F90_COMPLEX16 1
> #define OMPI_HAVE_FORTRAN_COMPLEX16 1
>
> Their test code (appended below) produces:
>
> % module load openmpi/1.4.3
> % mpif90 mpi_complex_test.f90
> % mpirun -np 2 ./a.out
> SUM1 (3.00000000000000,-1.00000000000000)
> SUM2 (3.00000000000000,-1.00000000000000)
> % module swap openmpi/1.5.4
> % mpif90 mpi_complex_test.f90
> % mpirun -np 2 ./a.out
> [vayu1:1935] *** An error occurred in MPI_Reduce
> [vayu1:1935] *** on communicator MPI_COMM_WORLD
> [vayu1:1935] *** MPI_ERR_TYPE: invalid datatype
> [vayu1:1935] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
> SUM1 (3.00000000000000,-1.00000000000000)
>
> Thanks for any help,
> David
>
>
> program mpi_test
>
>    implicit none
>    include 'mpif.h'
>    integer, parameter :: r8 = selected_real_kind(12)
>    complex(kind=r8) :: local, global
>    integer :: ierr, myid, nproc
>
>    call MPI_INIT (ierr)
>    call MPI_COMM_RANK (MPI_COMM_WORLD, myid, ierr)
>    call MPI_COMM_SIZE (MPI_COMM_WORLD, nproc, ierr)
>
>    local = cmplx(myid+1.0, myid-1.0, kind=r8)
>    call MPI_REDUCE (local, global, 1, MPI_DOUBLE_COMPLEX, MPI_SUM, 0, &
>                     MPI_COMM_WORLD, ierr)
>    if ( myid == 0 ) then
>       print*, 'SUM1', global
>    end if
>
>    call MPI_REDUCE (local, global, 1, MPI_COMPLEX16, MPI_SUM, 0, &
>                     MPI_COMM_WORLD, ierr)
>    if ( myid == 0 ) then
>       print*, 'SUM2', global
>    end if
>
>    call MPI_FINALIZE (ierr)
>
> end program mpi_test
>
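Until the cause is pinned down, one possible workaround is to resolve the reduction type once at startup and fall back to MPI_DOUBLE_COMPLEX whenever MPI_COMPLEX16 is rejected. A minimal sketch (again my own, using the same MPI_ERRORS_RETURN probe as above, not a confirmed fix):

program mpi_test_fallback

   implicit none
   include 'mpif.h'
   integer, parameter :: r8 = selected_real_kind(12)
   complex(kind=r8) :: local, global
   integer :: ierr, myid, tsize, cplx_type

   call MPI_INIT (ierr)
   call MPI_COMM_RANK (MPI_COMM_WORLD, myid, ierr)

   ! Probe MPI_COMPLEX16 once; if this build rejects it, fall back
   ! to MPI_DOUBLE_COMPLEX, which both 1.4.3 and 1.5.4 accept.
   call MPI_COMM_SET_ERRHANDLER (MPI_COMM_WORLD, MPI_ERRORS_RETURN, ierr)
   call MPI_TYPE_SIZE (MPI_COMPLEX16, tsize, ierr)
   if ( ierr == MPI_SUCCESS ) then
      cplx_type = MPI_COMPLEX16
   else
      cplx_type = MPI_DOUBLE_COMPLEX
   end if
   ! Restore the default error handler after probing.
   call MPI_COMM_SET_ERRHANDLER (MPI_COMM_WORLD, MPI_ERRORS_ARE_FATAL, ierr)

   local = cmplx(myid+1.0, myid-1.0, kind=r8)
   call MPI_REDUCE (local, global, 1, cplx_type, MPI_SUM, 0, &
                    MPI_COMM_WORLD, ierr)
   if ( myid == 0 ) then
      print*, 'SUM', global
   end if

   call MPI_FINALIZE (ierr)

end program mpi_test_fallback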