Subject: [OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE
From: Ricardo Fonseca (ricardo.fonseca_at_[hidden])
Date: 2009-07-27 09:42:53


Hi guys

I'm having a little trouble using MPI_IN_PLACE with MPI_REDUCE /
MPI_ALLREDUCE in Fortran. If I use MPI_IN_PLACE with the C bindings, it
works fine running on 2 nodes:

Result:
3.000000 3.000000 3.000000 3.000000
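
For reference, the C version is essentially the following (a minimal
sketch of the test I ran; variable names are illustrative):

---
#include <stdio.h>
#include <mpi.h>

int main( int argc, char *argv[] )
{
    int i, rank;
    float buffer[4];

    MPI_Init( &argc, &argv );
    MPI_Comm_rank( MPI_COMM_WORLD, &rank );

    /* every rank contributes rank + 1 in each position */
    for ( i = 0; i < 4; i++ ) buffer[i] = rank + 1;

    /* root passes MPI_IN_PLACE as sendbuf and receives into buffer;
       on non-root ranks the recvbuf argument is ignored */
    if ( rank == 0 )
        MPI_Reduce( MPI_IN_PLACE, buffer, 4, MPI_FLOAT, MPI_SUM,
                    0, MPI_COMM_WORLD );
    else
        MPI_Reduce( buffer, NULL, 4, MPI_FLOAT, MPI_SUM,
                    0, MPI_COMM_WORLD );

    if ( rank == 0 )
        printf( "Result:\n %f %f %f %f\n",
                buffer[0], buffer[1], buffer[2], buffer[3] );

    MPI_Finalize();
    return 0;
}
---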

This holds regardless of whether I use MPI_Reduce or MPI_Allreduce.
However, the Fortran version fails:

Result:
2.000000 2.000000 2.000000 2.000000

Apparently, MPI is ignoring the values from the root node. Here is the
source for the Fortran version:

---
program inplace
   use mpi
   implicit none
   integer :: ierr, rank, rsize, bsize
   real, dimension( 2, 2 ) :: buffer, out

   call MPI_INIT( ierr )
   call MPI_COMM_RANK( MPI_COMM_WORLD, rank, ierr )
   call MPI_COMM_SIZE( MPI_COMM_WORLD, rsize, ierr )

   ! every rank contributes rank + 1 in each position
   buffer = rank + 1
   bsize = size( buffer, 1 ) * size( buffer, 2 )

   if ( rank == 0 ) then
     ! root reduces in place: buffer is both send and receive buffer
     call mpi_reduce( MPI_IN_PLACE, buffer, bsize, MPI_REAL, MPI_SUM, &
                      0, MPI_COMM_WORLD, ierr )
   else
     ! recvbuf is ignored on non-root ranks, but pass a real array
     call mpi_reduce( buffer, out, bsize, MPI_REAL, MPI_SUM, &
                      0, MPI_COMM_WORLD, ierr )
   endif

   ! use allreduce instead
   ! call mpi_allreduce( MPI_IN_PLACE, buffer, bsize, MPI_REAL, &
   !                     MPI_SUM, MPI_COMM_WORLD, ierr )

   if ( rank == 0 ) then
     print *, 'Result:'
     print *, buffer
   endif

   call mpi_finalize( ierr )
end program inplace
---
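
For the record, both versions are built with the Open MPI wrapper
compilers (mpicc / mpif90) and launched with mpirun -np 2.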
Any ideas?
Cheers,
Ricardo
---
Prof. Ricardo Fonseca
GoLP - Grupo de Lasers e Plasmas
Instituto de Plasmas e Fusão Nuclear
Instituto Superior Técnico
Av. Rovisco Pais
1049-001 Lisboa
Portugal
tel: +351 21 8419202
fax: +351 21 8464455
web: http://cfp.ist.utl.pt/golp/