
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] MPI_sendrecv = MPI_Send+ MPI_RECV ?
From: Enrico Barausse (enrico.barausse_at_[hidden])
Date: 2008-09-14 07:13:35


sorry, I hadn't changed the subject. I'm reposting:

Hi

I think it's correct. What I want to do is send a 3-element array from
process 1 to process 0 (= root):

call MPI_Send(toroot,3,MPI_DOUBLE_PRECISION,root,n,MPI_COMM_WORLD,ierr)

In some other part of the code, process 0 acts on the 3-element array,
turns it into a 4-element one, and sends it back to process 1, which
receives it with

call MPI_RECV(tonode,4,MPI_DOUBLE_PRECISION,root,n,MPI_COMM_WORLD,status,ierr)

In practice, what I do is basically given by this simple code (which,
unfortunately, does not reproduce the segmentation fault):

       program sendrecv_test
       use mpi
       implicit none
       integer :: a(5), b(4)
       integer :: id, numprocs, ierr, k
       integer :: status(MPI_STATUS_SIZE)

       a = (/1, 2, 3, 4, 5/)

       call MPI_INIT(ierr)
       call MPI_COMM_RANK(MPI_COMM_WORLD, id, ierr)
       call MPI_COMM_SIZE(MPI_COMM_WORLD, numprocs, ierr)

       ! this test needs exactly two processes
       if (numprocs /= 2) stop

       if (id == 0) then
               do k = 1, 5
                       a = a + 1
                       call MPI_SEND(a, 5, MPI_INTEGER, 1, k, MPI_COMM_WORLD, ierr)
                       call MPI_RECV(b, 4, MPI_INTEGER, 1, k, MPI_COMM_WORLD, status, ierr)
               end do
       else
               do k = 1, 5
                       call MPI_RECV(a, 5, MPI_INTEGER, 0, k, MPI_COMM_WORLD, status, ierr)
                       b = a(1:4)
                       call MPI_SEND(b, 4, MPI_INTEGER, 0, k, MPI_COMM_WORLD, ierr)
               end do
       end if

       call MPI_FINALIZE(ierr)
       end program sendrecv_test
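Since the subject line asks whether MPI_Sendrecv equals MPI_Send + MPI_Recv: a matched blocking send/receive pair on one rank can be collapsed into a single MPI_SENDRECV call, which lets the library pair the two transfers and removes the risk of two blocking MPI_SEND calls deadlocking against each other for large messages. A minimal sketch, assuming the same a/b layout and two-process setup as the snippet above (the program name is mine):

```fortran
program sendrecv_sketch
   use mpi
   implicit none
   integer :: a(5), b(4), id, numprocs, ierr, k
   integer :: status(MPI_STATUS_SIZE)

   a = (/1, 2, 3, 4, 5/)
   call MPI_INIT(ierr)
   call MPI_COMM_RANK(MPI_COMM_WORLD, id, ierr)
   call MPI_COMM_SIZE(MPI_COMM_WORLD, numprocs, ierr)
   if (numprocs /= 2) stop

   if (id == 0) then
      do k = 1, 5
         a = a + 1
         ! one call replaces the MPI_SEND + MPI_RECV pair on rank 0:
         ! send 5 integers to rank 1 and receive its 4-integer reply
         call MPI_SENDRECV(a, 5, MPI_INTEGER, 1, k, &
                           b, 4, MPI_INTEGER, 1, k, &
                           MPI_COMM_WORLD, status, ierr)
      end do
   else
      do k = 1, 5
         call MPI_RECV(a, 5, MPI_INTEGER, 0, k, MPI_COMM_WORLD, status, ierr)
         b = a(1:4)
         call MPI_SEND(b, 4, MPI_INTEGER, 0, k, MPI_COMM_WORLD, ierr)
      end do
   end if

   call MPI_FINALIZE(ierr)
end program sendrecv_sketch
```

Note that MPI_SENDRECV takes separate send and receive buffers, counts, and tags, so the asymmetric sizes (5 out, 4 back) are expressed directly in the one call.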