Open MPI User's Mailing List Archives

From: Jeff Squyres (jsquyres) (jsquyres_at_[hidden])
Date: 2006-04-08 07:22:40


John --

I am unable to replicate your problem with Open MPI 1.0.2 using the
following simple program:

-----
      program ring_f77
      include 'mpif.h'
      integer n, i, rank, size, ierr
      real*8 data2(10), data(10)

      call mpi_init(ierr)

      call mpi_comm_rank(MPI_COMM_WORLD, rank, ierr)
      call mpi_comm_size(MPI_COMM_WORLD, size, ierr)

c     Fill the send buffer so the reduction operates on defined values
      n = 10
      do i = 1, n
         data(i) = rank
      end do

c     Sum the 10 doubles across all ranks; this is the call that
c     aborts with MPI_ERR_OP in the reported failure
      call MPI_Allreduce( data, data2, n, MPI_DOUBLE_PRECISION,
     &                    MPI_SUM, MPI_COMM_WORLD, ierr )

      call mpi_finalize(ierr)
      end
-----

Can you replicate the problem with this small program?
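For reference, this is roughly how such a reproducer can be built and
run with the Open MPI wrapper compilers (the file name and process
count below are just placeholders):

-----
# compile the F77 test program with the Open MPI wrapper compiler
mpif77 ring_f77.f -o ring_f77

# run it on a few processes; the MPI_ERR_OP abort, if it occurs,
# is triggered by the MPI_Allreduce call
mpirun -np 4 ./ring_f77
-----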

> -----Original Message-----
> From: users-bounces_at_[hidden]
> [mailto:users-bounces_at_[hidden]] On Behalf Of john casu
> Sent: Thursday, April 06, 2006 6:07 PM
> To: Open MPI Users
> Subject: Re: [OMPI users] MPI_Allreduce error in 1.0.1 and 1.0.2rc1
>
> On Thu, 2006-04-06 at 15:48 -0400, George Bosilca wrote:
> > The error states that you are trying to use an invalid operation. MPI
> > defines which operations can be applied to which predefined
> > data-types. Do you know which operation is used there? And which
> > predefined data-type?
> >
>
> Please forgive me for not including the relevant offending Fortran
> code, which is as follows:
>
>
> #include "mpif.h"
> :
> :
> real*8 data2(10), data(10)
> integer msg_comm
> integer n, ierr
> :
> :
> call MPI_Allreduce( data, data2, n, MPI_DOUBLE_PRECISION,
> & MPI_SUM, msg_comm, ierr )
>
>
> Further investigation shows that the error only seems to show
> itself on
> an x86_64 box, and the code works fine on an ia32 box.
>
> > Thanks,
> > george.
> >
> > On Apr 6, 2006, at 2:41 PM, john casu wrote:
> >
> > > I'm trying to work with the sPPM code from LLNL:
> > >
> http://www.llnl.gov/asci_benchmarks/asci/limited/ppm/asci_sppm.html
> > >
> > > I built Open MPI and sPPM on an 8-way shared-memory Linux box.
> > >
> > > The error I get is:
> > > [ty20:07732] *** An error occurred in MPI_Allreduce
> > > [ty20:07732] *** on communicator MPI_COMM_WORLD
> > > [ty20:07732] *** MPI_ERR_OP: invalid reduce operation
> > > [ty20:07732] *** MPI_ERRORS_ARE_FATAL (goodbye)
> > >
> > >
> > > it happens in both 1.0.1 and 1.0.2rc1
> > >
> > > thanks,
> > >
> > > john.
> > >
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users
>