Open MPI User's Mailing List Archives

From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2005-11-08 12:16:38


We discovered this exact same problem just a few days ago. It was
fixed in r8010 (slightly after your version). Would you mind updating
and trying again? (thanks for your patience...)

There are also some datatype fixes pending on the trunk right now that
have not yet been fully vetted and brought over to the v1.0 branch. As
far as I know, they do not affect the NASA Overflow code, though.

On Nov 8, 2005, at 11:29 AM, Borenstein, Bernard S wrote:

> I am again trying to build and run the NASA Overflow 1.8ab version
> using Open MPI and have run into this error message:
>
> [hsd653:05053] *** An error occurred in MPI_Allreduce: the reduction
> operation MPI_OP_MIN is not defined on the MPI_DBLPREC datatype
> [hsd653:05053] *** on communicator MPI_COMM_WORLD
> [hsd653:05053] *** MPI_ERR_OP: invalid reduce operation
> [hsd653:05053] *** MPI_ERRORS_ARE_FATAL (goodbye)
> [hsd652:04317] *** An error occurred in MPI_Allreduce: the reduction
> operation MPI_OP_MIN is not defined on the MPI_DBLPREC datatype
> [hsd652:04317] *** on communicator MPI_COMM_WORLD
> [hsd652:04317] *** MPI_ERR_OP: invalid reduce operation
> [hsd652:04317] *** MPI_ERRORS_ARE_FATAL (goodbye)
>
> mpirun: killing job...
>
> --------------------------------------------------------------------------
>
> WARNING: mpirun encountered an abnormal exit.
>
> This means that mpirun exited before it received notification that all
> started processes had terminated.  You should double check and ensure
> that there are no runaway processes still executing.
>
> Any ideas why this is happening??
>
> Bernie Borenstein
>
> The Boeing Company
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users

-- 
{+} Jeff Squyres
{+} The Open MPI Project
{+} http://www.open-mpi.org/