
Subject: Re: [OMPI users] Error occurred in MPI_Allreduce on communicator MPI_COMM_WORLD
From: hi (hiralsmaillist_at_[hidden])
Date: 2011-05-06 08:28:30

Hi Peter,

Thanks, that helped a lot.
BTW: please let me know if you have any comments on the "Windows:
MPI_Allreduce() crashes when using MPI_DOUBLE_PRECISION" thread.

Thank you.

2011/5/4 Peter Kjellström <cap_at_[hidden]>:
> On Wednesday, May 04, 2011 04:04:37 PM hi wrote:
>> Greetings !!!
>> I am observing the following error messages when executing the
>> attached test program...
>> C:\test>mpirun mar_f.exe
> ...
>> [vbgyor:9920] *** An error occurred in MPI_Allreduce
>> [vbgyor:9920] *** on communicator MPI_COMM_WORLD
>> [vbgyor:9920] *** MPI_ERR_OP: invalid reduce operation
>> [vbgyor:9920] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
> I'm not a Fortran programmer, but it seems to me that placing the MPI_Allreduce
> call in a subroutine like that broke the meaning of MPI_SUM and MPI_REAL in
> that scope. Adding:
>  include 'mpif.h'
> after SUBROUTINE PAR_BLAS2(m, n, a, b, c, comm) helps.
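> For reference: without the include, Fortran's implicit typing treats
> MPI_SUM and MPI_REAL as undefined local INTEGERs, so the op and
> datatype handles passed to MPI_Allreduce are garbage, which is why
> Open MPI reports MPI_ERR_OP. A minimal sketch of the fixed subroutine,
> assuming PAR_BLAS2 is the usual matrix-vector product example from the
> MPI standard (your attached test program may differ):
>
>       SUBROUTINE PAR_BLAS2(m, n, a, b, c, comm)
>       include 'mpif.h'
>       INTEGER m, n, comm, i, j, ierr
>       REAL a(m), b(m,n), c(n), sum(n)
> C     local partial matrix-vector product
>       DO j = 1, n
>          sum(j) = 0.0
>          DO i = 1, m
>             sum(j) = sum(j) + a(i)*b(i,j)
>          END DO
>       END DO
> C     global sum; MPI_REAL and MPI_SUM are now defined via mpif.h
>       CALL MPI_ALLREDUCE(sum, c, n, MPI_REAL, MPI_SUM, comm, ierr)
>       RETURN
>       END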
> /Peter