Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] pgi and gcc runtime compatibility
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2008-12-10 16:36:20


FWIW, the README describes some of the Fortran issues (see the sketches after the excerpt for concrete illustrations):

- Open MPI will build bindings suitable for all common forms of
   Fortran 77 compiler symbol mangling on platforms that support it
   (e.g., Linux). On platforms that do not support weak symbols (e.g.,
   OS X), Open MPI will build Fortran 77 bindings just for the compiler
   that Open MPI was configured with.

   Hence, on platforms that support it, if you configure Open MPI with
   a Fortran 77 compiler that uses one symbol mangling scheme, you can
   successfully compile and link MPI Fortran 77 applications with a
   Fortran 77 compiler that uses a different symbol mangling scheme.

   NOTE: For platforms that support the multi-Fortran-compiler bindings
   (i.e., weak symbols are supported), due to limitations in the MPI
   standard and in Fortran compilers, it is not possible to hide these
   differences in all cases. Specifically, the following two cases may
   not be portable between different Fortran compilers:

   1. The C constants MPI_F_STATUS_IGNORE and MPI_F_STATUSES_IGNORE
      will only compare properly to Fortran applications that were
      created with Fortran compilers that use the same name-mangling
      scheme as the Fortran compiler that Open MPI was configured
      with.

   2. Fortran compilers may have different values for the logical
      .TRUE. constant. As such, any MPI function that uses the Fortran
      LOGICAL type may only get .TRUE. values back that correspond to
      the .TRUE. value of the Fortran compiler that Open MPI was
      configured with. Note that some Fortran compilers allow forcing
      .TRUE. to be 1 and .FALSE. to be 0. For example, the Portland
      Group compilers provide the "-Munixlogical" option, and Intel
      compilers (version >= 8) provide the "-fpscomp logicals" option.

   You can use the ompi_info command to see the Fortran compiler that
   Open MPI was configured with.
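
To make the name-mangling point above concrete, here is a minimal
sketch (GNU-style weak aliases on Linux; an illustration of the idea,
not Open MPI's actual source) of how one C implementation of a
Fortran binding can be exported under several common mangling
schemes:

    /* Sketch only: one C implementation, several Fortran symbol names;
       this illustrates the idea, not Open MPI's actual code. */
    #include <mpi.h>

    /* The single "real" implementation of the binding, written in C. */
    void example_barrier_f(MPI_Fint *comm, MPI_Fint *ierr)
    {
        *ierr = (MPI_Fint) MPI_Barrier(MPI_Comm_f2c(*comm));
    }

    /* Export it under the usual mangling conventions so that an
       application compiled with any of them resolves to the same code.
       (Which compiler uses which scheme varies.) */
    void mpi_barrier(MPI_Fint *comm, MPI_Fint *ierr)
        __attribute__((weak, alias("example_barrier_f")));  /* no underscore  */
    void mpi_barrier_(MPI_Fint *comm, MPI_Fint *ierr)
        __attribute__((weak, alias("example_barrier_f")));  /* one trailing _ */
    void mpi_barrier__(MPI_Fint *comm, MPI_Fint *ierr)
        __attribute__((weak, alias("example_barrier_f")));  /* two trailing _ */
    void MPI_BARRIER(MPI_Fint *comm, MPI_Fint *ierr)
        __attribute__((weak, alias("example_barrier_f")));  /* all upper case */

This only covers function symbols; my understanding is that item 1
above stays compiler-specific because the check behind
MPI_F_STATUS_IGNORE involves a global data symbol whose mangled name
is fixed at configure time.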

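Along the same lines, here is a tiny sketch of the LOGICAL issue in
item 2 (CONFIGURED_FORTRAN_TRUE is a made-up stand-in for whatever
value is detected at configure time, not a real Open MPI symbol):

    #include <stdio.h>

    /* Stand-in for the .TRUE. value of the compiler Open MPI was
       configured with; e.g., some compilers use -1 by default. */
    #define CONFIGURED_FORTRAN_TRUE (-1)

    /* Sketch of how a library might map a Fortran LOGICAL to C:
       only the configured value is recognized as "true". */
    static int fortran_logical_is_true(int flag)
    {
        return flag == CONFIGURED_FORTRAN_TRUE;
    }

    int main(void)
    {
        /* An application built with a compiler whose .TRUE. is 1
           (what -Munixlogical / "-fpscomp logicals" force): */
        printf("app passes  1 -> library sees %s\n",
               fortran_logical_is_true(1) ? ".TRUE." : ".FALSE.");
        printf("app passes -1 -> library sees %s\n",
               fortran_logical_is_true(-1) ? ".TRUE." : ".FALSE.");
        return 0;
    }

With the configured value at -1, the first call comes back .FALSE.
even though the application meant .TRUE. -- exactly the kind of
silent mismatch the README warns about.
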
On Dec 8, 2008, at 11:46 AM, Brock Palen wrote:

> Looks like the same source tree was used and then cleaned
> (distclean), so I don't have config logs for gcc or pgi. Also I
> can't find opal_config.h in either the configured/built source or
> the installed location.
>
> 1.2.8+pgi: this library was found to run an executable built with
> 1.2.6+gcc.
>
> Sorry I don't have the files you requested George.
>
> Brock Palen
> www.umich.edu/~brockp
> Center for Advanced Computing
> brockp_at_[hidden]
> (734)936-1985
>
>
>
> On Dec 8, 2008, at 11:31 AM, George Bosilca wrote:
>
>> Black magic happens all the time. To keep it simple, we do not
>> expect different compilers to be 100% compatible, so this is
>> completely unsupported by the Open MPI community. Moreover, we
>> already know of compilers that claim gcc compatibility, yet there
>> are always some [obscure] things that don't really match (hint:
>> icc and gcc).
>>
>> For Fortran there are even more issues. One was already hinted at
>> in one of the answers (LOGICAL), but more are expected, such as the
>> representation of the "strange" types REAL16 and REAL32 (and the
>> corresponding COMPLEX types). I'm sure more can be found, but these
>> are enough reason not to support the cross-compiler stuff.
>>
>> Now, I'm really curious that this worked. Do you have access to the
>> opal_config.h file for the pgi and gcc builds? Or to the config.log
>> files? If yes, can you share them with us, please?
>>
>> Thanks,
>> george.
>>
>> On Dec 7, 2008, at 22:06 , Brock Palen wrote:
>>
>>> I did something today that I was happy worked, but I want to know
>>> if anyone has had problems with it.
>>>
>>> At runtime (not compile time), would an Open MPI built with pgi
>>> work to run a code that was compiled against the same version of
>>> Open MPI built with gcc? I tested a few apps today after I
>>> accidentally did this and found that it worked. They were all
>>> C/C++ apps (NAMD and GROMACS), but what about Fortran apps? Should
>>> we expect problems if someone does this?
>>>
>>> I am not going to encourage this; it is more in case it is needed.
>>>
>>>
>>> Brock Palen
>>> www.umich.edu/~brockp
>>> Center for Advanced Computing
>>> brockp_at_[hidden]
>>> (734)936-1985
>>>
>>>
>>>

-- 
Jeff Squyres
Cisco Systems