Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] Can OpenMPI support multiple compilers?
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2008-10-22 11:49:40

Also check the FAQ on how to use the wrapper compilers -- there are
ways to override at compile time, but be warned that it's not always
what you want. As Terry indicates, you probably want to have multiple
OMPI installations -- one for each compiler.
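For the override route, the wrappers consult `OMPI_*` environment variables (e.g., `OMPI_CC`, `OMPI_CXX`, `OMPI_F77`). A minimal sketch, with the same warning as above: the substituted compiler must still be link-compatible with the one Open MPI was configured with:

```shell
# Per-invocation override of the wrappers' underlying compilers.
# OMPI_CC / OMPI_CXX / OMPI_F77 are the environment variables the
# wrapper compilers consult; the substituted compiler must still be
# compatible with the one Open MPI was built with (see the Fortran
# caveats below).
export OMPI_F77=ifort    # mpif77 will now invoke ifort
export OMPI_CC=icc       # mpicc will now invoke icc
# mpif77 --showme        # would print the ifort-based command line
echo "$OMPI_F77"
```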

In particular, there are problems with mixing multiple Fortran
compilers noted in OMPI's README file:

- Open MPI will build bindings suitable for all common forms of
   Fortran 77 compiler symbol mangling on platforms that support it
   (e.g., Linux). On platforms that do not support weak symbols (e.g.,
   OS X), Open MPI will build Fortran 77 bindings just for the compiler
   that Open MPI was configured with.

   Hence, on platforms that support it, if you configure Open MPI with
   a Fortran 77 compiler that uses one symbol mangling scheme, you can
   successfully compile and link MPI Fortran 77 applications with a
   Fortran 77 compiler that uses a different symbol mangling scheme.

   NOTE: For platforms that support the multi-Fortran-compiler bindings
   (i.e., weak symbols are supported), due to limitations in the MPI
   standard and in Fortran compilers, it is not possible to hide these
   differences in all cases. Specifically, the following two cases may
   not be portable between different Fortran compilers:

   1. The C constants MPI_F_STATUS_IGNORE and MPI_F_STATUSES_IGNORE
      will only compare properly to Fortran applications that were
      created with Fortran compilers that use the same name-mangling
      scheme as the Fortran compiler that Open MPI was configured with.

   2. Fortran compilers may have different values for the logical
      .TRUE. constant. As such, any MPI function that uses the Fortran
      LOGICAL type may only get .TRUE. values back that correspond to
      the .TRUE. value of the Fortran compiler that Open MPI was
      configured with. Note that some Fortran compilers allow forcing
      .TRUE. to be 1 and .FALSE. to be 0. For example, the Portland
      Group compilers provide the "-Munixlogical" option, and Intel
      compilers (version >= 8) provide the "-fpscomp logicals" option.

   You can use the ompi_info command to see the Fortran compiler that
   Open MPI was configured with.
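For example, a quick check of the configured Fortran compilers; the grep pattern is only illustrative, and the snippet is guarded so it degrades gracefully on a machine without Open MPI:

```shell
# Ask Open MPI which Fortran compiler it was configured with.
if command -v ompi_info >/dev/null 2>&1; then
  ompi_info | grep -i 'fort'        # shows the configured Fortran compilers
else
  echo "ompi_info: not installed"   # Open MPI not on this host
fi
```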

On Oct 19, 2008, at 8:34 PM, Terry Frankcombe wrote:

> It happily supports multiple compilers on the same system, but not in
> the way you mean. You need another installation of OMPI (in,
> say, /usr/lib64/mpi/intel) for icc/ifort.
> Select by path manipulation.
> On Mon, 2008-10-20 at 08:19 +0800, Wen Hao Wang wrote:
>> Hi all:
>> I have openmpi 1.2.5 installed on SLES10 SP2. These packages should
>> be compiled with gcc compilers. Now I have installed Intel C++ and
>> Fortran compilers on my cluster. Can openmpi use the Intel compilers
>> without recompiling?
>> I tried to use environment variables to point at the Intel compilers,
>> but it seems the mpi commands still wanted to use the gcc ones.
>> LS21-08:/opt/intel/fce/10.1.018/bin # mpif77 --showme
>> gfortran -I/usr/lib64/mpi/gcc/openmpi/include -pthread
>> -L/usr/lib64/mpi/gcc/openmpi/lib64 -lmpi_f77 -lmpi -lopen-rte
>> -lopen-pal -ldl -Wl,--export-dynamic -lnsl -lutil -lm -ldl
>> LS21-08:/opt/intel/fce/10.1.018/bin # export
>> F77=/opt/intel/fce/10.1.018/bin/ifort
>> LS21-08:/opt/intel/fce/10.1.018/bin # rpm -e
>> gcc-fortran-4.1.2_20070115-0.21
>> LS21-08:/opt/intel/fce/10.1.018/bin # mpif77 /LTC/matmul-for-intel.f
>> --------------------------------------------------------------------------
>> The Open MPI wrapper compiler was unable to find the specified
>> compiler
>> gfortran in your PATH.
>> Note that this compiler was either specified at configure time or in
>> one of several possible environment variables.
>> --------------------------------------------------------------------------
>> Is it possible to change openmpi's underlying compiler? Then I could
>> use multiple compilers on one machine.
>> Thanks in advance!
>> Steven Wang
>> Email: wangwhao_at_[hidden]
>> _______________________________________________
>> users mailing list
>> users_at_[hidden]
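The path manipulation Terry describes comes down to prepending the desired build's bin directory to PATH (and its lib directory to LD_LIBRARY_PATH). A sketch, with hypothetical install prefixes:

```shell
# One Open MPI installation per compiler (hypothetical prefixes):
#   /usr/lib64/mpi/gcc/openmpi     built with gcc/gfortran
#   /usr/lib64/mpi/intel/openmpi   built with icc/ifort
MPI_ROOT=/usr/lib64/mpi/intel/openmpi         # choose the Intel build
export PATH="$MPI_ROOT/bin:$PATH"
export LD_LIBRARY_PATH="$MPI_ROOT/lib64${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# The first mpicc/mpif77 found on PATH now comes from the Intel build:
echo "$PATH" | cut -d: -f1
```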

Jeff Squyres
Cisco Systems