
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] [EXTERNAL] Re: What version of PMI (Cray XE6) is working for OpenMPI-1.6.5?
From: Hjelm, Nathan T (hjelmn_at_[hidden])
Date: 2013-08-30 11:49:47


Replace install_path with the path where you want Open MPI installed:

./configure --prefix=install_path --with-platform=contrib/platform/lanl/cray_xe6/optimized-lustre
make
make install

To use Open MPI, just set (and export, so child processes inherit them) PATH and LD_LIBRARY_PATH:

export PATH=install_path/bin:$PATH
export LD_LIBRARY_PATH=install_path/lib:$LD_LIBRARY_PATH

You can then use mpicc, mpicxx, mpif90, etc., to compile, and either mpirun or aprun to run. If you are running at scale, I would recommend against using aprun for now. I also recommend changing your programming environment to either PrgEnv-gnu or PrgEnv-intel; the PGI compiler can be a pain. It is possible to build with the Cray compiler, but that requires patching config.guess and changing some autoconf machinery.
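As a concrete sketch of the compile-and-run step above (the source file, program name, and rank count here are hypothetical placeholders; mpicc and mpirun are the wrappers installed into install_path/bin by the build):

```shell
# Hypothetical example: compile a program with the Open MPI compiler
# wrapper, then launch it with mpirun rather than aprun.
mpicc -O2 -o hello hello.c   # hello.c is a placeholder source file
mpirun -n 16 ./hello         # start 16 ranks via mpirun
```

These commands assume the PATH/LD_LIBRARY_PATH settings above are in effect and that you have a job allocation on the machine; they are a sketch, not output from this posting.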

-Nathan

Please excuse the horrible Outlook-style quoting.
________________________________________
From: users [users-bounces_at_[hidden]] on behalf of Teranishi, Keita [knteran_at_[hidden]]
Sent: Thursday, August 29, 2013 8:01 PM
To: Open MPI Users
Subject: Re: [OMPI users] [EXTERNAL] Re: What version of PMI (Cray XE6) is working for OpenMPI-1.6.5?

Thanks for the info. Is it still possible to build it myself? What is
the procedure, other than running the configure script?

On 8/23/13 2:37 PM, "Nathan Hjelm" <hjelmn_at_[hidden]> wrote:

>On Fri, Aug 23, 2013 at 09:14:25PM +0000, Teranishi, Keita wrote:
>> Hi,
>> I am trying to install OpenMPI 1.6.5 on Cray XE6 and am very curious
>> about the current support of PMI. In the previous discussions, there
>> was a comment on the version of PMI (it works with 2.1.4, but fails
>> with 3.0). Our
>
>Open MPI 1.6.5 does not have support for the XE-6. Use 1.7.2 instead.
>
>> machine has PMI2.1.4 and PMI4.0 (default). Which version do you
>
>There was a regression in PMI 3.x.x that still exists in 4.0.x that
>causes a warning to be printed on every rank when using mpirun. We are
>working with Cray to resolve the issue. For now use 2.1.4. See the
>platform files in contrib/platform/lanl/cray_xe6. The platform files you
>would want to use are debug-lustre or optimized-lustre.
>
>BTW, 1.7.2 is installed on Cielo and Cielito. Just run:
>
>module swap PrgEnv-pgi PrgEnv-gnu (PrgEnv-intel also works)
>module unload cray-mpich2 xt-libsci
>module load openmpi/1.7.2
>
>
>-Nathan Hjelm
>Open MPI Team, HPC-3, LANL
>_______________________________________________
>users mailing list
>users_at_[hidden]
>http://www.open-mpi.org/mailman/listinfo.cgi/users
