Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] 50% performance reduction due to OpenMPI v1.3.2 forcing all MPI traffic over Ethernet instead of using Infiniband
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2009-06-23 19:29:11


You mentioned that you only have a binary for your executable. Was it
compiled / linked against v1.3.2?

We did not introduce ABI compatibility until v1.3.2 -- if the
executable was compiled/linked against any version prior to that, it's
pure luck that it works with the 1.3.2 shared libraries at all.
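A quick way to check which Open MPI shared libraries a binary was actually linked against is to inspect it with the dynamic linker tools. This is a sketch, assuming a Linux system; "my_mpi_app" is a placeholder for your executable name:

```shell
# See which libmpi the dynamic linker resolves for the binary
# (the path shown tells you which Open MPI install it will use):
ldd ./my_mpi_app | grep libmpi

# Then confirm the version of that Open MPI install:
ompi_info --version
```

If the libmpi path points at a pre-1.3.2 install, or the binary was built against one, the mismatch described above would apply.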

On Jun 23, 2009, at 7:25 PM, Jim Kress ORG wrote:

> This is what I get
>
> [root_at_master ~]# ompi_info | grep openib
> MCA btl: openib (MCA v2.0, API v2.0, Component v1.3.2)
> [root_at_master ~]#
>
> Jim
>
>
> On Tue, 2009-06-23 at 18:51 -0400, Jeff Squyres wrote:
> > openib (OpenFabrics) plugin is installed
> > and at least marginally operational
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users
>

-- 
Jeff Squyres
Cisco Systems