
Open MPI User's Mailing List Archives


From: George Bosilca (bosilca_at_[hidden])
Date: 2007-04-17 16:55:54


There were several improvements for TCP in 1.2. We expect it to
behave in a more optimized way compared with 1.1.x. Can you share
your code so I can take a look and see what's happening?

   Thanks,
     george.

On Apr 17, 2007, at 4:30 PM, Michael wrote:

> To maintain compatibility with a major HPC center I upgraded(?) from
> OpenMPI 1.1.4 to OpenMPI 1.2 on my local cluster.
>
> In testing on my local cluster, 13 dual-Opteron Linux boxes with dual
> gigabit ethernet, I discovered that my program runs slower using
> OpenMPI 1.2 than OpenMPI 1.1.4 (780.3 versus 402.4 seconds with 3
> processes--tested twice to be certain).
>
> This particular version of my program was designed to minimize the
> amount of communications and the only MPI calls that get used a lot
> are MPI_SEND and MPI_RECV with MPI_PACKED data (so MPI_PACK and
> MPI_UNPACK also get used a lot).
>
> Was there a known problem with OpenMPI 1.2 (r14027) and ethernet
> communication that got fixed later?
>
> The same executable run at the major center seems fine, but they have
> Myrinet.
>
> Michael
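For reference, here is a minimal sketch of the MPI_Pack / MPI_Send /
MPI_Recv / MPI_Unpack pattern Michael describes above. The buffer size,
the packed fields, and the rank/tag values are illustrative assumptions,
not Michael's actual code:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, n = 42;
    double x = 3.14;
    char buf[64];
    int pos = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Pack heterogeneous data into one contiguous buffer ... */
        MPI_Pack(&n, 1, MPI_INT,    buf, sizeof(buf), &pos, MPI_COMM_WORLD);
        MPI_Pack(&x, 1, MPI_DOUBLE, buf, sizeof(buf), &pos, MPI_COMM_WORLD);
        /* ... and send it as a single MPI_PACKED message. */
        MPI_Send(buf, pos, MPI_PACKED, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(buf, sizeof(buf), MPI_PACKED, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        /* Unpack in the same order the sender packed. */
        MPI_Unpack(buf, sizeof(buf), &pos, &n, 1, MPI_INT,    MPI_COMM_WORLD);
        MPI_Unpack(buf, sizeof(buf), &pos, &x, 1, MPI_DOUBLE, MPI_COMM_WORLD);
        printf("received n=%d x=%f\n", n, x);
    }

    MPI_Finalize();
    return 0;
}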


