
Open MPI Development Mailing List Archives


Subject: Re: [OMPI devel] Open-MPI build of NAMD launched from srun over 20% slower than with mpirun
From: Christopher Samuel (samuel_at_[hidden])
Date: 2013-09-03 23:59:40


On 04/09/13 11:29, Ralph Castain wrote:

> Your code is obviously doing something much more than just
> launching and wiring up, so it is difficult to assess the
> difference in speed between 1.6.5 and 1.7.3 - my guess is that it
> has to do with changes in the MPI transport layer and nothing to do
> with PMI or not.

I'm testing with what is, in aggregate, the most used application
across our systems: the NAMD molecular dynamics code from here:

http://www.ks.uiuc.edu/Research/namd/

so yes, you're quite right, it's doing a lot more than that and has a
reputation for being a *very* chatty MPI code.

For comparison, whilst users see GROMACS also suffer with srun under
1.6.5, they don't see anything like the slowdown that NAMD gets.
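
For anyone wanting to reproduce this kind of comparison, a minimal
timing sketch might look like the following. This is only an
illustration, not the exact commands used here: the node/task count,
the namd2 binary name, and the apoa1.namd input are placeholders, and
the srun case assumes an Open MPI build configured with PMI support
(e.g. --with-pmi) so that direct launch under SLURM works.

```shell
# Same NAMD job, launched two ways; compare the reported wall times.

# 1) Launch via Open MPI's own launcher:
time mpirun -np 64 namd2 apoa1.namd

# 2) Launch directly with SLURM's srun (requires PMI-enabled Open MPI):
time srun -n 64 namd2 apoa1.namd
```

Comparing the wall-clock times of the two runs is the basis of the
reported 20%+ gap.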

All the best,
Chris
--
 Christopher Samuel Senior Systems Administrator
 VLSCI - Victorian Life Sciences Computation Initiative
 Email: samuel_at_[hidden] Phone: +61 (0)3 903 55545
 http://www.vlsci.org.au/ http://twitter.com/vlsci
