Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Res: Gromacs run in parallel
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2010-06-08 09:30:03


No, I'm sorry -- I wasn't clear. What I meant was that if you run:

   mpirun -np 4 my_mpi_application

1. If you see a single, 4-process MPI job (regardless of how many nodes/servers it's spread across), then all is good. This is what you want.

2. But if you're seeing 4 independent 1-process MPI jobs (again, regardless of how many nodes/servers they are spread across), it's possible that you compiled your application with MPI implementation X and then used the "mpirun" from MPI implementation Y.

You will need X==Y to make it work properly -- i.e., to see case #1, above. I mention this because your first post mentioned that you're seeing the same job run 4 times. This implied to me that you are running into case #2. If I misunderstood your problem, then ignore me and forgive the noise.
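
A quick way to tell the two cases apart (this is just a sketch, not part of the original thread, and the file name hello_mpi.c is made up): compile a trivial MPI program with the mpicc that belongs to the same implementation as your mpirun, then launch it with "mpirun -np 4". If every process reports a communicator size of 4, you are in case #1; if each process reports a size of 1, you are in case #2.

   /* hello_mpi.c -- hypothetical example; prints each process's rank and the job size */
   #include <mpi.h>
   #include <stdio.h>

   int main(int argc, char **argv)
   {
       int rank, size;
       MPI_Init(&argc, &argv);
       MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank within the job */
       MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes in the job */
       /* Case #1: every line prints "of 4". Case #2: every line prints "of 1". */
       printf("rank %d of %d\n", rank, size);
       MPI_Finalize();
       return 0;
   }

Build and run it with the matching pair of wrappers, e.g.:

   mpicc hello_mpi.c -o hello_mpi
   mpirun -np 4 ./hello_mpi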

On Jun 8, 2010, at 9:20 AM, Carsten Kutzner wrote:

> On Jun 8, 2010, at 3:06 PM, Jeff Squyres wrote:
>
> > I know nothing about Gromacs, but you might want to ensure that your Gromacs was compiled with Open MPI. A common symptom of "mpirun -np 4 my_mpi_application" running 4 1-process MPI jobs (instead of 1 4-process MPI job) is that you compiled my_mpi_application with one MPI implementation, but then used the mpirun from a different MPI implementation.
> >
> Hi,
>
> This can be checked by looking at the Gromacs output file md.log. The second line should
> read something like
>
> Host: <somename> pid: <somepid> nodeid: 0 nnodes: 4
>
> Lauren, you will want to ensure that nnodes is 4 in your case, and not 1.
>
> You can also easily test that without any input file by typing
>
> mpirun -np 4 mdrun -h
>
> and then you should see
>
> NNODES=4, MYRANK=0, HOSTNAME=<...>
> NNODES=4, MYRANK=1, HOSTNAME=<...>
> NNODES=4, MYRANK=2, HOSTNAME=<...>
> NNODES=4, MYRANK=3, HOSTNAME=<...>
> ...
>
>
> Carsten
>
>
> >
> > On Jun 8, 2010, at 8:59 AM, lauren wrote:
> >
> >>
> >> The version of Gromacs is 4.0.7.
> >> This is the first time that I'm using Gromacs, so excuse me if this is nonsense.
> >>
> >> Which part of the md.log output should I post?
> >> After or before the input description?
> >>
> >> thanks for everything,
> >> and sorry
> >>
> >> From: Carsten Kutzner <ckutzne_at_[hidden]>
> >> To: Open MPI Users <users_at_[hidden]>
> >> Sent: Sunday, June 6, 2010 9:51:26
> >> Subject: Re: [OMPI users] Gromacs run in parallel
> >>
> >> Hi,
> >>
> >> which version of Gromacs is this? Could you post the first lines of
> >> the md.log output file?
> >>
> >> Carsten
> >>
> >>
> >> On Jun 5, 2010, at 10:23 PM, lauren wrote:
> >>
> >>> sorry for my English..
> >>>
> >>> I want to know how I can run Gromacs in parallel!
> >>> Because when I used
> >>>
> >>> mdrun &
> >>> mpiexec -np 4 mdrun_mpi -v -deffnm em
> >>>
> >>> to run the minimization on 4 cores, all cores do the same job again!
> >>> They don't run together.
> >>> I want them all to run in parallel to make the job faster.
> >>>
> >>>
> >>> what could be wrong?
> >>>
> >>> thanks a lot!
> >>>
> >>>
> >>>
> >>
> >>
> >>
> >
> >
> >
> >
>
>
> --
> Dr. Carsten Kutzner
> Max Planck Institute for Biophysical Chemistry
> Theoretical and Computational Biophysics
> Am Fassberg 11, 37077 Goettingen, Germany
> Tel. +49-551-2012313, Fax: +49-551-2012302
> http://www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne
>
>
>
>
>
>

-- 
Jeff Squyres
jsquyres_at_[hidden]
For corporate legal information go to:
http://www.cisco.com/web/about/doing_business/legal/cri/