
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] mpirun unsuccessful when run across multiple nodes
From: mohd naseem (naseemshakeel_at_[hidden])
Date: 2011-04-19 10:47:59


Sorry sir,

I am unable to understand what you are saying, because I am a new user of MPI.

Please explain it in detail, including the commands to use.

thanks

On Tue, Apr 19, 2011 at 2:32 PM, Reuti <reuti_at_[hidden]> wrote:

> Good, then please supply a hostfile with the names of the machines you want
> to use for a particular run and pass it as an option to `mpiexec`. See the
> options -np and -machinefile.
>
> -- Reuti
>
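A minimal sketch of the hostfile approach described above; the hostnames `node1`/`node2` and the slot counts are hypothetical placeholders for the cluster's actual machines:

```shell
# Write a hostfile naming the machines for this particular run.
cat > hostfile <<'EOF'
node1 slots=2
node2 slots=2
EOF

# Verify the file contents.
cat hostfile

# Launch, e.g., 4 ranks spread over those machines; each rank prints its
# node's name, so both node1 and node2 should appear in the output:
#   mpiexec -np 4 -machinefile hostfile hostname
```

(The `mpiexec` line is left commented because it only succeeds once the nodes are actually reachable.)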
>
> Am 19.04.2011 um 06:38 schrieb mohd naseem:
>
> > Sir,
> > when I give the `mpiexec hostname` command,
> > it only gives one hostname; the rest are not shown.
> >
> >
> > On Mon, Apr 18, 2011 at 7:46 PM, Reuti <reuti_at_[hidden]>
> wrote:
> > Am 18.04.2011 um 15:40 schrieb chenjie gu:
> >
> > > I am new to Open MPI. I have the following Open MPI setup; however, it
> > > has problems when running across multiple nodes. I am trying to build a
> > > Beowulf cluster from 6 nodes of our server (HP ProLiant G460 G7). I have
> > > installed Open MPI on one node (at /mirror):
> > > ./configure --prefix=/mirror/openmpi CC=icc CXX=icpc F77=ifort FC=ifort
> > > make all install
> > >
> > > Using NFS, the /mirror directory was successfully exported to the other
> > > 5 nodes. When I test Open MPI, it runs very well on a single node;
> > > however, it hangs across multiple nodes.
> > >
> > > One possible reason, as far as I know, is that Open MPI uses TCP to
> > > exchange data between nodes, so I am worried about whether there are
> > > firewalls between the nodes, which could be factory-integrated somewhere
> > > (switch/NIC). Could anyone give me some information on this point?
> >
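One way to check the firewall concern raised above: Open MPI's TCP transport opens connections between nodes on dynamic ports, so a quick bash probe shows whether plain TCP between two machines is filtered. The host and port below are placeholders for illustration; on a cluster you would probe the other nodes' addresses:

```shell
# Probe a TCP port using bash's /dev/tcp pseudo-device (requires bash).
probe() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null \
    && echo "$1:$2 open" \
    || echo "$1:$2 closed or filtered"
}

# Example: sshd on the local machine; replace with a remote node and port.
probe localhost 22
```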
> > It's not only about MPI communication. First you need some means to allow
> > the startup of the local ORTE daemons on each machine: passphraseless
> > ssh keys or, better, hostbased authentication
> > (http://arc.liv.ac.uk/SGE/howto/hostbased-ssh.html), or enable `rsh` on
> > the machines and tell Open MPI to use it. Is:
> >
> > mpiexec hostname
> >
> > giving you a list of the involved machines?
> >
> > -- Reuti
> >
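A sketch of the passphraseless-ssh setup mentioned above. The key is written to a scratch path here to avoid clobbering an existing one; in practice you would use `~/.ssh/id_rsa`, and `node1` is a hypothetical node name:

```shell
# Generate a passphraseless RSA key pair (scratch path for illustration).
ssh-keygen -t rsa -N "" -f ./cluster_key -q

# Confirm both halves of the key pair were created.
ls cluster_key cluster_key.pub

# Distribute the public key so `ssh nodeN` needs no password, then verify
# that non-interactive login works before retrying Open MPI:
#   ssh-copy-id -i ./cluster_key.pub node1
#   ssh -o BatchMode=yes node1 true && echo "node1 ok"
```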
> >
> > > Thanks a lot,
> > > Regards,
> > > ArchyGU
> > > Nanyang Technological University
> > > _______________________________________________
> > > users mailing list
> > > users_at_[hidden]
> > > http://www.open-mpi.org/mailman/listinfo.cgi/users
> >