Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] unknown option "--tree-spawn" with OpenMPI-1.7.1
From: Zehan Cui (zehan.cui_at_[hidden])
Date: 2013-06-14 12:33:39


Thanks.

That's exactly the problem. When I add the --prefix option to the mpirun
command, everything works fine.
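
For the archive, here is roughly the invocation that worked, with the
install prefix taken from the PATH output quoted below:

  mpirun --prefix /home3/cmy/czh/opt/ompi-1.7.1 -n 4 -host gnode100 ./hello

Passing --prefix tells mpirun to set PATH and LD_LIBRARY_PATH on the
remote nodes itself, so the launch no longer depends on what the
non-interactive shell's environment resolves first.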

- Zehan Cui

On Fri, Jun 14, 2013 at 10:25 PM, Jeff Squyres (jsquyres)
<jsquyres_at_[hidden]> wrote:

> Check the PATH you get when you run non-interactively on the remote
> machine:
>
> ssh gnode100 env | grep PATH
>
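> The PATH an interactive login gives you can differ from what a
> non-interactive ssh gets: with bash, for example, a non-interactive
> shell reads ~/.bashrc but not ~/.bash_profile. If the non-interactive
> PATH finds an older install first (e.g. /usr/mpi/gcc/openmpi-1.4.2/bin),
> that install's orted is what gets launched, and it won't understand
> 1.7's options.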
>
> On Jun 14, 2013, at 10:09 AM, Zehan Cui <zehan.cui_at_[hidden]> wrote:
>
> > I think the PATH setting is OK. I forgot to mention that it runs well
> > on the local machine.
> >
> > The PATH setting on the local machine is
> >
> > [cmy_at_gLoginNode1 ~]$ echo $PATH
> >
> /home/cmy/clc/benchmarks/nasm-2.09.10:/home3/cmy/czh/opt/ompi-1.7.1/bin/:/home3/cmy/czh/opt/autoconf-2.69/bin/:/home3/cmy/czh/opt/mvapich2-1.9/bin/:/home/cmy/wr/local/ft-mvapich2-1.8a2/bin:/home/cmy/wr/local/mvapich2-1.8a2/bin:/usr/mpi/gcc/mvapich2-1.4.1/bin:/home3/cmy/czh/ompi/bin/:/home/cmy/huangyb/gem5/gcc/gcc-4.3/bin:/home/cmy/huangyb/gem5/swig/bin/:/home/cmy/huangyb/gem5/scons/bin::/home/cmy/huangyb/local/mercurial/bin:/home/cmy/huangyb/local/python-2.7.3/bin/:/home/SOFT/intel/Compiler/11.0/083/bin/intel64:/usr/mpi/gcc/openmpi-1.4.2/bin/:/home/SOFT/intel/Compiler/11.0/083/bin/intel64:/home/cmy/tgm/cmake/bin:/usr/local/mvapich2/bin:/usr/local/mpich-pgi/bin:/opt/pgi/linux86-64/7.0-2/bin:/usr/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/opt/gridviewnew/pbs//dispatcher-sched//bin:/opt/gridviewnew/pbs//dispatcher-sched//sbin:/opt/gridviewnew/pbs//dispatcher//bin:/opt/gridviewnew/pbs//dispatcher//sbin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home/cmy/zxx/work_spring_2011/iaca-lin32/bin:/home/cmy/bin:/home/tgm/ljj/software/dmidecode-2.11/:/usr/local/oski_2007/include
> > [cmy_at_gLoginNode1 ~]$ echo $LD_LIBRARY_PATH
> >
> /home3/cmy/czh/opt/ompi-1.7.1/lib/:/home3/cmy/czh/opt/mvapich2-1.9/lib/:/home/cmy/wr/local/ft-mvapich2-1.8a2/lib:/home/cmy/wr/local/mvapich2-1.8a2/lib:/usr/mpi/gcc/mvapich2-1.4.1/lib:/home3/cmy/czh/ompi/lib/:/home/cmy/huangyb/gem5/gcc/gcc-4.3/lib64:/home/cmy/huangyb/gem5/gcc/gcc-4.3/lib/:/home/cmy/huangyb/local/python-2.7.3/lib/:/usr/local/lib64:/usr/local/lib:/home/cmy/clc/DRAMSim2:/home/SOFT/intel/Compiler/11.0/083/lib/intel64:/home/cmy/zxx/oski-icc/lib/oski:/usr/mpi/gcc/openmpi-1.4.2/lib/:/usr/lib/python2.4/config:/home/SOFT/intel/Compiler/11.0/083/mkl/lib/em64t:/home/cmy/tgm/hpx/build/linux/lib:/home/cmy/yanjie/boost/lib:/usr/local/mvapich2/lib:/home/cmy/yanjie/qthread/lib:/opt/gridviewnew/pbs//dispatcher//lib::/usr/local/lib64:/usr/local/lib:/home/cmy/zxx/work_spring_2011/iaca-lin32/lib
> >
> >
> > The path settings on gnode100 are the same:
> >
> > [cmy_at_gnode100 ~]$
> > [cmy_at_gnode100 ~]$ echo $PATH
> >
> /home/cmy/clc/benchmarks/nasm-2.09.10:/home3/cmy/czh/opt/ompi-1.7.1/bin/:/home3/cmy/czh/opt/autoconf-2.69/bin/:/home3/cmy/czh/opt/mvapich2-1.9/bin/:/home/cmy/wr/local/ft-mvapich2-1.8a2/bin:/home/cmy/wr/local/mvapich2-1.8a2/bin:/usr/mpi/gcc/mvapich2-1.4.1/bin:/home3/cmy/czh/ompi/bin/:/home/cmy/huangyb/gem5/gcc/gcc-4.3/bin:/home/cmy/huangyb/gem5/swig/bin/:/home/cmy/huangyb/gem5/scons/bin::/home/cmy/huangyb/local/mercurial/bin:/home/cmy/huangyb/local/python-2.7.3/bin/:/home/SOFT/intel/Compiler/11.0/083/bin/intel64:/usr/mpi/gcc/openmpi-1.4.2/bin/:/home/SOFT/intel/Compiler/11.0/083/bin/intel64:/home/cmy/tgm/cmake/bin:/usr/local/mvapich2/bin:/usr/local/mpich-pgi/bin:/opt/pgi/linux86-64/7.0-2/bin:/usr/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/opt/gridviewnew/pbs//dispatcher-sched//bin:/opt/gridviewnew/pbs//dispatcher-sched//sbin:/opt/gridviewnew/pbs//dispatcher//bin:/opt/gridviewnew/pbs//dispatcher//sbin:/usr/local/bin:/bin:/usr/bin:/home/cmy/zxx/work_spring_2011/iaca-lin32/bin:/home/cmy/bin:/home/tgm/ljj/software/dmidecode-2.11/:/usr/local/oski_2007/include
> > [cmy_at_gnode100 ~]$
> > [cmy_at_gnode100 ~]$ echo $LD_LIBRARY_PATH
> >
> /home3/cmy/czh/opt/ompi-1.7.1/lib/:/home3/cmy/czh/opt/mvapich2-1.9/lib/:/home/cmy/wr/local/ft-mvapich2-1.8a2/lib:/home/cmy/wr/local/mvapich2-1.8a2/lib:/usr/mpi/gcc/mvapich2-1.4.1/lib:/home3/cmy/czh/ompi/lib/:/home/cmy/huangyb/gem5/gcc/gcc-4.3/lib64:/home/cmy/huangyb/gem5/gcc/gcc-4.3/lib/:/home/cmy/huangyb/local/python-2.7.3/lib/:/usr/local/lib64:/usr/local/lib:/home/cmy/clc/DRAMSim2:/home/SOFT/intel/Compiler/11.0/083/lib/intel64:/home/cmy/zxx/oski-icc/lib/oski:/usr/mpi/gcc/openmpi-1.4.2/lib/:/usr/lib/python2.4/config:/home/SOFT/intel/Compiler/11.0/083/mkl/lib/em64t:/home/cmy/tgm/hpx/build/linux/lib:/home/cmy/yanjie/boost/lib:/usr/local/mvapich2/lib:/home/cmy/yanjie/qthread/lib:/opt/gridviewnew/pbs//dispatcher//lib::/usr/local/lib64:/usr/local/lib:/home/cmy/zxx/work_spring_2011/iaca-lin32/lib
> > [cmy_at_gnode100 ~]$
> >
> > Best regards,
> > Zehan Cui (崔泽汉)
> > -----------------------------------------------------------
> > Institute of Computing Technology, Chinese Academy of Sciences.
> > No. 6 Kexueyuan South Road, Zhongguancun, Haidian District, Beijing, China
> >
> >
> >
> > On Fri, Jun 14, 2013 at 9:32 PM, Ralph Castain <rhc_at_[hidden]> wrote:
> > You aren't setting the path correctly on your backend machines, and so
> > they are picking up an older version of OMPI.
> >
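> > One way to fix that, as a sketch (assuming bash on the nodes): put the
> > 1.7.1 paths first for non-interactive shells, e.g. in ~/.bashrc on the
> > backend machines:
> >
> >   export PATH=/home3/cmy/czh/opt/ompi-1.7.1/bin:$PATH
> >   export LD_LIBRARY_PATH=/home3/cmy/czh/opt/ompi-1.7.1/lib:$LD_LIBRARY_PATH
> >
> > Running mpirun with --prefix, or configuring Open MPI with
> > --enable-mpirun-prefix-by-default, has the same effect without editing
> > dotfiles.
> >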
> > On Jun 14, 2013, at 2:08 AM, Zehan Cui <zehan.cui_at_[hidden]> wrote:
> >
> > > Hi,
> > >
> > > I have just installed OpenMPI-1.7.1 and cannot get it running.
> > >
> > > Here are the error messages:
> > >
> > > [cmy_at_gLoginNode1 test_nbc]$ mpirun -n 4 -host gnode100 ./hello
> > > [gnode100:31789] Error: unknown option "--tree-spawn"
> > > input in flex scanner failed
> > > [gLoginNode1:14920] [[62542,0],0] ORTE_ERROR_LOG: A message is attempting to be sent to a process whose contact information is unknown in file rml_oob_send.c at line 362
> > > [gLoginNode1:14920] [[62542,0],0] attempted to send to [[62542,0],1]: tag 15
> > > [gLoginNode1:14920] [[62542,0],0] ORTE_ERROR_LOG: A message is attempting to be sent to a process whose contact information is unknown in file base/grpcomm_base_xcast.c at line 166
> > >
> > > I have run it on several nodes and got the same messages.
> > >
> > >
> > > - Zehan Cui
> > >
>
> --
> Jeff Squyres
> jsquyres_at_[hidden]
> For corporate legal information go to:
> http://www.cisco.com/web/about/doing_business/legal/cri/
>
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users
>