
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] Error when attempting to run LAMMPS on Centos 6.2 with OpenMPI
From: Ralph Castain (rhc_at_[hidden])
Date: 2013-01-25 22:26:02


No earthly idea, I'm afraid - you might check "which mpirun" to ensure you are indeed running the one you think you are. You might also try installing OMPI from scratch yourself (into a directory under your home) so you can ensure it is actually built to match your system. You could then verify the result by building and running the code in the "examples" directory.

Not sure what else I can suggest.
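[Editor's note: a minimal sketch of the from-scratch install suggested above. The tarball version, the $HOME/openmpi prefix, and the -j4 flag are illustrative assumptions, not details from the thread.]

```shell
# Hedged sketch: install a private Open MPI under $HOME so it cannot
# conflict with the distro package.  The build steps below are shown as
# comments because they must run inside an extracted source tree:
#   ./configure --prefix="$HOME/openmpi"
#   make -j4 all && make install
# Afterwards, put the private install FIRST on both search paths:
export PATH="$HOME/openmpi/bin:$PATH"
export LD_LIBRARY_PATH="$HOME/openmpi/lib:$LD_LIBRARY_PATH"
# "which mpirun" should now report $HOME/openmpi/bin/mpirun; the path
# ordering can be confirmed by printing the first PATH component:
echo "${PATH%%:*}"
```

After that, building and running one of the programs in the source tree's "examples" directory confirms the runtime matches the compiler wrappers.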

On Jan 25, 2013, at 7:07 PM, #YEO JINGJIE# <JYEO1_at_[hidden]> wrote:

> Does not seem to work. I did
>
> export PATH=/usr/local/openmpi/bin:$PATH
> export LD_LIBRARY_PATH=/usr/local/openmpi/lib:$LD_LIBRARY_PATH
>
>
> echo $PATH
> /usr/lib64/openmpi/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:
> echo $LD_LIBRARY_PATH
> /usr/lib64/openmpi/lib:
>
> But I still get the same error.
>
> Regards,
> Jingjie Yeo
> Ph.D. Student
> School of Mechanical and Aerospace Engineering
> Nanyang Technological University, Singapore
> From: users-bounces_at_[hidden] [users-bounces_at_[hidden]] on behalf of Ralph Castain [rhc_at_[hidden]]
> Sent: Saturday, 26 January, 2013 10:29:13 AM
> To: Open MPI Users
> Subject: Re: [OMPI users] Error when attempting to run LAMMPS on Centos 6.2 with OpenMPI
>
> Okay - that's why I don't like package installs...
>
> Still, it should work. One thing I don't like is that you have /usr/lib64/openmpi/bin at the end of your PATH - that's a bad idea. Most OSes come with a preinstalled version, usually something really old, so you want the desired version of OMPI to be at the front of your PATH.
>
> Do you have LD_LIBRARY_PATH set as well? Again, you need /usr/lib64/openmpi/lib set at the very front of that envar.
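[Editor's note: the path-ordering fix described above can be sketched as follows, assuming a bash shell (the thread confirms bash is in use):]

```shell
# Prepend the packaged Open MPI to both search paths so it shadows any
# other installs that appear later in the variables:
export PATH="/usr/lib64/openmpi/bin:$PATH"
export LD_LIBRARY_PATH="/usr/lib64/openmpi/lib:$LD_LIBRARY_PATH"
# The first component of each variable should now be the OMPI directory:
echo "${PATH%%:*}"               # /usr/lib64/openmpi/bin
echo "${LD_LIBRARY_PATH%%:*}"    # /usr/lib64/openmpi/lib
```

Note that exporting must happen in the same shell (or in ~/.bashrc) as the mpirun invocation; exports in one terminal do not affect another.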
>
>
> On Jan 25, 2013, at 6:22 PM, #YEO JINGJIE# <JYEO1_at_[hidden]> wrote:
>
>> Ok this was what I got:
>>
>> which srun
>>
>> /usr/bin/which: no srun in (/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/usr/lib64/openmpi/bin:)
>>
>> ls -R /usr/lib64/openmpi
>>
>> /usr/lib64/openmpi:
>> bin lib share
>> /usr/lib64/openmpi/bin:
>> mpic++ mpif90 opal_wrapper orte-top vtc++
>> mpicc mpif90-vt opari orte_wrapper_script vtcc
>> mpiCC mpirun orte-bootproxy.sh otfaux vtCC
>> mpicc-vt ompi-clean ortec++ otfcompress vtcxx
>> mpiCC-vt ompi_info ortecc otfconfig vtf77
>> mpic++-vt ompi-iof orteCC otfdecompress vtf90
>> mpicxx ompi-probe orte-clean otfdump vtfilter
>> mpicxx-vt ompi-profiler orted otfinfo vtunify
>> mpiexec ompi-ps orte-iof otfmerge vtunify-mpi
>> mpif77 ompi-server orte-ps otfprofile vtwrapper
>> mpif77-vt ompi-top orterun otfshrink
>> /usr/lib64/openmpi/lib:
>> libmca_common_sm.so libompitrace.so.0 libvt-hyb.so.0.0.0
>> libmca_common_sm.so.2 libompitrace.so.0.0.0 libvt-mpi.a
>> libmca_common_sm.so.2.0.0 libopen-pal.so libvt-mpi.so
>> libmpi_cxx.so libopen-pal.so.3 libvt-mpi.so.0
>> libmpi_cxx.so.1 libopen-pal.so.3.0.0 libvt-mpi.so.0.0.0
>> libmpi_cxx.so.1.0.1 libopen-rte.so libvt-mt.a
>> libmpi_f77.so libopen-rte.so.3 libvt-mt.so
>> libmpi_f77.so.1 libopen-rte.so.3.0.0 libvt-mt.so.0
>> libmpi_f77.so.1.0.2 libotf.a libvt-mt.so.0.0.0
>> libmpi_f90.so libotf.so libvt-pomp.a
>> libmpi_f90.so.1 libotf.so.0 libvt.so
>> libmpi_f90.so.1.1.0 libotf.so.0.0.1 libvt.so.0
>> libmpi.so libvt.a libvt.so.0.0.0
>> libmpi.so.1 libvt-hyb.a mpi.mod
>> libmpi.so.1.0.2 libvt-hyb.so openmpi
>> libompitrace.so libvt-hyb.so.0
>> /usr/lib64/openmpi/lib/openmpi:
>> libompi_dbg_msgq.so mca_mtl_psm.so
>> mca_allocator_basic.so mca_notifier_command.so
>> mca_allocator_bucket.so mca_notifier_smtp.so
>> mca_bml_r2.so mca_notifier_syslog.so
>> mca_btl_ofud.so mca_odls_default.so
>> mca_btl_openib.so mca_oob_tcp.so
>> mca_btl_self.so mca_osc_pt2pt.so
>> mca_btl_sm.so mca_osc_rdma.so
>> mca_btl_tcp.so mca_paffinity_hwloc.so
>> mca_carto_auto_detect.so mca_plm_rshd.so
>> mca_carto_file.so mca_plm_rsh.so
>> mca_coll_basic.so mca_plm_slurm.so
>> mca_coll_hierarch.so mca_pml_bfo.so
>> mca_coll_inter.so mca_pml_cm.so
>> mca_coll_self.so mca_pml_csum.so
>> mca_coll_sm.so mca_pml_ob1.so
>> mca_coll_sync.so mca_pml_v.so
>> mca_coll_tuned.so mca_pstat_linux.so
>> mca_crs_none.so mca_pubsub_orte.so
>> mca_debugger_mpir.so mca_ras_cm.so
>> mca_debugger_mpirx.so mca_ras_gridengine.so
>> mca_dpm_orte.so mca_ras_loadleveler.so
>> mca_errmgr_default.so mca_ras_slurm.so
>> mca_ess_env.so mca_rcache_vma.so
>> mca_ess_hnp.so mca_rmaps_load_balance.so
>> mca_ess_singleton.so mca_rmaps_rank_file.so
>> mca_ess_slave.so mca_rmaps_resilient.so
>> mca_ess_slurmd.so mca_rmaps_round_robin.so
>> mca_ess_slurm.so mca_rmaps_seq.so
>> mca_ess_tool.so mca_rmaps_topo.so
>> mca_filem_rsh.so mca_rmcast_tcp.so
>> mca_grpcomm_bad.so mca_rmcast_udp.so
>> mca_grpcomm_basic.so mca_rml_oob.so
>> mca_grpcomm_hier.so mca_routed_binomial.so
>> mca_iof_hnp.so mca_routed_cm.so
>> mca_iof_orted.so mca_routed_direct.so
>> mca_iof_tool.so mca_routed_linear.so
>> mca_maffinity_first_use.so mca_routed_radix.so
>> mca_maffinity_libnuma.so mca_routed_slave.so
>> mca_mpool_fake.so mca_sysinfo_linux.so
>> mca_mpool_rdma.so mca_topo_unity.so
>> mca_mpool_sm.so mca_vprotocol_pessimist.so
>> /usr/lib64/openmpi/share:
>> openmpi vampirtrace
>> /usr/lib64/openmpi/share/openmpi:
>> amca-param-sets help-orte-filem-base.txt
>> doc help-orte-filem-rsh.txt
>> help-coll-sync.txt help-orte-iof.txt
>> help-dash-host.txt help-orte-notifier-command.txt
>> help-ess-base.txt help-orte-notifier-smtp.txt
>> help-hostfile.txt help-orte-odls-base.txt
>> help-mca-base.txt help-orte-ps.txt
>> help-mca-bml-r2.txt help-orte-rmaps-base.txt
>> help-mca-coll-base.txt help-orte-rmaps-lb.txt
>> help-mca-op-base.txt help-orte-rmaps-resilient.txt
>> help-mca-param.txt help-orte-rmaps-rr.txt
>> help-mpi-api.txt help-orte-rmaps-seq.txt
>> help-mpi-btl-base.txt help-orte-rmaps-topo.txt
>> help-mpi-btl-openib-cpc-base.txt help-orte-rmcast-udp.txt
>> help-mpi-btl-openib-cpc-rdmacm.txt help-orte-runtime.txt
>> help-mpi-btl-openib.txt help-orterun.txt
>> help-mpi-btl-sm.txt help-orte-snapc-base.txt
>> help-mpi-btl-tcp.txt help-orte-top.txt
>> help-mpi-coll-sm.txt help-plm-base.txt
>> help-mpi-common-sm.txt help-plm-rshd.txt
>> help-mpi-errors.txt help-plm-rsh.txt
>> help-mpi-pml-bfo.txt help-plm-slurm.txt
>> help-mpi-pml-csum.txt help-ras-base.txt
>> help-mpi-pml-ob1.txt help-ras-gridengine.txt
>> help-mpi-runtime.txt help-ras-slurm.txt
>> help-mpool-base.txt help-regex.txt
>> help-mtl-psm.txt help-rmaps_rank_file.txt
>> help-odls-default.txt help-rmcast-base.txt
>> help-ompi-crcp-base.txt mca-btl-openib-device-params.ini
>> help-ompi-dpm-base.txt mpicc-vt-wrapper-data.txt
>> help-ompi-dpm-orte.txt mpiCC-vt-wrapper-data.txt
>> help-ompi_info.txt mpicc-wrapper-data.txt
>> help-ompi-probe.txt mpiCC-wrapper-data.txt
>> help-ompi-profiler.txt mpic++-vt-wrapper-data.txt
>> help-ompi-pubsub-orte.txt mpic++-wrapper-data.txt
>> help-ompi-server.txt mpicxx-vt-wrapper-data.txt
>> help-oob-tcp.txt mpicxx-wrapper-data.txt
>> help-opal-carto-file.txt mpif77-vt-wrapper-data.txt
>> help-opal-crs-base.txt mpif77-wrapper-data.txt
>> help-opal-crs-none.txt mpif90-vt-wrapper-data.txt
>> help-opal-memory-linux.txt mpif90-wrapper-data.txt
>> help-opal-runtime.txt openmpi-valgrind.supp
>> help-opal-util.txt ortecc-wrapper-data.txt
>> help-opal-wrapper.txt orteCC-wrapper-data.txt
>> help-orte-clean.txt ortec++-wrapper-data.txt
>> help-orted.txt
>> /usr/lib64/openmpi/share/openmpi/amca-param-sets:
>> btl-openib-benchmark example.conf
>> /usr/lib64/openmpi/share/openmpi/doc:
>> COPYRIGHT-ptmalloc2.txt
>> /usr/lib64/openmpi/share/vampirtrace:
>> doc METRICS.SPEC vtcxx-wrapper-data.txt
>> FILTER.SPEC vtcc-wrapper-data.txt vtf77-wrapper-data.txt
>> GROUPS.SPEC vtCC-wrapper-data.txt vtf90-wrapper-data.txt
>> libtool vtc++-wrapper-data.txt
>> /usr/lib64/openmpi/share/vampirtrace/doc:
>> ChangeLog LICENSE opari otf UserManual.html UserManual.pdf
>> /usr/lib64/openmpi/share/vampirtrace/doc/opari:
>> ChangeLog lacsi01.pdf LICENSE opari-logo-100.gif Readme.html
>> /usr/lib64/openmpi/share/vampirtrace/doc/otf:
>> ChangeLog LICENSE otftools.pdf specification.pdf
>>
>> Regards,
>> Jingjie Yeo
>> Ph.D. Student
>> School of Mechanical and Aerospace Engineering
>> Nanyang Technological University, Singapore
>> From: users-bounces_at_[hidden] [users-bounces_at_[hidden]] on behalf of Ralph Castain [rhc_at_[hidden]]
>> Sent: Saturday, 26 January, 2013 10:17:25 AM
>> To: Open MPI Users
>> Subject: Re: [OMPI users] Error when attempting to run LAMMPS on Centos 6.2 with OpenMPI
>>
>> Hmmm...looks like it was built with Slurm support - is your cluster running Slurm? Do you see an "srun" command - e.g., if you do "which srun", what do you get?
>>
>> You should also do an "ls -R /usr/lib64/openmpi" and see what modules were installed. Send that along and let's see why it didn't find anything.
>>
>>
>> On Jan 25, 2013, at 6:05 PM, #YEO JINGJIE# <JYEO1_at_[hidden]> wrote:
>>
>>> Yes sir here it is:
>>>
>>>
>>> Package: Open MPI mockbuild_at_[hidden] Distribution
>>> Open MPI: 1.5.4
>>> Open MPI SVN revision: r25060
>>> Open MPI release date: Aug 18, 2011
>>> Open RTE: 1.5.4
>>> Open RTE SVN revision: r25060
>>> Open RTE release date: Aug 18, 2011
>>> OPAL: 1.5.4
>>> OPAL SVN revision: r25060
>>> OPAL release date: Aug 18, 2011
>>> Ident string: 1.5.4
>>> Prefix: /usr/lib64/openmpi
>>> Configured architecture: x86_64-unknown-linux-gnu
>>> Configure host: c6b8.bsys.dev.centos.org
>>> Configured by: mockbuild
>>> Configured on: Fri Jun 22 06:42:03 UTC 2012
>>> Configure host: c6b8.bsys.dev.centos.org
>>> Built by: mockbuild
>>> Built on: Fri Jun 22 06:46:48 UTC 2012
>>> Built host: c6b8.bsys.dev.centos.org
>>> C bindings: yes
>>> C++ bindings: yes
>>> Fortran77 bindings: yes (all)
>>> Fortran90 bindings: yes
>>> Fortran90 bindings size: small
>>> C compiler: gcc
>>> C compiler absolute: /usr/bin/gcc
>>> C compiler family name: GNU
>>> C compiler version: 4.4.6
>>> C++ compiler: g++
>>> C++ compiler absolute: /usr/bin/g++
>>> Fortran77 compiler: gfortran
>>> Fortran77 compiler abs: /usr/bin/gfortran
>>> Fortran90 compiler: gfortran
>>> Fortran90 compiler abs: /usr/bin/gfortran
>>> C profiling: yes
>>> C++ profiling: yes
>>> Fortran77 profiling: yes
>>> Fortran90 profiling: yes
>>> C++ exceptions: no
>>> Thread support: posix (MPI_THREAD_MULTIPLE: no, progress: no)
>>> Sparse Groups: no
>>> Internal debug support: no
>>> MPI interface warnings: no
>>> MPI parameter check: runtime
>>> Memory profiling support: no
>>> Memory debugging support: no
>>> libltdl support: yes
>>> Heterogeneous support: no
>>> mpirun default --prefix: no
>>> MPI I/O support: yes
>>> MPI_WTIME support: gettimeofday
>>> Symbol vis. support: yes
>>> MPI extensions: affinity example
>>> FT Checkpoint support: no (checkpoint thread: no)
>>> MPI_MAX_PROCESSOR_NAME: 256
>>> MPI_MAX_ERROR_STRING: 256
>>> MPI_MAX_OBJECT_NAME: 64
>>> MPI_MAX_INFO_KEY: 36
>>> MPI_MAX_INFO_VAL: 256
>>> MPI_MAX_PORT_NAME: 1024
>>> MPI_MAX_DATAREP_STRING: 128
>>> MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA memchecker: valgrind (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA memory: linux (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA paffinity: hwloc (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA carto: file (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA maffinity: libnuma (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA timer: linux (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA installdirs: env (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA installdirs: config (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA dpm: orte (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA allocator: basic (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA coll: basic (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA coll: inter (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA coll: self (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA coll: sm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA coll: sync (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA coll: tuned (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA mpool: fake (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA mpool: sm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA pml: bfo (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA pml: csum (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA pml: v (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA bml: r2 (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rcache: vma (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA btl: ofud (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA btl: openib (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA btl: self (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA btl: sm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA btl: tcp (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA topo: unity (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA osc: rdma (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA iof: hnp (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA iof: orted (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA iof: tool (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA oob: tcp (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA odls: default (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ras: cm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ras: gridengine (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ras: loadleveler (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ras: slurm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rmaps: load_balance (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rmaps: resilient (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rmaps: topo (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA rml: oob (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA routed: binomial (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA routed: cm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA routed: direct (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA routed: linear (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA routed: radix (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA routed: slave (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA plm: rsh (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA plm: rshd (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA plm: slurm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA filem: rsh (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA errmgr: default (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ess: env (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ess: hnp (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ess: singleton (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ess: slave (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ess: slurm (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ess: slurmd (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA ess: tool (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA grpcomm: basic (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA grpcomm: hier (MCA v2.0, API v2.0, Component v1.5.4)
>>> MCA notifier: command (MCA v2.0, API v1.0, Component v1.5.4)
>>> MCA notifier: smtp (MCA v2.0, API v1.0, Component v1.5.4)
>>> MCA notifier: syslog (MCA v2.0, API v1.0, Component v1.5.4)
>>>
>>> Regards,
>>> Jingjie Yeo
>>> Ph.D. Student
>>> School of Mechanical and Aerospace Engineering
>>> Nanyang Technological University, Singapore
>>> From: users-bounces_at_[hidden] [users-bounces_at_[hidden]] on behalf of Ralph Castain [rhc_at_[hidden]]
>>> Sent: Saturday, 26 January, 2013 9:58:04 AM
>>> To: Open MPI Users
>>> Subject: Re: [OMPI users] Error when attempting to run LAMMPS on Centos 6.2 with OpenMPI
>>>
>>> Groan - I hate these bundled installs :-(
>>>
>>> If you do "ompi_info", what does it tell you? Could you please send it along?
>>>
>>> Thanks
>>>
>>> On Jan 25, 2013, at 5:51 PM, #YEO JINGJIE# <JYEO1_at_[hidden]> wrote:
>>>
>>>> I tried to follow the installation instructions over here:
>>>>
>>>> http://amusecode.org/doc/install/install-prerequisites-redhat.html
>>>>
>>>> And I am using bash and yum.
>>>>
>>>> Regards,
>>>> Jingjie Yeo
>>>> Ph.D. Student
>>>> School of Mechanical and Aerospace Engineering
>>>> Nanyang Technological University, Singapore
>>>> _______________________________________________
>>>> users mailing list
>>>> users_at_[hidden]
>>>> http://www.open-mpi.org/mailman/listinfo.cgi/users
>>>
>>
>