Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Error when attempting to run LAMMPS on Centos 6.2 with OpenMPI
From: Ralph Castain (rhc_at_[hidden])
Date: 2013-01-25 21:17:25


Hmmm...looks like it was built with Slurm support - is your cluster running Slurm? Do you see an "srun" command - e.g., if you do "which srun", what do you get?
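
For example, a quick check from a shell (a minimal sketch - the grep filter is just an illustrative extra, assuming ompi_info is on your PATH):

    which srun
    ompi_info | grep -i slurm

The first tells you whether a Slurm launcher is installed at all; the second filters the component list for Slurm entries - the output quoted below does show slurm components under ras, plm, and ess.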

You should also do an "ls -R /usr/lib64/openmpi" and see what modules were installed. Send that along and let's see why it didn't find anything.
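
Something like this would capture it all in one file to attach (a sketch, assuming bash; the file name is just an example):

    ls -R /usr/lib64/openmpi > openmpi_modules.txt 2>&1

That way nothing gets truncated in the mail body.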

On Jan 25, 2013, at 6:05 PM, #YEO JINGJIE# <JYEO1_at_[hidden]> wrote:

> Yes sir, here it is:
>
> Package: Open MPI mockbuild_at_[hidden] Distribution
> Open MPI: 1.5.4
> Open MPI SVN revision: r25060
> Open MPI release date: Aug 18, 2011
> Open RTE: 1.5.4
> Open RTE SVN revision: r25060
> Open RTE release date: Aug 18, 2011
> OPAL: 1.5.4
> OPAL SVN revision: r25060
> OPAL release date: Aug 18, 2011
> Ident string: 1.5.4
> Prefix: /usr/lib64/openmpi
> Configured architecture: x86_64-unknown-linux-gnu
> Configure host: c6b8.bsys.dev.centos.org
> Configured by: mockbuild
> Configured on: Fri Jun 22 06:42:03 UTC 2012
> Configure host: c6b8.bsys.dev.centos.org
> Built by: mockbuild
> Built on: Fri Jun 22 06:46:48 UTC 2012
> Built host: c6b8.bsys.dev.centos.org
> C bindings: yes
> C++ bindings: yes
> Fortran77 bindings: yes (all)
> Fortran90 bindings: yes
> Fortran90 bindings size: small
> C compiler: gcc
> C compiler absolute: /usr/bin/gcc
> C compiler family name: GNU
> C compiler version: 4.4.6
> C++ compiler: g++
> C++ compiler absolute: /usr/bin/g++
> Fortran77 compiler: gfortran
> Fortran77 compiler abs: /usr/bin/gfortran
> Fortran90 compiler: gfortran
> Fortran90 compiler abs: /usr/bin/gfortran
> C profiling: yes
> C++ profiling: yes
> Fortran77 profiling: yes
> Fortran90 profiling: yes
> C++ exceptions: no
> Thread support: posix (MPI_THREAD_MULTIPLE: no, progress: no)
> Sparse Groups: no
> Internal debug support: no
> MPI interface warnings: no
> MPI parameter check: runtime
> Memory profiling support: no
> Memory debugging support: no
> libltdl support: yes
> Heterogeneous support: no
> mpirun default --prefix: no
> MPI I/O support: yes
> MPI_WTIME support: gettimeofday
> Symbol vis. support: yes
> MPI extensions: affinity example
> FT Checkpoint support: no (checkpoint thread: no)
> MPI_MAX_PROCESSOR_NAME: 256
> MPI_MAX_ERROR_STRING: 256
> MPI_MAX_OBJECT_NAME: 64
> MPI_MAX_INFO_KEY: 36
> MPI_MAX_INFO_VAL: 256
> MPI_MAX_PORT_NAME: 1024
> MPI_MAX_DATAREP_STRING: 128
> MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.5.4)
> MCA memchecker: valgrind (MCA v2.0, API v2.0, Component v1.5.4)
> MCA memory: linux (MCA v2.0, API v2.0, Component v1.5.4)
> MCA paffinity: hwloc (MCA v2.0, API v2.0, Component v1.5.4)
> MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.5.4)
> MCA carto: file (MCA v2.0, API v2.0, Component v1.5.4)
> MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.5.4)
> MCA maffinity: libnuma (MCA v2.0, API v2.0, Component v1.5.4)
> MCA timer: linux (MCA v2.0, API v2.0, Component v1.5.4)
> MCA installdirs: env (MCA v2.0, API v2.0, Component v1.5.4)
> MCA installdirs: config (MCA v2.0, API v2.0, Component v1.5.4)
> MCA dpm: orte (MCA v2.0, API v2.0, Component v1.5.4)
> MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.5.4)
> MCA allocator: basic (MCA v2.0, API v2.0, Component v1.5.4)
> MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.5.4)
> MCA coll: basic (MCA v2.0, API v2.0, Component v1.5.4)
> MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.5.4)
> MCA coll: inter (MCA v2.0, API v2.0, Component v1.5.4)
> MCA coll: self (MCA v2.0, API v2.0, Component v1.5.4)
> MCA coll: sm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA coll: sync (MCA v2.0, API v2.0, Component v1.5.4)
> MCA coll: tuned (MCA v2.0, API v2.0, Component v1.5.4)
> MCA mpool: fake (MCA v2.0, API v2.0, Component v1.5.4)
> MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.5.4)
> MCA mpool: sm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA pml: bfo (MCA v2.0, API v2.0, Component v1.5.4)
> MCA pml: csum (MCA v2.0, API v2.0, Component v1.5.4)
> MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.5.4)
> MCA pml: v (MCA v2.0, API v2.0, Component v1.5.4)
> MCA bml: r2 (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rcache: vma (MCA v2.0, API v2.0, Component v1.5.4)
> MCA btl: ofud (MCA v2.0, API v2.0, Component v1.5.4)
> MCA btl: openib (MCA v2.0, API v2.0, Component v1.5.4)
> MCA btl: self (MCA v2.0, API v2.0, Component v1.5.4)
> MCA btl: sm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA btl: tcp (MCA v2.0, API v2.0, Component v1.5.4)
> MCA topo: unity (MCA v2.0, API v2.0, Component v1.5.4)
> MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.5.4)
> MCA osc: rdma (MCA v2.0, API v2.0, Component v1.5.4)
> MCA iof: hnp (MCA v2.0, API v2.0, Component v1.5.4)
> MCA iof: orted (MCA v2.0, API v2.0, Component v1.5.4)
> MCA iof: tool (MCA v2.0, API v2.0, Component v1.5.4)
> MCA oob: tcp (MCA v2.0, API v2.0, Component v1.5.4)
> MCA odls: default (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ras: cm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ras: gridengine (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ras: loadleveler (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ras: slurm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rmaps: load_balance (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rmaps: resilient (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rmaps: topo (MCA v2.0, API v2.0, Component v1.5.4)
> MCA rml: oob (MCA v2.0, API v2.0, Component v1.5.4)
> MCA routed: binomial (MCA v2.0, API v2.0, Component v1.5.4)
> MCA routed: cm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA routed: direct (MCA v2.0, API v2.0, Component v1.5.4)
> MCA routed: linear (MCA v2.0, API v2.0, Component v1.5.4)
> MCA routed: radix (MCA v2.0, API v2.0, Component v1.5.4)
> MCA routed: slave (MCA v2.0, API v2.0, Component v1.5.4)
> MCA plm: rsh (MCA v2.0, API v2.0, Component v1.5.4)
> MCA plm: rshd (MCA v2.0, API v2.0, Component v1.5.4)
> MCA plm: slurm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA filem: rsh (MCA v2.0, API v2.0, Component v1.5.4)
> MCA errmgr: default (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ess: env (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ess: hnp (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ess: singleton (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ess: slave (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ess: slurm (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ess: slurmd (MCA v2.0, API v2.0, Component v1.5.4)
> MCA ess: tool (MCA v2.0, API v2.0, Component v1.5.4)
> MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.5.4)
> MCA grpcomm: basic (MCA v2.0, API v2.0, Component v1.5.4)
> MCA grpcomm: hier (MCA v2.0, API v2.0, Component v1.5.4)
> MCA notifier: command (MCA v2.0, API v1.0, Component v1.5.4)
> MCA notifier: smtp (MCA v2.0, API v1.0, Component v1.5.4)
> MCA notifier: syslog (MCA v2.0, API v1.0, Component v1.5.4)
>
> Regards,
> Jingjie Yeo
> Ph.D. Student
> School of Mechanical and Aerospace Engineering
> Nanyang Technological University, Singapore
> From: users-bounces_at_[hidden] [users-bounces_at_[hidden]] on behalf of Ralph Castain [rhc_at_[hidden]]
> Sent: Saturday, 26 January, 2013 9:58:04 AM
> To: Open MPI Users
> Subject: Re: [OMPI users] Error when attempting to run LAMMPS on Centos 6.2 with OpenMPI
>
> Groan - I hate these bundled installs :-(
>
> If you do "ompi_info", what does it tell you? Could you please send it along?
>
> Thanks
>
> On Jan 25, 2013, at 5:51 PM, #YEO JINGJIE# <JYEO1_at_[hidden]> wrote:
>
>> I tried to follow the installation instructions over here:
>>
>> http://amusecode.org/doc/install/install-prerequisites-redhat.html
>>
>> And I am using bash and yum.
>>
>> Regards,
>> Jingjie Yeo
>> Ph.D. Student
>> School of Mechanical and Aerospace Engineering
>> Nanyang Technological University, Singapore