
Open MPI User's Mailing List Archives


Subject: [OMPI users] openmpi 1.2.9 with Xgrid support
From: Ricardo Fernández-Perea (rfernandezperea_at_[hidden])
Date: 2009-02-24 13:52:11


Hi. Since Xgrid support is broken at the moment in 1.3, I am trying to
install 1.2.9 on an Xserve cluster.

I am using the gcc compilers downloaded from http://hpc.sourceforge.net/.

To be sure not to mix compilers, I am using the following configure invocation:

./configure --prefix=/opt/openmpi CC=/usr/local/bin/gcc
CXX=/usr/local/bin/g++ 2>&1 |tee config.out
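(As a sanity check before configuring, something like the following sketch can confirm which compilers are first on the PATH versus the ones passed explicitly via CC/CXX — the point being that Apple's /usr/bin toolchain and the hpc.sourceforge.net /usr/local/bin toolchain must not get mixed.)

```shell
# For each compiler, report where the shell would find it by default.
# Any entry resolving to /usr/bin instead of /usr/local/bin would be a
# sign of possible compiler mixing.
for c in gcc g++ gfortran; do
  p=$(command -v "$c" || echo "not found")
  printf '%s -> %s\n' "$c" "$p"
done
```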

Running

$ ompi_info

gives:

                Open MPI: 1.2.9
   Open MPI SVN revision: r20259
                Open RTE: 1.2.9
   Open RTE SVN revision: r20259
                    OPAL: 1.2.9
       OPAL SVN revision: r20259
                  Prefix: /opt/openmpi
 Configured architecture: i386-apple-darwin9.6.0
           Configured by: sofhtest
           Configured on: Tue Feb 24 18:24:59 CET 2009
          Configure host: nexus10.nlroc
                Built by: sofhtest
                Built on: Tue Feb 24 18:31:38 CET 2009
              Built host: nexus10.nlroc
              C bindings: yes
            C++ bindings: yes
      Fortran77 bindings: yes (single underscore)
      Fortran90 bindings: yes
 Fortran90 bindings size: small
              C compiler: /usr/local/bin/gcc
     C compiler absolute: /usr/local/bin/gcc
            C++ compiler: /usr/local/bin/g++
   C++ compiler absolute: /usr/local/bin/g++
      Fortran77 compiler: gfortran
  Fortran77 compiler abs: /usr/local/bin/gfortran
      Fortran90 compiler: gfortran
  Fortran90 compiler abs: /usr/local/bin/gfortran
             C profiling: yes
           C++ profiling: yes
     Fortran77 profiling: yes
     Fortran90 profiling: yes
          C++ exceptions: no
          Thread support: posix (mpi: no, progress: no)
  Internal debug support: no
     MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
         libltdl support: yes
   Heterogeneous support: yes
 mpirun default --prefix: no
           MCA backtrace: execinfo (MCA v1.0, API v1.0, Component v1.2.9)
              MCA memory: darwin (MCA v1.0, API v1.0, Component v1.2.9)
           MCA maffinity: first_use (MCA v1.0, API v1.0, Component v1.2.9)
               MCA timer: darwin (MCA v1.0, API v1.0, Component v1.2.9)
         MCA installdirs: env (MCA v1.0, API v1.0, Component v1.2.9)
         MCA installdirs: config (MCA v1.0, API v1.0, Component v1.2.9)
           MCA allocator: basic (MCA v1.0, API v1.0, Component v1.0)
           MCA allocator: bucket (MCA v1.0, API v1.0, Component v1.0)
                MCA coll: basic (MCA v1.0, API v1.0, Component v1.2.9)
                MCA coll: self (MCA v1.0, API v1.0, Component v1.2.9)
                MCA coll: sm (MCA v1.0, API v1.0, Component v1.2.9)
                MCA coll: tuned (MCA v1.0, API v1.0, Component v1.2.9)
                  MCA io: romio (MCA v1.0, API v1.0, Component v1.2.9)
               MCA mpool: rdma (MCA v1.0, API v1.0, Component v1.2.9)
               MCA mpool: sm (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA pml: cm (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA pml: ob1 (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA bml: r2 (MCA v1.0, API v1.0, Component v1.2.9)
              MCA rcache: vma (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA btl: self (MCA v1.0, API v1.0.1, Component v1.2.9)
                 MCA btl: sm (MCA v1.0, API v1.0.1, Component v1.2.9)
                 MCA btl: tcp (MCA v1.0, API v1.0.1, Component v1.0)
                MCA topo: unity (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA osc: pt2pt (MCA v1.0, API v1.0, Component v1.2.9)
              MCA errmgr: hnp (MCA v1.0, API v1.3, Component v1.2.9)
              MCA errmgr: orted (MCA v1.0, API v1.3, Component v1.2.9)
              MCA errmgr: proxy (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA gpr: null (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA gpr: proxy (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA gpr: replica (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA iof: proxy (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA iof: svc (MCA v1.0, API v1.0, Component v1.2.9)
                  MCA ns: proxy (MCA v1.0, API v2.0, Component v1.2.9)
                  MCA ns: replica (MCA v1.0, API v2.0, Component v1.2.9)
                 MCA oob: tcp (MCA v1.0, API v1.0, Component v1.0)
                 MCA ras: dash_host (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA ras: gridengine (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA ras: localhost (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA ras: xgrid (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA rds: hostfile (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA rds: proxy (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA rds: resfile (MCA v1.0, API v1.3, Component v1.2.9)
               MCA rmaps: round_robin (MCA v1.0, API v1.3, Component v1.2.9)
                MCA rmgr: proxy (MCA v1.0, API v2.0, Component v1.2.9)
                MCA rmgr: urm (MCA v1.0, API v2.0, Component v1.2.9)
                 MCA rml: oob (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA pls: gridengine (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA pls: proxy (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA pls: rsh (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA pls: xgrid (MCA v1.0, API v1.3, Component v1.2.9)
                 MCA sds: env (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA sds: pipe (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA sds: seed (MCA v1.0, API v1.0, Component v1.2.9)
                 MCA sds: singleton (MCA v1.0, API v1.0, Component v1.2.9)

The program seems to run correctly, but mpirun finishes with:

2009-02-24 19:21:52.164 mpirun[22068:10b] *** Terminating app due to
uncaught exception 'NSInvalidArgumentException', reason: '***
-[NSKVONotifying_XGConnection<0x216cf0> finalize]: called when collecting
not enabled'
2009-02-24 19:21:52.165 mpirun[22068:10b] Stack: (
    2440954123,
    2435145275,
    2440982557,
    1687435,
    1679648,
    503955,
    379742,
    365359,
    9986
)
Trace/BPT trap

Any idea what I should look for?

NOTE: I am using Xgrid with Kerberos.
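(For completeness, this is roughly how the job is launched. The controller hostname below is a placeholder, not the real one; with Kerberos/SSO authentication no controller password is exported, which is why only the hostname variable is set in this sketch.)

```shell
# Point Open MPI's xgrid launcher at the Xgrid controller.
# With Kerberos SSO, XGRID_CONTROLLER_PASSWORD is deliberately left unset.
export XGRID_CONTROLLER_HOSTNAME=controller.example.com   # placeholder hostname

# Launch as usual; the xgrid pls component picks up the variable above.
mpirun -np 4 ./my_mpi_program   # my_mpi_program stands in for the real binary
```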

Yours
Ricardo

PS: I hope I make myself understandable; English is not my first language.