Open MPI User's Mailing List Archives


From: Peng Wang (pewang_at_[hidden])
Date: 2006-08-03 16:04:46


Hi there,

How are you guys doing?

I'm getting a segfault at the very beginning of the Aztec library that one of our
applications uses. Please find the attached config.log.gz; the stack trace and
ompi_info output are included below.

The following stack trace repeats for each process:
====

Failing at addr:(nil)
Signal:11 info.si_errno:0(Success) si_code:128()
Failing at addr:(nil)
[0] func:pgeofe [0x606665]
[1] func:/lib64/tls/libpthread.so.0 [0x3f8460c420]
[2] func:pgeofe(MPI_Comm_size+0x4d) [0x549c5d]
[3] func:pgeofe(parallel_info+0x3e) [0x4fcffe]
[4] func:pgeofe(AZ_set_proc_config+0x34) [0x4e8594]
[5] func:pgeofe(MAIN__+0xb2) [0x44ba8a]
[6] func:pgeofe(main+0x32) [0x4420aa]
[7] func:/lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x3f83b1c4bb]
[8] func:pgeofe [0x441fea]
*** End of error message ***

====
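
In case it is useful, here is a minimal C sketch of how I read the failing call path above (MAIN__ -> AZ_set_proc_config -> parallel_info -> MPI_Comm_size). This is not the Aztec source, just my own stand-in code, and the comment about the communicator handle is only a guess at where a fault at addr:(nil) could come from:

====

/* Sketch of the call path in the trace above; the names mirror the Aztec
 * symbols, but the bodies are stand-ins, not Aztec code. */
#include <stdio.h>
#include <mpi.h>

/* Stand-in for Aztec's parallel_info(): query rank and size. */
static void parallel_info_sketch(int *node, int *nprocs, MPI_Comm comm)
{
    MPI_Comm_size(comm, nprocs);   /* frame [2] in the trace above */
    MPI_Comm_rank(comm, node);
}

/* Stand-in for AZ_set_proc_config(). My guess: if the Fortran main passed
 * its integer MPI_COMM_WORLD handle straight into C code without an
 * MPI_Comm_f2c() conversion, 'comm' would not be a valid Open MPI
 * communicator (Open MPI's C handles are pointers), and MPI_Comm_size
 * could fault at a nil address much like the trace shows. */
static void az_set_proc_config_sketch(int proc_config[], MPI_Comm comm)
{
    parallel_info_sketch(&proc_config[0], &proc_config[1], comm);
}

int main(int argc, char **argv)
{
    int proc_config[4] = {0};

    MPI_Init(&argc, &argv);
    az_set_proc_config_sketch(proc_config, MPI_COMM_WORLD);
    printf("rank %d of %d\n", proc_config[0], proc_config[1]);
    MPI_Finalize();
    return 0;
}

====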

ompi_info output:

                 Open MPI: 1.1
    Open MPI SVN revision: r10477
                 Open RTE: 1.1
    Open RTE SVN revision: r10477
                     OPAL: 1.1
        OPAL SVN revision: r10477
                   Prefix: /home/pewang/openmpi
  Configured architecture: x86_64-unknown-linux-gnu
            Configured by: pewang
            Configured on: Thu Aug 3 14:22:22 EDT 2006
           Configure host: bl-geol-karst.geology.indiana.edu
                 Built by: pewang
                 Built on: Thu Aug 3 14:34:17 EDT 2006
               Built host: bl-geol-karst.geology.indiana.edu
               C bindings: yes
             C++ bindings: yes
       Fortran77 bindings: yes (all)
       Fortran90 bindings: yes
  Fortran90 bindings size: small
               C compiler: gcc
      C compiler absolute: /usr/bin/gcc
             C++ compiler: g++
    C++ compiler absolute: /usr/bin/g++
       Fortran77 compiler: ifort
   Fortran77 compiler abs: /opt/intel/fce/9.0/bin/ifort
       Fortran90 compiler: ifort
   Fortran90 compiler abs: /opt/intel/fce/9.0/bin/ifort
              C profiling: yes
            C++ profiling: yes
      Fortran77 profiling: yes
      Fortran90 profiling: yes
           C++ exceptions: no
           Thread support: posix (mpi: yes, progress: no)
   Internal debug support: no
      MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
          libltdl support: yes
               MCA memory: ptmalloc2 (MCA v1.0, API v1.0, Component v1.1)
            MCA paffinity: linux (MCA v1.0, API v1.0, Component v1.1)
            MCA maffinity: first_use (MCA v1.0, API v1.0, Component v1.1)
            MCA maffinity: libnuma (MCA v1.0, API v1.0, Component v1.1)
                MCA timer: linux (MCA v1.0, API v1.0, Component v1.1)
            MCA allocator: basic (MCA v1.0, API v1.0, Component v1.0)
            MCA allocator: bucket (MCA v1.0, API v1.0, Component v1.0)
                 MCA coll: basic (MCA v1.0, API v1.0, Component v1.1)
                 MCA coll: hierarch (MCA v1.0, API v1.0, Component v1.1)
                 MCA coll: self (MCA v1.0, API v1.0, Component v1.1)
                 MCA coll: sm (MCA v1.0, API v1.0, Component v1.1)
                 MCA coll: tuned (MCA v1.0, API v1.0, Component v1.1)
                MCA mpool: sm (MCA v1.0, API v1.0, Component v1.1)
                  MCA pml: ob1 (MCA v1.0, API v1.0, Component v1.1)
                  MCA bml: r2 (MCA v1.0, API v1.0, Component v1.1)
               MCA rcache: rb (MCA v1.0, API v1.0, Component v1.1)
                  MCA btl: self (MCA v1.0, API v1.0, Component v1.1)
                  MCA btl: sm (MCA v1.0, API v1.0, Component v1.1)
                  MCA btl: tcp (MCA v1.0, API v1.0, Component v1.0)
                 MCA topo: unity (MCA v1.0, API v1.0, Component v1.1)
                  MCA osc: pt2pt (MCA v1.0, API v1.0, Component v1.0)
                  MCA gpr: null (MCA v1.0, API v1.0, Component v1.1)
                  MCA gpr: proxy (MCA v1.0, API v1.0, Component v1.1)
                  MCA gpr: replica (MCA v1.0, API v1.0, Component v1.1)
                  MCA iof: proxy (MCA v1.0, API v1.0, Component v1.1)
                  MCA iof: svc (MCA v1.0, API v1.0, Component v1.1)
                   MCA ns: proxy (MCA v1.0, API v1.0, Component v1.1)
                   MCA ns: replica (MCA v1.0, API v1.0, Component v1.1)
                  MCA oob: tcp (MCA v1.0, API v1.0, Component v1.0)
                  MCA ras: dash_host (MCA v1.0, API v1.0, Component v1.1)
                  MCA ras: hostfile (MCA v1.0, API v1.0, Component v1.1)
                  MCA ras: localhost (MCA v1.0, API v1.0, Component v1.1)
                  MCA ras: slurm (MCA v1.0, API v1.0, Component v1.1)
                  MCA rds: hostfile (MCA v1.0, API v1.0, Component v1.1)
                  MCA rds: resfile (MCA v1.0, API v1.0, Component v1.1)
                MCA rmaps: round_robin (MCA v1.0, API v1.0, Component v1.1)
                 MCA rmgr: proxy (MCA v1.0, API v1.0, Component v1.1)
                 MCA rmgr: urm (MCA v1.0, API v1.0, Component v1.1)
                  MCA rml: oob (MCA v1.0, API v1.0, Component v1.1)
                  MCA pls: fork (MCA v1.0, API v1.0, Component v1.1)
                  MCA pls: rsh (MCA v1.0, API v1.0, Component v1.1)
                  MCA pls: slurm (MCA v1.0, API v1.0, Component v1.1)
                  MCA sds: env (MCA v1.0, API v1.0, Component v1.1)
                  MCA sds: seed (MCA v1.0, API v1.0, Component v1.1)
                  MCA sds: singleton (MCA v1.0, API v1.0, Component v1.1)
                  MCA sds: pipe (MCA v1.0, API v1.0, Component v1.1)
                  MCA sds: slurm (MCA v1.0, API v1.0, Component v1.1)

Thanks,
Peng

===============================================
Peng Wang, HPC/RAC
UITS, Indiana University
(812)855-9916 http://www.indiana.edu/~rac/hpc
===============================================