Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Segmentation Fault--libc.so.6(__libc_start_main...
From: Shafagh Jafer (barfy27_at_[hidden])
Date: 2008-09-20 13:14:00


The thing is that my code runs perfectly when I use MPICH; this problem occurs only when I switch to Open MPI. :(

--- On Sat, 9/20/08, Aurélien Bouteiller <bouteill_at_[hidden]> wrote:

From: Aurélien Bouteiller <bouteill_at_[hidden]>
Subject: Re: [OMPI users] Segmentation Fault--libc.so.6(__libc_start_main...
To: "Open MPI Users" <users_at_[hidden]>
Date: Saturday, September 20, 2008, 6:54 AM

Shafagh,

You have a segfault in your own code. Open MPI detects it, forwards the
error to you, and pretty-prints it, but Open MPI is not the source of the
bug. From the stack trace, I suggest you debug the physicalGetId function
with gdb.
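
For what it's worth, a failing address as small as 0xcf usually means a
member was read through a null or uninitialized object pointer, so the
fault address is just the member's offset. Below is a minimal sketch of
that failure mode (hypothetical class and member names, not the real
CommPhyMPI layout):

  // Minimal sketch (hypothetical names): reading a member through a null
  // object pointer faults at a small address equal to the member's offset,
  // comparable to the 0xcf in your trace.
  #include <cstdio>

  struct Comm {                  // stand-in for the real communication object
      char pad[0xcc];            // padding so the next member sits near 0xcf
      int  id;                   // the value physicalGetId() would return
      int  physicalGetId() const { return id; }
  };

  int main() {
      Comm *comm = nullptr;      // e.g. the MPI layer was never initialized
      // comm->physicalGetId();  // would segfault: "Address not mapped"
      std::printf("initialize comm before calling physicalGetId()\n");
      return 0;
  }

Running rank 0 under the debugger (for example, mpirun -np 1 gdb ./cd++)
and breaking in physicalGetId should show which pointer is null or stale
at that point.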

Aurelien

On Sept. 19, 2008, at 22:22, Shafagh Jafer wrote:

> Hi everyone,
> I need urgent help, please :-(
> I am getting the following error when I run my program. The Open MPI
> compilation went fine, but now I don't understand the source of this
> error:
> ============================================
> [node01:29264] *** Process received signal ***
> [node01:29264] Signal: Segmentation fault (11)
> [node01:29264] Signal code: Address not mapped (1)
> [node01:29264] Failing at address: 0xcf
> [node01:29264] [ 0] /lib/tls/libpthread.so.0 [0x7ccf80]
> [node01:29264] [ 1] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> (physicalGetId__C10CommPhyMPI+0x14) [0x8305880]
> [node01:29264] [ 2] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> (physicalCommGetId__Fv+0x43) [0x82ff81b]
> [node01:29264] [ 3] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> (openComm__16StandAloneLoader+0x1f) [0x80fdf43]
> [node01:29264] [ 4] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> (run__21ParallelMainSimulator+0x1640) [0x81ea53c]
> [node01:29264] [ 5] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> (main+0xde) [0x80a58ce]
> [node01:29264] [ 6] /lib/tls/libc.so.6(__libc_start_main+0xda)
> [0xe3d79a]
> [node01:29264] [ 7] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> (sinh+0x4d) [0x80a2221]
> [node01:29264] *** End of error message ***
> mpirun noticed that job rank 0 with PID 29264 on node node01 exited
> on signal 11 (Segmentation fault).
> ===========================================
>

--
* Dr. Aurélien Bouteiller
* Sr. Research Associate at Innovative Computing Laboratory
* University of Tennessee
* 1122 Volunteer Boulevard, suite 350
* Knoxville, TN 37996
* 865 974 6321
_______________________________________________
users mailing list
users_at_[hidden]
http://www.open-mpi.org/mailman/listinfo.cgi/users