
Subject: Re: [OMPI users] Segmentation Fault--libc.so.6(__libc_start_main...
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2008-09-22 11:01:26


On Sep 21, 2008, at 3:46 PM, Shafagh Jafer wrote:

> Yes I am using openmpi mpicc and mpic++ to compile my code,

Are you 100% sure that you're using Open MPI's mpicc / mpic++ (and
not MPICH's)? That could well be the cause of the error.
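
An easy way to check is to ask the wrapper itself. Assuming Open MPI
was installed under /opt/openmpi/1.2.7 (adjust for your install),
something like:

   shell$ which mpicc mpic++
   shell$ mpicc --showme
   shell$ mpicc --showme:compile

Open MPI's wrappers understand the --showme options; MPICH's mpicc
does not (it uses -show instead), so if --showme errors out, you're
picking up the wrong wrapper.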

> and I only have openmpi's lib directory in my LD_LIBRARY_PATH. To
> make sure that I am including the mpi.h of openmpi, I added this
> line to my source code:
> #include "/opt/openmpi/1.2.7/include/mpi.h"
> instead of only saying
> #include "mpi.h"
> but now I get the following errors, which show that the wrapper
> compiler is not adding "-I${prefix}/include/openmpi" in front of
> the included files from the cxx folder.
>
> In file included from /opt/openmpi/1.2.7/include/mpi.h:1795,
> from CommPhyMPI.cc:36:
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:167:
> openmpi/ompi/mpi/cxx/constants.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:168:
> openmpi/ompi/mpi/cxx/functions.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:169:
> openmpi/ompi/mpi/cxx/datatype.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:193:
> openmpi/ompi/mpi/cxx/exception.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:194:
> openmpi/ompi/mpi/cxx/op.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:195:
> openmpi/ompi/mpi/cxx/status.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:196:
> openmpi/ompi/mpi/cxx/request.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:197:
> openmpi/ompi/mpi/cxx/group.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:198:
> openmpi/ompi/mpi/cxx/comm.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:199:
> openmpi/ompi/mpi/cxx/win.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:200:
> openmpi/ompi/mpi/cxx/file.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:201:
> openmpi/ompi/mpi/cxx/errhandler.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:202:
> openmpi/ompi/mpi/cxx/intracomm.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:203:
> openmpi/ompi/mpi/cxx/topology.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:204:
> openmpi/ompi/mpi/cxx/intercomm.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:205:
> openmpi/ompi/mpi/cxx/info.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:244:
> openmpi/ompi/mpi/cxx/datatype_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:245:
> openmpi/ompi/mpi/cxx/functions_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:246:
> openmpi/ompi/mpi/cxx/request_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:247:
> openmpi/ompi/mpi/cxx/comm_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:248:
> openmpi/ompi/mpi/cxx/intracomm_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:249:
> openmpi/ompi/mpi/cxx/topology_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:250:
> openmpi/ompi/mpi/cxx/intercomm_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:251:
> openmpi/ompi/mpi/cxx/group_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:252:
> openmpi/ompi/mpi/cxx/op_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:253:
> openmpi/ompi/mpi/cxx/errhandler_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:254:
> openmpi/ompi/mpi/cxx/status_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:255:
> openmpi/ompi/mpi/cxx/info_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:256:
> openmpi/ompi/mpi/cxx/win_inln.h: No such file or directory
> /opt/openmpi/1.2.7/include/openmpi/ompi/mpi/cxx/mpicxx.h:257:
> openmpi/ompi/mpi/cxx/file_inln.h: No such file or directory
> make[1]: *** [/nfs/sjafer/phd/openMPI/latest_cd++_timewarp/warped/
> TimeWarp/src/obj/CommPhyMPI.o] Error 1
>
>
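Those "No such file or directory" errors actually reinforce the point
above: when compiling C++, mpi.h (found here via your absolute path)
pulls in mpicxx.h, which refers to the C++ headers by paths relative
to Open MPI's include directory, and those only resolve if the
compiler is given the right -I flags, which is exactly what Open
MPI's wrapper adds for you. Since they're not being added, the mpic++
you're invoking is almost certainly not Open MPI's. Put the include
back the portable way and invoke the wrapper by its full path to take
your PATH out of the equation; a minimal sketch (the file name is
taken from your output):

   #include <mpi.h>   /* not "/opt/openmpi/1.2.7/include/mpi.h" */

   shell$ /opt/openmpi/1.2.7/bin/mpic++ -c CommPhyMPI.cc
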
> --- On Sun, 9/21/08, Aurélien Bouteiller <bouteill_at_[hidden]>
> wrote:
> From: Aurélien Bouteiller <bouteill_at_[hidden]>
> Subject: Re: [OMPI users] Segmentation Fault--libc.so.
> 6(__libc_start_main...
> To: "Open MPI Users" <users_at_[hidden]>
> Date: Sunday, September 21, 2008, 9:35 AM
>
> Are you sure that you have matching versions of the MPI library and
> the mpi.h file? Open MPI and MPICH have different internal types for
> the opaque MPI objects (such as MPI_Comm). If your mpi.h and MPI
> library don't match, you'll pass those objects as integers to a
> library that expects pointers, which will obviously segfault very
> badly. Please make sure that you actually use the mpi.h from Open
> MPI (by using Open MPI's mpicc) to compile your program when using
> Open MPI. Also make sure that you don't have another version of
> libmpi in your LD_LIBRARY_PATH that could be picked up instead of
> the one you compiled against.
>
> Aurelien
>
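Aurelien's LD_LIBRARY_PATH point is easy to check directly: ldd shows
which libmpi your executable will resolve at run time (the binary
path is taken from your stack trace below):

   shell$ ldd /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++ | grep mpi

If that doesn't print a path under /opt/openmpi/1.2.7/lib, you're
mixing libraries.
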
> On Sep 21, 2008, at 4:38 AM, Shafagh Jafer wrote:
>
> >
> > OK. I noticed that whenever my code uses an MPI function that has
> > "OMPI_DECLSPEC" in front of it in mpi.h, I get this segfault
> > error. Could someone please tell me what "OMPI_DECLSPEC" is? Is it
> > a macro that I need to enable?
> > For example, in MPICH the function MPI_Comm_size looks like the
> > following in mpi.h:
> >
> > int MPI_Comm_size(MPI_Comm, int *);
> >
> > but the same function in OMPI appears as follows:
> > OMPI_DECLSPEC int MPI_Comm_size(MPI_Comm comm, int *size);
> >
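(For the record: OMPI_DECLSPEC is not something you enable. It is a
macro that Open MPI's own headers define, used to control symbol
visibility when libmpi is built as a shared library; on most Unix
compilers it expands to roughly

   #define OMPI_DECLSPEC __attribute__((visibility("default")))

or to nothing at all, so it changes nothing about how you call the
functions. The segfaults correlate with it only because those are
exactly the calls that cross into libmpi.)
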
> > --- On Sat, 9/20/08, Shafagh Jafer <barfy27_at_[hidden]> wrote:
> > From: Shafagh Jafer <barfy27_at_[hidden]>
> > Subject: Re: [OMPI users] Segmentation Fault--libc.so.
> > 6(__libc_start_main...
> > To: "Open MPI Users" <users_at_[hidden]>
> > Date: Saturday, September 20, 2008, 9:50 PM
> >
> > My code was working perfectly when I had it with MPICH; now I have
> > replaced that with OMPI. Could that be the problem? Do I need to
> > change any part of my source code when I migrate from MPICH-1.2.6
> > to OpenMPI-1.2.7? Please let me know.
> >
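(Moving from MPICH to Open MPI should not require any source changes,
since MPI is a source-level standard, but it does require recompiling
every object file against Open MPI's headers and relinking against
its libraries. Depending on how your makefiles are set up, something
like this hypothetical invocation forces the right wrappers:

   shell$ make clean
   shell$ make CC=/opt/openmpi/1.2.7/bin/mpicc CXX=/opt/openmpi/1.2.7/bin/mpic++

A stale .o compiled against MPICH's mpi.h and then linked against
Open MPI's libmpi is exactly the pointer-vs-integer mismatch Aurelien
describes above.)
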
> > --- On Sat, 9/20/08, Aurélien Bouteiller <bouteill_at_[hidden]>
> > wrote:
> > From: Aurélien Bouteiller <bouteill_at_[hidden]>
> > Subject: Re: [OMPI users] Segmentation Fault--libc.so.
> > 6(__libc_start_main...
> > To: "Open MPI Users" <users_at_[hidden]>
> > Date: Saturday, September 20, 2008, 6:54 AM
> >
> > Shafagh,
> >
> > You have a segfault in your own code. Open MPI detects it,
> > forwards the error to you, and pretty-prints it, but Open MPI is
> > not the source of the bug. From the stack trace, I suggest you
> > debug the physicalGetId function with gdb.
> >
> > Aurelien
> >
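If you want to follow Aurelien's suggestion, the simplest first step
is to run the binary under gdb directly (a sketch: the path is from
the trace below, and it assumes the crash reproduces with a single
process, which Open MPI's singleton MPI_INIT support allows):

   shell$ gdb /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
   (gdb) run <your usual arguments>
   (gdb) backtrace

That should stop at the faulting instruction in physicalGetId and, if
the binary was compiled with -g, give you a file and line number.
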
> > On Sep 19, 2008, at 10:22 PM, Shafagh Jafer wrote:
> >
> > > Hi everyone,
> > > I need urgent help, please :-(
> > > I am getting the following error when I run my program. The
> > > OpenMPI compilation went fine, but now I don't understand the
> > > source of this error:
> > > ============================================
> > > [node01:29264] *** Process received signal ***
> > > [node01:29264] Signal: Segmentation fault (11)
> > > [node01:29264] Signal code: Address not mapped (1)
> > > [node01:29264] Failing at address: 0xcf
> > > [node01:29264] [ 0] /lib/tls/libpthread.so.0 [0x7ccf80]
> > > [node01:29264] [ 1] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> > > (physicalGetId__C10CommPhyMPI+0x14) [0x8305880]
> > > [node01:29264] [ 2] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> > > (physicalCommGetId__Fv+0x43) [0x82ff81b]
> > > [node01:29264] [ 3] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> > > (openComm__16StandAloneLoader+0x1f) [0x80fdf43]
> > > [node01:29264] [ 4] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> > > (run__21ParallelMainSimulator+0x1640) [0x81ea53c]
> > > [node01:29264] [ 5] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> > > (main+0xde) [0x80a58ce]
> > > [node01:29264] [ 6] /lib/tls/libc.so.6(__libc_start_main+0xda)
> > > [0xe3d79a]
> > > [node01:29264] [ 7] /nfs/sjafer/phd/openMPI/latest_cd++_timewarp/cd++
> > > (sinh+0x4d) [0x80a2221]
> > > [node01:29264] *** End of error message ***
> > > mpirun noticed that job rank 0 with PID 29264 on node node01
> > > exited on signal 11 (Segmentation fault).
> > > ===========================================
> > >
> >
> >
> >
> > --
> > * Dr. Aurélien Bouteiller
> > * Sr. Research Associate at Innovative Computing Laboratory
> > * University of Tennessee
> > * 1122 Volunteer Boulevard, suite 350
> > * Knoxville, TN 37996
> > * 865 974 6321
> >
> >
> >
> >
> >
>
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users

-- 
Jeff Squyres
Cisco Systems