Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] MPI_Irecv segmentation fault
From: Everette Clemmer (clemmece_at_[hidden])
Date: 2009-09-28 11:15:03


Yes I did; I forgot to mention that in my last message. Most of the
example code I've seen online passes the buffer variable by reference...

I think I've gotten past the segfault at this point, but it looks like
MPI_Isend never completes. I have an MPI_Test() call that sets a flag
immediately after the MPI_Irecv, but the process seems to hang before
it reaches it. I'm not sure why it wouldn't complete.

Everette

On Tue, Sep 22, 2009 at 9:24 AM, jody <jody.xha_at_[hidden]> wrote:
> Did you also change the "&buffer" to buffer in your MPI_Send call?
>
> Jody
>
> On Tue, Sep 22, 2009 at 1:38 PM, Everette Clemmer <clemmece_at_[hidden]> wrote:
>> Hmm, tried changing MPI_Irecv( &buffer....) to MPI_Irecv( buffer...)
>> and still no luck. Stack trace follows if that's helpful:
>>
>> prompt$ mpirun -np 2 ./display_test_debug
>> Sending 'q' from node 0 to node 1
>> [COMPUTER:50898] *** Process received signal ***
>> [COMPUTER:50898] Signal: Segmentation fault (11)
>> [COMPUTER:50898] Signal code:  (0)
>> [COMPUTER:50898] Failing at address: 0x0
>> [COMPUTER:50898] [ 0] 2   libSystem.B.dylib    0x00007fff87e280aa _sigtramp + 26
>> [COMPUTER:50898] [ 1] 3   ???                  0x0000000000000000 0x0 + 0
>> [COMPUTER:50898] [ 2] 4   GLUT                 0x0000000100024a21 glutMainLoop + 261
>> [COMPUTER:50898] [ 3] 5   display_test_debug   0x0000000100001444 xsMainLoop + 67
>> [COMPUTER:50898] [ 4] 6   display_test_debug   0x0000000100001335 main + 59
>> [COMPUTER:50898] [ 5] 7   display_test_debug   0x0000000100000d9c start + 52
>> [COMPUTER:50898] [ 6] 8   ???                  0x0000000000000001 0x0 + 1
>> [COMPUTER:50898] *** End of error message ***
>> mpirun noticed that job rank 0 with PID 50897 on node COMPUTER.local
>> exited on signal 15 (Terminated).
>> 1 additional process aborted (not shown)
>>
>> Thanks,
>> Everette
>>
>>
>> On Tue, Sep 22, 2009 at 2:28 AM, Ake Sandgren <ake.sandgren_at_[hidden]> wrote:
>>> On Mon, 2009-09-21 at 19:26 -0400, Everette Clemmer wrote:
>>>> Hey all,
>>>>
>>>> I'm getting a segmentation fault when I attempt to receive a single
>>>> character via MPI_Irecv. Code follows:
>>>>
>>>> void recv_func() {
>>>>     if( !MASTER ) {
>>>>         char        buffer[ 1 ];
>>>>         int         flag;
>>>>         MPI_Request request;
>>>>         MPI_Status  status;
>>>>
>>>>         MPI_Irecv( &buffer, 1, MPI_CHAR, 0, MPI_ANY_TAG, MPI_COMM_WORLD, &request );
>>>
>>> It should be MPI_Irecv(buffer, 1, ...)
>>>
>>>> The segfault disappears if I comment out the MPI_Irecv call in
>>>> recv_func so I'm assuming that there's something wrong with the
>>>> parameters that I'm passing to it. Thoughts?
>>>
>>> --
>>> Ake Sandgren, HPC2N, Umea University, S-90187 Umea, Sweden
>>> Internet: ake_at_[hidden]   Phone: +46 90 7866134 Fax: +46 90 7866126
>>> Mobile: +46 70 7716134 WWW: http://www.hpc2n.umu.se
>>>
>>> _______________________________________________
>>> users mailing list
>>> users_at_[hidden]
>>> http://www.open-mpi.org/mailman/listinfo.cgi/users
>>>
>>
>>
>>
>> --
>> - Everette
>>
>

-- 
- Everette