Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Initializing OMPI with invoking the array constructor on Fortran derived types causes the executable to crash
From: Gus Correa (gus_at_[hidden])
Date: 2013-01-11 13:19:26


Hi Stefan

Don't you need to allocate xx, yy and conc before you use them?
In the short program below, they are declared as allocatable,
but not actually allocated.
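
Something like this, perhaps (just an untested sketch; the sizes are
arbitrary placeholders):

   ALLOCATE( xx%a(2), yy%a(2) )
   ALLOCATE( conc(2) )
   conc = [ xx, yy ]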

I hope this helps,
Gus Correa

On 01/11/2013 09:58 AM, Stefan Mauerberger wrote:
> Dear Paul!
>
> Thanks for your reply. This problem seems to get complicated.
>
> Unfortunately, I cannot reproduce what you are describing. I tried with
> several GCC versions: 4.7.1, 4.7.2 and 4.8.0 (20121008). As you suggested,
> I replaced the MPI_Init and MPI_Finalize calls with WRITE(*,*) "foooo"
> and commented out use mpi, and everything is just fine: no segfault, no core
> dump, just the result I expect (I put a write(*,*) size(conc) in,
> which must print 2). I simply compiled with a bare mpif90 ... and
> executed by typing mpirun -np 1 ./a.out .
> I also tried on three different architectures - all 64-bit - and, as
> soon as MPI_Init is invoked, the program dumps core.
>
> I also tried with IBM's MPI implementation, the only difference being
> the use of include 'mpif.h' instead of use mpi. Everything is fine and the
> result is the same as in serial runs.
>
> Well, it's not surprising that 4.4.x has its problems. For modern
> Fortran such as F2003, GCC 4.7.x or newer is simply mandatory.
>
> Cheers,
> Stefan
>
>
>
> On Fri, 2013-01-11 at 14:26 +0100, Paul Kapinos wrote:
>> This is hardly an Open MPI issue:
>>
>> Swap the calls to MPI_Init and MPI_Finalize for
>> WRITE(*,*) "foooo"
>> comment out 'USE mpi' .... and see your error (SIGSEGV) again, now without any
>> MPI part in the program.
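>> That is, roughly (an untested sketch of what the MPI-free variant would
>> look like, keeping your derived type and the array constructor):
>>
>> PROGRAM main
>>   IMPLICIT NONE
>>
>>   TYPE :: test_typ
>>     REAL, ALLOCATABLE :: a(:)
>>   END TYPE
>>
>>   TYPE(test_typ) :: xx, yy
>>   TYPE(test_typ), ALLOCATABLE :: conc(:)
>>
>>   WRITE(*,*) "foooo"
>>   conc = [ xx, yy ]
>>   WRITE(*,*) "foooo"
>> END PROGRAM main
>>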
>> So my suspicion is that this is a bug in your GCC version, especially because
>> there is no SIGSEGV using GCC 4.7.2 (whereas it crashes using 4.4.6).
>>
>> ==> Update your compilers!
>>
>>
>> On 01/11/13 14:01, Stefan Mauerberger wrote:
>>> Hi There!
>>>
>>> First of all, this is my first post here. In case I am doing something
>>> inappropriate, please be gentle with me. On top of that, I am not quite sure
>>> whether this issue is related to Open MPI or GCC.
>>>
>>> Regarding my problem: well, it is a little bulky, see below. I was able to
>>> figure out that the actual crash is caused by invoking Fortran's array
>>> constructor [ xx, yy ] on the derived-type variables xx and yy. The one key
>>> factor is that those types have allocatable member variables.
>>> That fact points to gfortran being to blame. However, the crash
>>> does not occur if MPI_Init is not called beforehand. Compiled as a
>>> serial program, everything works perfectly fine. I am pretty sure the
>>> lines I wrote are valid F2003 code.
>>>
>>> Here is a minimal working example:
>>> PROGRAM main
>>>   USE mpi
>>>
>>>   IMPLICIT NONE
>>>
>>>   INTEGER :: ierr
>>>
>>>   TYPE :: test_typ
>>>     REAL, ALLOCATABLE :: a(:)
>>>   END TYPE
>>>
>>>   TYPE(test_typ) :: xx, yy
>>>   TYPE(test_typ), ALLOCATABLE :: conc(:)
>>>
>>>   CALL mpi_init( ierr )
>>>
>>>   conc = [ xx, yy ]
>>>
>>>   CALL mpi_finalize( ierr )
>>>
>>> END PROGRAM main
>>> Simply compiling with mpif90 ... and executing leads to:
>>>> *** glibc detected *** ./a.out: free(): invalid pointer: 0x00007fefd2a147f8 ***
>>>> ======= Backtrace: =========
>>>> /lib/x86_64-linux-gnu/libc.so.6(+0x7eb96)[0x7fefd26dab96]
>>>> ./a.out[0x400fdb]
>>>> ./a.out(main+0x34)[0x401132]
>>>> /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed)[0x7fefd267d76d]
>>>> ./a.out[0x400ad9]
>>> After commenting out 'CALL MPI_Init' and 'CALL MPI_Finalize', everything seems to be fine.
>>>
>>> What do you think: is this an OMPI or a GCC related bug?
>>>
>>> Cheers,
>>> Stefan