
Open MPI User's Mailing List Archives


Subject: [OMPI users] MPI_Type_create_darray causes MPI_File_set_view to crash when ndims=2, array_of_gsizes[0]>array_of_gsizes[1]
From: Antonio Molins (amolins_at_[hidden])
Date: 2008-10-31 11:19:39


Hi again,

The problem in a nutshell: when I use
MPI_Type_create_darray with an argument array_of_gsizes where
array_of_gsizes[0] > array_of_gsizes[1], the datatype returned goes
through MPI_Type_commit() just fine, but then it makes
MPI_File_set_view crash! Any idea as to why this is happening?
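
In case a concrete picture helps, here is a minimal sketch of the kind of
call sequence I mean (the global sizes, the process grid, and the element
type below are illustrative only, not the values from my actual code):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, nprocs;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Global 2-D array that is taller than it is wide:
       array_of_gsizes[0] > array_of_gsizes[1]. */
    int gsizes[2]   = { 8, 4 };
    int distribs[2] = { MPI_DISTRIBUTE_BLOCK, MPI_DISTRIBUTE_BLOCK };
    int dargs[2]    = { MPI_DISTRIBUTE_DFLT_DARG, MPI_DISTRIBUTE_DFLT_DARG };
    int psizes[2]   = { nprocs, 1 };   /* 1-D process grid over the rows */

    MPI_Datatype filetype;
    MPI_Type_create_darray(nprocs, rank, 2, gsizes, distribs, dargs,
                           psizes, MPI_ORDER_C, MPI_DOUBLE, &filetype);
    MPI_Type_commit(&filetype);        /* completes without complaint */

    MPI_File fh;
    MPI_File_open(MPI_COMM_WORLD, "darray_test.dat",
                  MPI_MODE_CREATE | MPI_MODE_RDWR, MPI_INFO_NULL, &fh);

    /* This is the call that blows up for me (inside ROMIO's
       ADIOI_Flatten, per the stack trace below). */
    MPI_File_set_view(fh, 0, MPI_DOUBLE, filetype, "native", MPI_INFO_NULL);

    MPI_File_close(&fh);
    MPI_Type_free(&filetype);
    MPI_Finalize();
    return 0;
}
```

(Run with something like `mpirun -np 4 ./a.out`; with gsizes reversed,
{ 4, 8 }, the same sequence goes through cleanly for me.)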

A

>>
>>
>> *** glibc detected *** double free or corruption (!prev): 0x0000000000cf4130 ***
>> [login4:26709] *** Process received signal ***
>> [login4:26708] *** Process received signal ***
>> [login4:26708] Signal: Aborted (6)
>> [login4:26708] Signal code: (-6)
>> [login4:26709] Signal: Segmentation fault (11)
>> [login4:26709] Signal code: Address not mapped (1)
>> [login4:26709] Failing at address: 0x18
>> [login4:26708] [ 0] /lib64/tls/libpthread.so.0 [0x36ff10c5b0]
>> [login4:26708] [ 1] /lib64/tls/libc.so.6(gsignal+0x3d) [0x36fe62e26d]
>> [login4:26708] [ 2] /lib64/tls/libc.so.6(abort+0xfe) [0x36fe62fa6e]
>> [login4:26708] [ 3] /lib64/tls/libc.so.6 [0x36fe6635f1]
>> [login4:26708] [ 4] /lib64/tls/libc.so.6 [0x36fe6691fe]
>> [login4:26708] [ 5] /lib64/tls/libc.so.6(__libc_free+0x76) [0x36fe669596]
>> [login4:26708] [ 6] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0 [0x2a962cc4ae]
>> [login4:26708] [ 7] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(ompi_ddt_destroy+0x65) [0x2a962cd31d]
>> [login4:26708] [ 8] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(MPI_Type_free+0x5b) [0x2a962f654f]
>> [login4:26708] [ 9] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(ADIOI_Flatten+0x1804) [0x2aa4603612]
>> [login4:26708] [10] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(ADIOI_Flatten_datatype+0xe7) [0x2aa46017fd]
>> [login4:26708] [11] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(ADIO_Set_view+0x14f) [0x2aa45ecb57]
>> [login4:26708] [12] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(mca_io_romio_dist_MPI_File_set_view+0x1dd) [0x2aa46088a9]
>> [login4:26708] [13] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so [0x2aa45ec288]
>> [login4:26708] [14] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(MPI_File_set_view+0x53) [0x2a963002ff]
>> [login4:26708] [15] ./bin/test2(_ZN14pMatCollection3getEiP7pMatrix+0xc3) [0x42a50b]
>> [login4:26708] [16] ./bin/test2(main+0xc2e) [0x43014a]
>> [login4:26708] [17] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x36fe61c40b]
>> [login4:26708] [18] ./bin/test2(_ZNSt8ios_base4InitD1Ev+0x42) [0x41563a]
>> [login4:26708] *** End of error message ***
>> [login4:26709] [ 0] /lib64/tls/libpthread.so.0 [0x36ff10c5b0]
>> [login4:26709] [ 1] /lib64/tls/libc.so.6 [0x36fe66882b]
>> [login4:26709] [ 2] /lib64/tls/libc.so.6 [0x36fe668f8d]
>> [login4:26709] [ 3] /lib64/tls/libc.so.6(__libc_free+0x76) [0x36fe669596]
>> [login4:26709] [ 4] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0 [0x2a962cc4ae]
>> [login4:26709] [ 5] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(ompi_ddt_release_args+0x93) [0x2a962d5641]
>> [login4:26709] [ 6] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0 [0x2a962cc514]
>> [login4:26709] [ 7] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(ompi_ddt_release_args+0x93) [0x2a962d5641]
>> [login4:26709] [ 8] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0 [0x2a962cc514]
>> [login4:26709] [ 9] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(ompi_ddt_destroy+0x65) [0x2a962cd31d]
>> [login4:26709] [10] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(MPI_Type_free+0x5b) [0x2a962f654f]
>> [login4:26709] [11] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(ADIOI_Flatten+0x147) [0x2aa4601f55]
>> [login4:26709] [12] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(ADIOI_Flatten+0x1569) [0x2aa4603377]
>> [login4:26709] [13] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(ADIOI_Flatten_datatype+0xe7) [0x2aa46017fd]
>> [login4:26709] [14] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(ADIO_Set_view+0x14f) [0x2aa45ecb57]
>> [login4:26709] [15] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so(mca_io_romio_dist_MPI_File_set_view+0x1dd) [0x2aa46088a9]
>> [login4:26709] [16] /opt/apps/intel10_1/openmpi/1.3/lib/openmpi/mca_io_romio.so [0x2aa45ec288]
>> [login4:26709] [17] /opt/apps/intel10_1/openmpi/1.3/lib/libmpi.so.0(MPI_File_set_view+0x53) [0x2a963002ff]
>> [login4:26709] [18] ./bin/test2(_ZN14pMatCollection3getEiP7pMatrix+0xc3) [0x42a50b]
>> [login4:26709] [19] ./bin/test2(main+0xc2e) [0x43014a]
>> [login4:26709] [20] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x36fe61c40b]
>> [login4:26709] [21] ./bin/test2(_ZNSt8ios_base4InitD1Ev+0x42) [0x41563a]
>> [login4:26709] *** End of error message ***
>> --------------------------------------------------------------------------
>> mpirun noticed that process rank 2 with PID 26708 on node login4.ranger.tacc.utexas.edu exited on signal 6 (Aborted).
>> --------------------------------------------------------------------------
>>
>>
>>
>>
>> _______________________________________________
>> users mailing list
>> users_at_[hidden]
>> http://www.open-mpi.org/mailman/listinfo.cgi/users
>
> --------------------------------------------------------------------------------
> Antonio Molins, PhD Candidate
> Medical Engineering and Medical Physics
> Harvard - MIT Division of Health Sciences and Technology
> --
> "And so, from so little sleep and so much reading,
> his brain dried up, so that he came
> to lose his wits."
> Miguel de Cervantes
> --------------------------------------------------------------------------------
>
>
>

--------------------------------------------------------------------------------
Antonio Molins, PhD Candidate
Medical Engineering and Medical Physics
Harvard - MIT Division of Health Sciences and Technology

--
"When a traveler reaches a fork in the road,
the ℓ1-norm tells him to take either one way or the other,
but the ℓ2-norm instructs him to head off into the bushes."
			John F. Claerbout and Francis Muir, 1973
--------------------------------------------------------------------------------