
Subject: [OMPI users] error in (Open MPI) 1.3.3r21324-ct8.2-b09b-r31
From: Lydia Heck (lydia.heck_at_[hidden])
Date: 2010-07-15 05:19:03


We are running Sun's build of Open MPI 1.3.3r21324-ct8.2-b09b-r31
(HPC8.2), and one code that runs perfectly fine under
HPC8.1 (Open MPI 1.3r19845-ct8.1-b06b-r21) and earlier fails with:

[oberon:08454] *** Process received signal ***
[oberon:08454] Signal: Segmentation Fault (11)
[oberon:08454] Signal code: Address not mapped (1)
[oberon:08454] Failing at address: 0
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libopen-pal.so.0.0.0:0x4b89e
/lib/amd64/libc.so.1:0xd0f36
/lib/amd64/libc.so.1:0xc5a72
0x0 [ Signal 11 (SEGV)]
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libmpi.so.0.0.0:MPI_Alloc_mem+0x7f
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libmpi.so.0.0.0:MPI_Sendrecv_replace+0x31e
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libmpi_f77.so.0.0.0:PMPI_SENDRECV_REPLACE+0x94
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:mpi_cyclic_transfer_+0xd9
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:cycle_particles_and_interpolate_+0x94b
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:interpolate_field_+0xc30
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:MAIN_+0xe68
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:main+0x3d
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:0x62ac
[oberon:08454] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 8454 on node oberon exited on
signal 11 (Segmentation Fault).

I have not tried to get and build a newer Open MPI, so I do not know
whether the problem persists in more recent versions.

If the developers are interested, I could ask the user to prepare the code
for you to have a look at the problem, which appears to be in MPI_Alloc_mem.
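For reference, a minimal sketch of the call pattern shown in the backtrace:
an MPI_SENDRECV_REPLACE in a cyclic exchange, which the trace shows going
through MPI_Alloc_mem internally. The count, datatype and tags below are
placeholders, not values from the user's actual code.

program sendrecv_replace_min
  ! Minimal sketch of the failing call pattern: MPI_SENDRECV_REPLACE in a
  ! ring exchange.  The backtrace shows Open MPI obtaining its temporary
  ! buffer for this call via MPI_Alloc_mem, which is where it crashes.
  ! Count, datatype and tags here are placeholders, not the user's values.
  implicit none
  include 'mpif.h'
  integer, parameter :: n = 1024
  double precision   :: buf(n)
  integer :: ierr, rank, nprocs, dest, src
  integer :: status(MPI_STATUS_SIZE)

  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
  call MPI_COMM_SIZE(MPI_COMM_WORLD, nprocs, ierr)

  ! pass the buffer to the next rank, receive from the previous one
  dest = mod(rank + 1, nprocs)
  src  = mod(rank - 1 + nprocs, nprocs)
  buf  = dble(rank)

  call MPI_SENDRECV_REPLACE(buf, n, MPI_DOUBLE_PRECISION, dest, 0, &
                            src, 0, MPI_COMM_WORLD, status, ierr)

  call MPI_FINALIZE(ierr)
end program sendrecv_replace_min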

Best wishes,
Lydia Heck

------------------------------------------
Dr E L Heck

University of Durham
Institute for Computational Cosmology
Ogden Centre
Department of Physics
South Road

DURHAM, DH1 3LE
United Kingdom

e-mail: lydia.heck_at_[hidden]

Tel.: + 44 191 - 334 3628
Fax.: + 44 191 - 334 3645
___________________________________________