
Subject: [OMPI users] Fwd: an error when running MPI on 2 machines
From: Paul Gribelyuk (paul.quant_at_[hidden])
Date: 2013-02-09 14:02:43


> Hello,
> I am getting the following stack trace when running a simple hello-world MPI C++ program on 2 machines:
>
>
> mini:mpi_cw paul$ mpirun --prefix /usr/local/Cellar/open-mpi/1.6.3 --hostfile hosts_home -np 2 ./pi 1000000
> rank and name: 0 aka mini.local
> [home-mini:12175] *** Process received signal ***
> [home-mini:12175] Signal: Segmentation fault: 11 (11)
> [home-mini:12175] Signal code: Address not mapped (1)
> [home-mini:12175] Failing at address: 0x1042e0000
> [home-mini:12175] [ 0] 2 libsystem_c.dylib 0x00007fff94050cfa _sigtramp + 26
> [home-mini:12175] [ 1] 3 mca_btl_tcp.so 0x000000010397092c best_addr + 2620
> [home-mini:12175] [ 2] 4 pi 0x0000000103649d24 start + 52
> [home-mini:12175] [ 3] 5 ??? 0x0000000000000002 0x0 + 2
> [home-mini:12175] *** End of error message ***
> rank: 0 sum: 1.85459
> --------------------------------------------------------------------------
> mpirun noticed that process rank 1 with PID 12175 on node home-mini.local exited on signal 11 (Segmentation fault: 11).
> --------------------------------------------------------------------------
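For reference, the file passed via --hostfile is just a plain list of machine names, one per line, with an optional slot count. The actual contents of hosts_home aren't shown above, so the following is only a sketch, with the two hostnames guessed from the output:

    # hosts_home -- one machine per line; "slots" caps how many
    # ranks mpirun may place on that machine
    mini.local      slots=1
    home-mini.local slots=1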
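The source of ./pi isn't included either, so as a point of comparison, here is a minimal self-contained MPI program of the same shape (print each rank's name, then reduce a partial sum to rank 0). It is only a sketch, not the actual program; but if something this small also segfaults across the two machines, the problem is in the installation or TCP setup rather than in the pi code:

    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char** argv) {
        MPI_Init(&argc, &argv);

        int rank = 0, size = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        char name[MPI_MAX_PROCESSOR_NAME];
        int len = 0;
        MPI_Get_processor_name(name, &len);
        std::printf("rank and name: %d of %d aka %s\n", rank, size, name);

        // Stand-in for the pi partial sums: reduce one double per
        // rank to rank 0, which exercises the TCP BTL between hosts.
        double local = 1.0, sum = 0.0;
        MPI_Reduce(&local, &sum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0) std::printf("rank: %d sum: %g\n", rank, sum);

        MPI_Finalize();
        return 0;
    }

Build it with the wrapper compiler from the same 1.6.3 install on both machines, e.g. mpic++ test.cpp -o test, and run it with the same mpirun line as above.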
>
> I get a similar result even when I don't use --prefix, since the .bashrc file on the remote machine correctly sets PATH and LD_LIBRARY_PATH.
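For completeness, the relevant lines in the remote .bashrc would be along these lines; the prefix here is only assumed to match the --prefix value above, and since the backtrace shows these are OS X machines, note that the dynamic linker there consults DYLD_LIBRARY_PATH rather than LD_LIBRARY_PATH:

    # assumed layout, matching the --prefix used above
    MPI_HOME=/usr/local/Cellar/open-mpi/1.6.3
    export PATH="$MPI_HOME/bin:$PATH"
    export LD_LIBRARY_PATH="$MPI_HOME/lib:$LD_LIBRARY_PATH"
    # OS X's dyld reads DYLD_LIBRARY_PATH, not LD_LIBRARY_PATH
    export DYLD_LIBRARY_PATH="$MPI_HOME/lib:$DYLD_LIBRARY_PATH"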
>
> Any help with this seg fault is greatly appreciated. Thanks.
>
> -Paul