Open MPI User's Mailing List Archives

Subject: [OMPI users] Using openmpi within python and crashes
From: John R. Cary (cary_at_[hidden])
Date: 2009-07-09 13:53:17

Our scenario is that we are running Python and then importing a module
written in Fortran.
We run via:

 mpiexec -n 8 -x PYTHONPATH -x SIDL_DLL_PATH python

where the script calls into Fortran to call MPI_Init.

On 8 procs (but not on 1) we get hangs in the code (on some machines but
not others!).
It is hard to tell precisely where, because the hang is inside a PETSc method.

Running with valgrind

 mpiexec -n 8 -x PYTHONPATH -x SIDL_DLL_PATH valgrind python

gives a crash, with some salient output:

==936== Syscall param sched_setaffinity(mask) points to unaddressable
==936== at 0x39336DAA79: syscall (in /lib64/
==936== by 0x10BCBD58: opal_paffinity_linux_plpa_api_probe_init (in
==936== by 0x10BCE054: opal_paffinity_linux_plpa_init (in
==936== by 0x10BCC9F9:
opal_paffinity_linux_plpa_have_topology_information (in
==936== by 0x10BCBBFF: linux_module_init (in
==936== by 0x10BC99C3: opal_paffinity_base_select (in
==936== by 0x10B9DB83: opal_init (in
==936== by 0x10920C6C: orte_init (in
==936== by 0x10579D06: ompi_mpi_init (in
==936== by 0x10599175: PMPI_Init (in
==936== by 0x10E2BDF4: mpi_init (in
==936== by 0xDF30A1F: uedge_mpiinit_ (in
==936== Address 0x0 is not stack'd, malloc'd or (recently) free'd

This makes me think that our call to mpi_init is wrong. At

it says:

  Because the Fortran and C versions of MPI_Init are different, there is
  a restriction on who can call MPI_Init. The version (Fortran or C)
  must match the main program. That is, if the main program is in C,
  then the C version of MPI_Init must be called. If the main program is
  in Fortran, the Fortran version must be used.

Should I infer from this that, since Python is a C program, one must call
the C version of MPI_Init (with argc and argv)?

Or, since the module is written mostly in Fortran with MPI calls of only
the Fortran variety, can I initialize
with the Fortran MPI_Init?
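
For concreteness, here is a minimal sketch of what I understand the first
alternative would look like (this is not our code; the file layout and the
idea of passing NULL are my own assumptions, though I believe MPI-2 allows
NULL arguments when the command line is unavailable):

  /* hypothetical sketch, not our code: initializing via the C binding
     when the main program is C (as the Python interpreter is) */
  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
      /* the C binding takes pointers to argc/argv; MPI-2 implementations
         should also accept MPI_Init(NULL, NULL) */
      MPI_Init(&argc, &argv);

      int rank;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      printf("rank %d initialized via the C MPI_Init\n", rank);

      MPI_Finalize();
      return 0;
  }

What we do now is, in effect, "call MPI_Init(ierr)" from inside the Fortran
module, i.e. the Fortran binding that takes only an error argument.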

Thanks.....John Cary