
Open MPI User's Mailing List Archives


From: Brian Barrett (brbarret_at_[hidden])
Date: 2006-02-27 20:14:34

On Feb 27, 2006, at 8:50 AM, Pierre Valiron wrote:

> - Make completed nicely, except for compiling ompi/mpi/f90/mpi.f90
> which took nearly half an hour to complete. I suspect the
> optimization flags in FFLAGS are not important for applications,
> and I could use -O0 or -O1 instead.

You probably won't see any performance impact at all if you compile
the Fortran 90 layer of Open MPI with no optimizations. It's a very
thin wrapper and the compiler isn't going to be able to do much with
it anyway. One other thing - if you know your F90 code never sends
arrays greater than dimension X (X defaults to 4), you can speed
things up immensely by configuring Open MPI with the option --with-f90-
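The option name is cut off in the message above; in Open MPI releases of this vintage the flag was, if I recall correctly, --with-f90-max-array-dim. Treat the exact spelling as an assumption and confirm it with "./configure --help". A sketch of a configure line combining this with the reduced Fortran optimization discussed above:

```
# Flag name below is an assumption (the message above is truncated);
# verify the exact spelling with ./configure --help.
./configure --prefix=$HOME/opt/openmpi-1.0.2 \
            FFLAGS=-O1 \
            --with-f90-max-array-dim=4
```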

> - However the resulting executable fails to launch:
> valiron_at_icare ~/config > mpirun --prefix /users/valiron/lib/openmpi-1.0.2a9 -np 2 a.out
> Segmentation fault (core dumped)
> - The problem seems to be buried inside Open MPI:
> valiron_at_icare ~/config > pstack core
> core 'core' of 27996: mpirun --prefix /users/valiron/lib/openmpi-1.0.2a9 -np 2 a.out
> fffffd7fff05dfe0 strlen () + 20
> fffffd7fff0b6ab3 vsprintf () + 33
> fffffd7fff2e4211 opal_vasprintf () + 41
> fffffd7fff2e41c8 opal_asprintf () + 98
> 00000000004098a3 orterun () + 63
> 0000000000407214 main () + 34
> 000000000040708c ???????? ()

Ugh... Yes, we're probably doing something wrong there.
Unfortunately, neither Jeff nor I have access to an Opteron box
running Solaris, and I can't replicate the problem on either an
UltraSparc running Solaris or an Opteron running Linux. Could you
compile Open MPI with CFLAGS set to "-g -O -xtarget=opteron
-xarch=amd64"? Hopefully being able to see the call stack with some
line numbers will help a bit.


   Brian Barrett
   Open MPI developer