If you look at the code in that test, it has a --openmpi option that you are supposed to set so that it runs properly with Open MPI. I'm not sure if that's the problem here or not.
Did this used to run?
Note also that the test has a hardcoded version of 2.0 in it. I'm not sure if that could also be part of the problem.
I have MTT failures complaining about MPI-2, but before I open a ticket, please have a look:
$/hpc/home/USERS/mtt/mtt-scratch/20090421220402_moo1_17859/installs/oma-nightly-1.3--gcc--1.3r404/install/bin/mpirun --host moo1,moo1,moo2,moo2,moo3,moo3,moo4,moo4 -np 8 --mca btl_openib_use_eager_rdma 1 --mca btl self,sm,openib /hpc/home/USERS/mtt/mtt-scratch/20090421220402_moo1_17859/installs/ogHK/tests/mpicxx/cxx-test-suite/src/mpi2c++_dynamics_test
MPI-2 C++ bindings MPI-2 dynamics test suite
Open MPI Version 2.0
*** There are delays built into some of the tests
*** Please let them complete
*** No test should take more than 10 seconds
Test suite running on 8 nodes
* MPI-2 Dynamics...
- Looking for "connect" program... PASS
- MPI::Get_version... FAIL
MPI2C++ test suite: NODE 0 - 2) ERROR in MPI::Get_version should be 2.1
MPI2C++ test suite: all ranks failed
MPI2C++ test suite: minor error
MPI2C++ test suite: attempting to finalize...
MPI2C++ test suite: terminated