
Open MPI User's Mailing List Archives


Subject: [OMPI users] btl_openib_cpc_include rdmacm questions
From: Brock Palen (brockp_at_[hidden])
Date: 2011-04-20 17:03:06

Another of our users hit the bug that causes collectives (this time MPI_Bcast()) to hang over IB, which was fixed by setting:

btl_openib_cpc_include rdmacm

My question: if we make this the default on our system via an environment variable, does it introduce any performance or other issues we should be aware of?
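For context, a minimal sketch of the usual ways to set an Open MPI MCA parameter like this one system-wide. The install prefix shown (/opt/openmpi) is illustrative, not from the original post; adjust for your site.

```shell
# 1) Per-environment default: MCA parameters map to OMPI_MCA_<name>
#    environment variables, which is the approach asked about above.
export OMPI_MCA_btl_openib_cpc_include=rdmacm

# 2) Per-invocation, on the mpirun command line (commented out):
# mpirun --mca btl_openib_cpc_include rdmacm ./a.out

# 3) Site-wide default via the MCA parameter file (path is an
#    assumption based on a typical install prefix; commented out):
# echo "btl_openib_cpc_include = rdmacm" >> /opt/openmpi/etc/openmpi-mca-params.conf

# Verify the variable is visible to subsequently launched processes:
echo "OMPI_MCA_btl_openib_cpc_include=${OMPI_MCA_btl_openib_cpc_include}"
```

Note that an exported environment variable only affects jobs launched from shells that inherit it, whereas the mca-params.conf file applies to every user of that Open MPI installation.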

Is there a reason we should not use rdmacm?


Brock Palen
Center for Advanced Computing