
Open MPI User's Mailing List Archives


Subject: [OMPI users] Using dual infiniband HCA cards
From: Sefa Arslan (sefa_at_[hidden])
Date: 2009-07-30 03:19:24


We have a computational cluster consisting of 8 HP ProLiant ML370 G5
servers with 32 GB of RAM each.
Each node has a Mellanox single-port InfiniBand DDR HCA (20 Gbit/s),
and the nodes are connected to each other through
a Voltaire ISR9024D-M DDR InfiniBand switch.

Now we want to increase the bandwidth to 40 Gbit/s by adding a second
InfiniBand card to each node.

I want to ask whether this is possible, and if so, how.

Do I have to set up an InfiniBand bonding configuration, or can Open MPI
already use the second card on its own and thereby double the bandwidth?

Has anyone deployed such a configuration?


Sefa Arslan