
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] Using dual infiniband HCA cards
From: Pavel Shamis (Pasha) (pashash_at_[hidden])
Date: 2009-07-30 07:05:26


>
> We have a computational cluster consisting of 8 HP ProLiant ML370 G5
> servers with 32 GB of RAM each.
> Each node has a Mellanox single-port InfiniBand DDR HCA (20 Gbit/s)
> and is connected to the others through
> a Voltaire ISR9024D-M DDR InfiniBand switch.
>
> Now we want to increase the bandwidth to 40 Gbit/s by adding a second
> InfiniBand card to each node.
>
> I want to ask if this is possible, and if so, how?
>
>
You need to check whether it is possible to add a second InfiniBand
card to your motherboard. You also need to verify that your PCI
Express links and chipset will allow you to utilize the resources of
two HCAs.
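A quick way to check this with standard Linux/OFED tools (the PCI
address below is a placeholder you'd fill in from the lspci output) is
to list the HCAs and inspect the negotiated PCIe link width and speed:

    # list InfiniBand devices and their port state (from libibverbs)
    ibv_devinfo
    # find the HCA on the PCI bus
    lspci | grep -i -e infiniband -e mellanox
    # check LnkCap/LnkSta for the slot's link width and speed
    lspci -vv -s <bus:dev.fn> | grep -i -e lnkcap -e lnksta

A DDR HCA typically needs a PCIe x8 slot to run at full rate, so two
HCAs need two such slots with enough chipset bandwidth behind them.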
You may temporarily take two HCAs from some of your machines and add
them to another pair of machines. That will allow you to do some
benchmarking with two HCAs, for example as sketched below.
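A rough sketch of such a benchmark using the OSU osu_bw test (the
hostnames and the mlx4_0/mlx4_1 device names are placeholders; check
ibv_devinfo for the actual names on your nodes) is to measure with one
HCA, then with both, and compare:

    # bandwidth over one HCA only
    mpirun -np 2 --host node1,node2 \
        --mca btl openib,self --mca btl_openib_if_include mlx4_0 ./osu_bw
    # bandwidth over both HCAs
    mpirun -np 2 --host node1,node2 \
        --mca btl openib,self \
        --mca btl_openib_if_include mlx4_0,mlx4_1 ./osu_bw

If the two-HCA number is not close to twice the one-HCA number, the
PCIe/chipset path is likely the bottleneck.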

 From the driver and Open MPI perspective, a configuration with 2 (or
more) HCAs is supported by default.
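As a minimal check (not an exhaustive multi-rail guide), you can
confirm your build exposes the relevant openib BTL parameters:

    # list openib BTL parameters, including if_include/if_exclude
    ompi_info --param btl openib | grep if_

With no if_include/if_exclude restriction, Open MPI should open all
active ports it finds and stripe large messages across them.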

Pasha.