
Subject: [OMPI users] 4 PCI-Express Gigabit Ethernet NICs
From: Allan Menezes (amenezes007_at_[hidden])
Date: 2009-05-08 21:01:03

Dear Anybody,
  Does Open MPI 1.3.2 on Fedora Core 10 x86_64 work stably with four
gigabit PCI-Express Ethernet cards per node?
I tried it on six nodes, each with an Asus P5Q-VM motherboard, an Intel
quad-core CPU, 8 GB of RAM, and the following four NICs:
eth0 - Intel PRO/1000 PT PCI-Express gigabit card
eth1 - TP-LINK TG-3468 (Realtek R8111B chipset) PCI-Express gigabit card
eth2 - onboard Realtek 8111C chipset PCI-Express gigabit Ethernet
eth3 - TP-LINK TG-3468 (Realtek R8111B chipset) PCI-Express gigabit card
All four interfaces use an MTU of 3000 and the latest Intel and Realtek
drivers from the vendors' websites, on a hand-configured, hand-compiled
kernel.
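For reference, the jumbo-frame MTU is set per interface in the usual
way (repeated for eth0 through eth3; the iproute2 command below is just
an illustration):

    ip link set dev eth0 mtu 3000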
I tested the cluster with HPL 2.0 built against GotoBLAS. Using any
three of the cards (eth0, eth1, eth3 or eth0, eth2, eth3) I get
approximately 220 GFlops stably, but with all four (eth0, eth1, eth2,
eth3) I get only 203 GFlops and the HPL run fails after about the
third test.
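For completeness, the interfaces are selected at run time through the
TCP BTL's btl_tcp_if_include parameter; an example launch line of the
kind used here (the process count and binary path are illustrative):

    mpirun -np 24 --mca btl tcp,sm,self \
           --mca btl_tcp_if_include eth0,eth1,eth2,eth3 ./xhpl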
Any help would be very much appreciated, as I would like to use four
Ethernet cards per node.
Note: the measured performance of each card individually is
approximately 922 Mbit/s with jumbo frames of 3000, using NetPIPE's
NPtcp. With all four cards between two nodes, NPmpi compiled against
Open MPI measures approximately 3400 Mbit/s, which is good: it scales
nearly linearly, as 4 x ~900 Mbit/s.
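These were the usual NetPIPE measurements (hostnames illustrative):
NPtcp run plain as the receiver on one node and as 'NPtcp -h node1' on
the other, and for the MPI path

    mpirun -np 2 -host node1,node2 ./NPmpi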
Thank you,
Allan Menezes