Just for the record, I am using:
Open MPI 1.4.2 (released 2 days ago), gcc 4.4.3 (g++, gfortran).
All on Fedora 12, kernel 126.96.36.199-99.fc12.x86_64 #1 SMP.
The machine is a white box with two-way
quad-core Intel Xeon (Nehalem) E5540 @ 2.53GHz, 48GB RAM.
Hyperthreading is currently turned on.
But please, don't spend more time on this.
You already gave a lot of help.
I guess this would be fixed if I could reinstall the OS
with a more stable Linux distribution than Fedora.
You and Jeff reported that your
Nehalems get along with Open MPI.
I would guess other people have functional Open MPI + Nehalem systems.
All I can think of is that some mess with the OS/gcc is causing
the trouble here.
(Yes, to avoid trouble I always compile Open MPI
and our applications with the same compiler set,
and I keep several Open MPI builds to match our needs.)
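For anyone following along, here is a minimal sketch of that practice: configuring an Open MPI build against one specific compiler set, with the install prefix naming the compiler so several builds can coexist. The prefix path and version numbers are just examples for this setup, not a site standard.

```shell
# Build Open MPI 1.4.2 with the GCC 4.4.3 toolchain (gcc/g++/gfortran),
# the same compilers used for the application.
# The --prefix below is a hypothetical layout: one install tree per
# OpenMPI-version/compiler pair, so builds never get mixed up.
./configure CC=gcc CXX=g++ FC=gfortran F77=gfortran \
    --prefix=/opt/sw/openmpi/1.4.2-gcc-4.4.3
make -j8
make install

# Later, check which compilers a given build actually used:
/opt/sw/openmpi/1.4.2-gcc-4.4.3/bin/ompi_info | grep -i compiler

# And confirm which wrapper/compiler a job will pick up via PATH:
which mpicc && mpicc --showme
```

Keeping one prefix per compiler set makes it easy to switch builds by just pointing PATH and LD_LIBRARY_PATH at the matching tree.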
Douglas Guptill wrote:
> Hello Gus:
> On Thu, May 06, 2010 at 11:26:57AM -0400, Gus Correa wrote:
>> Would you know which gcc you used to build your Open MPI?
>> Or did you use Intel icc instead?
> Intel ifort and icc. I build OpenMPI with the same compiler, and same
> options, that I build my application with.
> I have been tempted to try and duplicate your problem. Would that be a
> helpful experiment? gcc, OpenMPI 1.4.1, IIRC ?