
Open MPI User's Mailing List Archives


From: Daryl W. Grunau (dwg_at_[hidden])
Date: 2005-11-18 16:35:50


> Date: Fri, 18 Nov 2005 10:34:29 -0700
> From: "Daryl W. Grunau" <dwg_at_[hidden]>
> To: Brian Barrett <brbarret_at_[hidden]>
> Cc: users_at_[hidden]
> Subject: Re: [O-MPI users] OMPI 1.0 rc6 --with-bproc errors
>
> > Date: Thu, 17 Nov 2005 09:20:10 -0800
> > From: Brian Barrett <brbarret_at_[hidden]>
> > Subject: Re: [O-MPI users] OMPI 1.0 rc6 --with-bproc errors
> > To: Open MPI Users <users_at_[hidden]>
> > Message-ID: <6E3F2F6A-FB69-4879-A2B1-E286B5DB7738_at_[hidden]>
> > Content-Type: text/plain; charset=US-ASCII; delsp=yes; format=flowed
> >
> > Daryl -
> >
> > I'm unable to replicate your problem. I was testing on a Fedora Core
> > 3 system with Clustermatic 5. Is it possible that you have a random
> > dso from a previous build in your installation path? How are you
> > running mpirun -- maybe I'm just not hitting the same code path you
> > are...
>
> Brian, thanks for trying to replicate. I'm not building any DSOs for
> OMPI, only static libs, and then recompiling my app. I'm running as
> follows:
>
> mpirun -H 200,201,202,203 -np 4 ./a.out
>
> The last successful build I had was rc4, which runs the above test fine.
> I'll try building/installing the just-announced 1.0 and let you know.

Looks like this problem got fixed in 1.0!!! Thanks,

Daryl