There won't be an official SRPM until 1.3.1 is released.
But to test whether 1.3.1 is on track to deliver a proper solution for you,
can you try a nightly tarball, perhaps in conjunction with our
buildrpm.sh script?
It should build a trivial SRPM for you from the tarball. You'll
likely need to get the specfile, too, and put it in the same dir as
buildrpm.sh. The specfile is in the same SVN directory.
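For reference, the steps described above might look roughly like this. The tarball and specfile names are hypothetical placeholders; use the actual nightly tarball and the specfile from the SVN directory mentioned above.

```shell
# Sketch: building a trivial SRPM from a nightly tarball with buildrpm.sh.
# TARBALL and SPECFILE are placeholder names, not real release artifacts.
TARBALL=openmpi-1.3.1a1rXXXXX.tar.bz2
SPECFILE=openmpi.spec

# buildrpm.sh expects the specfile in the same directory, so copy it in
# first, then point the script at the tarball. (Commands are echoed here
# rather than run, since the files above are placeholders.)
echo "cp path/to/$SPECFILE ."
echo "./buildrpm.sh $TARBALL"
```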
On Feb 20, 2009, at 3:51 PM, Jim Kusznir wrote:
> As long as I can still build the rpm for it and install it via rpm.
> I'm running it on a ROCKS cluster, so it needs to be an RPM to get
> pushed out to the compute nodes.
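Once buildrpm.sh has produced binary RPMs, installing one on a node might look like the sketch below. The package file name is a placeholder, and the ROCKS-specific mechanism for pushing the package out to compute nodes is not shown.

```shell
# Sketch: installing a rebuilt Open MPI RPM on a single node.
# RPMFILE is a hypothetical file name; substitute the RPM buildrpm.sh emits.
RPMFILE=openmpi-1.3.1-1.x86_64.rpm

# -U upgrades over an existing install (e.g. 1.2.8); -v/-h add verbose
# output and progress hashes. Echoed rather than run, since the file
# above is a placeholder.
echo "rpm -Uvh $RPMFILE"
```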
> On Fri, Feb 20, 2009 at 11:30 AM, Jeff Squyres <jsquyres_at_[hidden]> wrote:
>> On Feb 20, 2009, at 2:20 PM, Jim Kusznir wrote:
>>> I just went to www.open-mpi.org, went to download, then source rpm.
>>> Looks like it was actually 1.3-1. Here's the src.rpm that I pulled
>> Ah, gotcha. Yes, that's 1.3.0, SRPM version 1. We didn't pick the
>> clearest nomenclature. :-(
>>> The reason for this upgrade is that a user seems to have found a bug
>>> in the OpenMPI code that occasionally results in an MPI_Send()
>>> message getting lost. He's managed to reproduce it multiple times,
>>> and we can't find anything in his code that can cause it...He's got
>>> logs of MPI_Send() going out, but the matching MPI_Recv() never
>>> getting anything, thus killing his code. We're currently running
>>> 1.2.8 with ofed support (Haven't tried turning off ofed, etc. yet).
>> Ok. 1.3.x is much mo' betta' than 1.2 in many ways. We could
>> probably help track down the problem, but if you're willing to
>> upgrade to 1.3.x, that will hopefully just make the problem go away.
>> Can you try a 1.3.1 nightly tarball?
>> Jeff Squyres
>> Cisco Systems
>> users mailing list