There is a bug in that tarball that was fixed as of yesterday. However, the patch you need was the cause of that bug, so the fix for your problem is no longer in the 1.4 branch.
As you probably recall, I had cautioned that the fix might not make it into the 1.4 series. At the time, I was concerned about timing. It turned out that the timing was fine, but that the complete fix requires more changes to the 1.4 series than the OMPI community was comfortable making.
So the fix for your original problem will be in the 1.5 release, hopefully coming in the not-too-distant future. In the interim, you should be okay using a tarball from the developer's trunk - it appears to be fairly stable at the moment. I would suggest grabbing a tarball from it and staying with it until the 1.5 release.
On Jan 6, 2010, at 9:56 AM, Marcia Cristina Cera wrote:
> I am using the OpenMPI v1.4a1r22335 to run an MPI application that dynamically
> creates processes.
> The application's behavior is as explained in a previous e-mail.
> The application is launched by a command line such as:
> $ mpirun -hostfile myhosts -np 1 myapp
> myhosts describes 2 nodes:
> node1 slots=8
> node2 slots=8
> My application runs as expected, creating dynamic processes on the two nodes.
> After computing, all processes (static and dynamic) finalize as well -- confirmed
> by checking the 'top' and 'ps' commands.
> But mpirun keeps running, and the application never releases the shell.
> I tried mpiexec, but it also hangs.
> If I run locally (without -hostfile), mpirun does not hang!
> Could someone help me?
> For a while now, I have used a script that kills mpirun so I can run many consecutive mpirun calls,
> but it is not a "beautiful" solution :)
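As an aside, the kill-script workaround described above can also be expressed as a timeout wrapper, so a hung mpirun cannot block the next run. This is a minimal sketch, not part of the original thread: it assumes the GNU coreutils `timeout` utility is available, and the `run_with_timeout` helper name and the 300-second limit in the comment are purely illustrative.

```shell
# Sketch of a workaround: run a command under a hard time limit so that a
# hung mpirun is killed automatically instead of blocking the shell.
# Requires the GNU coreutils "timeout" utility.
run_with_timeout() {
    limit="$1"; shift
    timeout "$limit" "$@"
    status=$?
    # timeout exits with 124 when it had to kill the command.
    if [ "$status" -eq 124 ]; then
        echo "killed after ${limit}s: $*" >&2
    fi
    return "$status"
}

# Hypothetical usage with the command line from the thread:
#   run_with_timeout 300 mpirun -hostfile myhosts -np 1 myapp
run_with_timeout 2 sleep 1 && echo "finished in time"
```

The wrapper preserves the wrapped command's exit status on success, so scripts that chain consecutive mpirun calls can still check for real failures.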