
Subject: Re: [OMPI users] simple test problem hangs on mpi_finalize and consumes all system resources
From: Jeff Squyres (jsquyres) (jsquyres_at_[hidden])
Date: 2014-01-24 13:04:12


On Jan 24, 2014, at 12:50 PM, "Fischer, Greg A." <fischega_at_[hidden]> wrote:

> Yep. That was the problem. It works beautifully now.

Great!

> Thanks for prodding me to take another look.

I'd be embarrassed to admit how many times I've made the same mistake. And I've been working on Open MPI for over 10 years. :-)

> With regards to openmpi-1.6.5, the system that I'm compiling and running on, SLES10, contains some pretty dated software (e.g. Linux 2.6.x, python 2.4, gcc 4.1.2). Is it possible there's simply an incompatibility lurking in there somewhere that would trip openmpi-1.6.5 but not openmpi-1.4.3?

Possibly, but I'd be a bit surprised if that were the case.

So are we back to where we started: hello_c.c hanging in MPI_Finalize under 1.6.x?
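(For reference, hello_c.c is essentially the canonical MPI hello world. A minimal sketch along the lines of the hello_c.c shipped in Open MPI's examples/ directory, reconstructed here rather than copied verbatim:

    #include <stdio.h>
    #include "mpi.h"

    int main(int argc, char *argv[])
    {
        int rank, size;

        /* Initialize the MPI runtime */
        MPI_Init(&argc, &argv);

        /* Query this process's rank and the total process count */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        printf("Hello, world, I am %d of %d\n", rank, size);

        /* The hang reported in this thread occurs here, during finalization */
        MPI_Finalize();
        return 0;
    }

Compiled with "mpicc hello_c.c -o hello_c" and launched with "mpirun -np 4 hello_c", the symptom described earlier in the thread is that every rank prints its greeting but the processes then hang in MPI_Finalize and never exit.)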

-- 
Jeff Squyres
jsquyres_at_[hidden]
For corporate legal information go to: http://www.cisco.com/web/about/doing_business/legal/cri/