Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] lammps MD code fails with Open MPI 1.3
From: Jeff Pummill (jpummil_at_[hidden])
Date: 2009-02-20 10:08:34


It's probably not the same issue, as this is one of the very few codes
I maintain that is C++ rather than Fortran :-(

It behaved similarly on another system when I built it against a new
version (1.0??) of MVAPICH. I had to roll back a version there as well.

I may also contact the LAMMPS people and see if they know what's going on.

Jeff F. Pummill
Senior Linux Cluster Administrator
TeraGrid Campus Champion - UofA
University of Arkansas
Fayetteville, Arkansas 72701
(479) 575 - 4590
http://hpc.uark.edu

"In theory, there is no difference between theory and
practice. But in practice, there is!" /-- anonymous/

Jeff Squyres wrote:
> Actually, there was a big Fortran bug that crept in after 1.3 that was
> just fixed on the trunk last night. If you're using Fortran
> applications with some compilers (e.g., Intel), the 1.3.1 nightly
> snapshots may have hung in some cases. The problem should be fixed in
> tonight's 1.3.1 nightly snapshot.
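>
> A quick sanity check when trying a snapshot, to make sure the test run
> is really picking up the new build (ompi_info ships with every Open MPI
> install):
>
>    which mpirun mpif90    # the snapshot's wrappers should be first in PATH
>    ompi_info | head -n 2  # the first lines of output include the exact version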
>
>
> On Feb 20, 2009, at 12:46 AM, Nysal Jan wrote:
>
>> It could be the same bug reported here
>> http://www.open-mpi.org/community/lists/users/2009/02/8010.php
>>
>> Can you try a recent snapshot of 1.3.1
>> (http://www.open-mpi.org/nightly/v1.3/) to verify whether this has
>> been fixed?
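>>
>> A minimal sketch of what that test could look like (the tarball name
>> below is a placeholder; pick the latest snapshot listed at the URL
>> above):
>>
>>    wget http://www.open-mpi.org/nightly/v1.3/openmpi-1.3.1a1rNNNNN.tar.gz
>>    tar xzf openmpi-1.3.1a1rNNNNN.tar.gz
>>    cd openmpi-1.3.1a1rNNNNN
>>    ./configure --prefix=$HOME/ompi-1.3.1-snapshot
>>    make all install
>>    # then rebuild lammps with $HOME/ompi-1.3.1-snapshot/bin/mpicxx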
>>
>> --Nysal
>>
>> On Thu, 2009-02-19 at 16:09 -0600, Jeff Pummill wrote:
>>> I built a fresh version of LAMMPS v29Jan09 against Open MPI 1.3, which
>>> in turn was built with GNU compilers v4.2.4 on an Ubuntu 8.04 x86_64
>>> box. This Open MPI build was able to generate usable binaries such as
>>> XHPL and NPB, but the LAMMPS binary it generated was not usable.
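>>>
>>> For reference, a bare-bones sketch of that kind of build (the flags
>>> shown here are illustrative, not the exact configure line used):
>>>
>>>    ./configure CC=gcc CXX=g++ FC=gfortran --prefix=/opt/openmpi-1.3
>>>    make all install
>>>    # LAMMPS: point a src/MAKE/Makefile.<machine> at mpicxx, then:
>>>    cd lammps/src && make <machine>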
>>>
>>> I tried it with a couple of different versions of the LAMMPS source,
>>> but to no avail. The builds produced no errors and a binary was
>>> created, but when the job is executed it exits almost immediately
>>> with no output other than:
>>>
>>> jpummil_at_stealth:~$ mpirun -np 4 -hostfile
>>> hosts /home/jpummil/lmp_Stealth-OMPI < in.testbench_small
>>> LAMMPS (22 Jan 2008)
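>>>
>>> To squeeze more detail out of a silent exit like that, one option is
>>> to turn up Open MPI's BTL verbosity and check the exit status (both
>>> are standard mpirun/MCA knobs; shown here only as a sketch):
>>>
>>>    mpirun -np 4 -hostfile hosts --mca btl_base_verbose 30 \
>>>        /home/jpummil/lmp_Stealth-OMPI < in.testbench_small
>>>    echo $?    # a non-zero status confirms the run died abnormally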
>>>
>>> Interestingly, I downloaded Open MPI 1.2.8, built it with the same
>>> configure options I had used with 1.3, and it worked.
>>>
>>> I'm getting by fine with 1.2.8. I just wanted to file a possible bug
>>> report on 1.3 and see if others have seen this behavior.
>>>
>>> Cheers!
>>>
>>> --
>>> Jeff F. Pummill
>>> Senior Linux Cluster Administrator
>>> TeraGrid Campus Champion - UofA
>>> University of Arkansas