
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] mpirun runs in serial even I set np to several processors
From: Djordje Romanic (djordje8_at_[hidden])
Date: 2014-04-15 13:59:48


Hi,

It is working now. It shows:
--------------------------------------------
starting wrf task 0 of 4
 starting wrf task 1 of 4
 starting wrf task 2 of 4
 starting wrf task 3 of 4
---------------------------------------------
Thank you so much!!! You helped me a lot! Finally :) And plus I know the
difference between OpenMP and Open MPI (well, to be honest not completely,
but more than I knew before). :D

Thanks,

Djordje

On Tue, Apr 15, 2014 at 11:57 AM, Gus Correa <gus_at_[hidden]> wrote:

> Hi Djordje
>
> "locate mpirun" shows items labeled "intel", "mpich", and "openmpi", maybe
> more.
> Is it Ubuntu or Debian?
>
> Anyway, if you got this mess from somebody else,
> instead of sorting it out,
> it may save you time and headaches to install Open MPI from
> source.
> Since it is a single machine, there are no worries about
> having a homogeneous installation for several computers (which
> could be done if needed, though).
>
> 0. Make sure you have gcc, g++, and gfortran installed,
> including any "devel" packages that may exist.
> [apt-get or yum should tell you]
> If something is missing, install it.
>
> 1. Download the Open MPI (a.k.a. OMPI) tarball to a work directory
> of your choice,
> say /home/djordje/inst/openmpi/1.8 (create the directory if needed),
> and untar the tarball (tar -jxvf ...)
>
> http://www.open-mpi.org/software/ompi/v1.8/
>
> 2. Configure it to be installed in yet another directory under
> your home, say /home/djordje/sw/openmpi/1.8 (with --prefix).
>
> cd /home/djordje/inst/openmpi/1.8
>
> ./configure --prefix=/home/djordje/sw/openmpi/1.8 \
>   CC=gcc CXX=g++ FC=gfortran
>
> [Not sure if with 1.8 there is a separate F77 interface; if there is,
> add F77=gfortran to the configure command line above.
> Also, I am using OMPI 1.6.5,
> but my recollection is that Jeff would phase out mpif90 and mpif77 in
> favor of a single mpifortran of sorts. Please check the OMPI README file.]
>
> Then do
>
> make
> make install
>
> 3. Setup your environment variables PATH and LD_LIBRARY_PATH
> to point to *this* Open MPI installation ahead of anything else.
> This is easily done in your .bashrc or .tcshrc/.cshrc file,
> depending on which shell you use.
>
> .bashrc :
> export PATH=/home/djordje/sw/openmpi/1.8/bin:$PATH
> export LD_LIBRARY_PATH=/home/djordje/sw/openmpi/1.8/lib:$LD_LIBRARY_PATH
>
> .tcshrc/.cshrc:
>
> setenv PATH /home/djordje/sw/openmpi/1.8/bin:$PATH
> setenv LD_LIBRARY_PATH /home/djordje/sw/openmpi/1.8/lib:$LD_LIBRARY_PATH
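The order matters in step 3: *prepending* puts the new install ahead of /usr/bin, so the shell resolves mpirun to it first. A minimal sketch of that mechanism, using a hypothetical /tmp/myompi prefix as a stand-in for the real install directory above (the fake mpirun just identifies itself):

```shell
# Stand-in install prefix (hypothetical; mimics /home/djordje/sw/openmpi/1.8)
mkdir -p /tmp/myompi/bin
# Fake "mpirun" that only prints a marker so we can see which copy ran
printf '#!/bin/sh\necho "my mpirun"\n' > /tmp/myompi/bin/mpirun
chmod +x /tmp/myompi/bin/mpirun

# Prepend: this bin directory is now searched before /usr/bin
export PATH=/tmp/myompi/bin:$PATH
command -v mpirun   # -> /tmp/myompi/bin/mpirun
```

Appending instead (PATH=$PATH:/tmp/myompi/bin) would leave any existing /usr/bin/mpirun winning, which is exactly the mixup being avoided here.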
>
> 4. Logout, login again (or open a new terminal), and check if you
> get the right mpirun, etc:
>
> which mpicc
> which mpif90
> which mpirun
>
> They should point to items in /home/djordje/sw/openmpi/1.8/bin
>
> 5. Rebuild WRF from scratch.
>
> 6. Check if WRF got the libraries right:
>
> ldd wrf.exe
>
> This should show mpi libraries in /home/djordje/sw/openmpi/1.8/lib
>
> 7. Run WRF
> mpirun -np 4 wrf.exe
>
>
> I hope this helps,
> Gus Correa
>
>
>
>
> On 04/14/2014 08:21 PM, Djordje Romanic wrote:
>
>> Hi,
>>
>> Thanks for this guys. I think I might have two MPI implementations
>> installed because 'locate mpirun' gives (see bold lines) :
>> -----------------------------------------
>> /etc/alternatives/mpirun
>> /etc/alternatives/mpirun.1.gz
>> */home/djordje/Build_WRF/LIBRARIES/mpich/bin/mpirun*
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/intel/4.1.1.036/linux-x86_64/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/intel/4.1.1.036/linux-x86_64/bin64/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/intel/4.1.1.036/linux-x86_64/ia32/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/intel/4.1.1.036/linux-x86_64/intel64/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/openmpi/1.4.3/linux-x86_64-2.3.4/gnu4.5/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/openmpi/1.4.3/linux-x86_64-2.3.4/gnu4.5/share/man/man1/mpirun.1
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/openmpi/1.6.4/linux-x86_64-2.3.4/gnu4.6/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/openmpi/1.6.4/linux-x86_64-2.3.4/gnu4.6/share/man/man1/mpirun.1
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/bin/mpirun.mpich
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/bin/mpirun.mpich2
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/ia32/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/ia32/bin/mpirun.mpich
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/ia32/bin/mpirun.mpich2
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/ia32/lib/linux_amd64/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/ia32/lib/linux_ia32/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/lib/linux_amd64/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/lib/linux_ia32/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.2.0.0/linux64_2.6-x86-glibc_2.3.4/share/man/man1/mpirun.1.gz
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/bin/mpirun.mpich
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/bin/mpirun.mpich2
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/ia32/bin/mpirun
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/ia32/bin/mpirun.mpich
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/ia32/bin/mpirun.mpich2
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/ia32/lib/linux_amd64/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/ia32/lib/linux_ia32/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/lib/linux_amd64/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/lib/linux_ia32/libmpirun.so
>> /home/djordje/StarCCM/Install/STAR-CCM+8.06.007/mpi/platform/8.3.0.2/linux64_2.6-x86-glibc_2.3.4/share/man/man1/mpirun.1.gz
>> */usr/bin/mpirun*
>> /usr/bin/mpirun.openmpi
>> /usr/lib/openmpi/include/openmpi/ompi/runtime/mpiruntime.h
>> /usr/share/man/man1/mpirun.1.gz
>> /usr/share/man/man1/mpirun.openmpi.1.gz
>> /var/lib/dpkg/alternatives/mpirun
>> -----------------------------------------
>> This is a single machine. I actually just got it... another user used it
>> for 1-2 years.
>>
>> Is this a possible cause of the problem?
>>
>> Regards,
>> Djordje
>>
>>
>> On Mon, Apr 14, 2014 at 7:06 PM, Gus Correa <gus_at_[hidden]> wrote:
>>
>> Apologies for stirring the confusion even more by misspelling
>> "Open MPI" as "OpenMPI".
>> "OMPI" doesn't help either, because all OpenMP environment
>> variables and directives start with "OMP".
>> Maybe associating the names to
>> "message passing" vs. "threads" would help?
>>
>> Djordje:
>>
>> 'which mpif90' etc. show everything in /usr/bin.
>> So, very likely they were installed from packages
>> (yum, apt-get, rpm ...), right?
>> Have you tried something like
>> "yum list |grep mpi"
>> to see what you have?
>>
>> As Dave, Jeff and Tom said, this may be a mixup of different
>> MPI implementations at compilation (mpicc, mpif90) and runtime
>> (mpirun).
>> That is common: you may have several MPI implementations installed.
>>
>> Other possibilities that may tell what MPI you have:
>>
>> mpirun --version
>> mpif90 --showme   (Open MPI; the MPICH flag is "mpif90 -show")
>> mpicc --showme
>>
>> Yet another:
>>
>> locate mpirun
>> locate mpif90
>> locate mpicc
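Note that locate searches a filesystem database, but it is PATH order that decides which copy actually runs. A small portable sketch (an illustration, not from the thread) that lists every executable of a given name along the current PATH, in search order; it is shown with name=sh so it runs anywhere, and on the real machine you would substitute mpirun or mpif90:

```shell
# List every executable called "$name" along the current PATH, in search
# order. The first line printed is the copy the shell will actually run.
name=sh              # substitute: name=mpirun on the real machine
found=""
save_ifs=$IFS
IFS=:
for dir in $PATH; do
    if [ -x "$dir/$name" ]; then
        found="$found$dir/$name
"
    fi
done
IFS=$save_ifs
printf '%s' "$found"
```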
>>
>> The ldd output didn't show any MPI libraries; maybe they are
>> static libraries.
>>
>> An alternative is to install Open MPI from source,
>> and put it in a non-system directory
>> (not /usr/bin, not /usr/local/bin, etc).
>>
>> Is this a single machine or a cluster?
>> Or perhaps a set of PCs that you have access to?
>> If it is a cluster, do you have access to a filesystem that is
>> shared across the cluster?
>> On clusters typically /home is shared, often via NFS.
>>
>> Gus Correa
>>
>>
>> On 04/14/2014 05:15 PM, Jeff Squyres (jsquyres) wrote:
>>
>> Maybe we should rename OpenMP to be something less confusing --
>> perhaps something totally unrelated, perhaps even non-sensical.
>> That'll end lots of confusion!
>>
>> My vote: OpenMP --> SharkBook
>>
>> It's got a ring to it, doesn't it? And it sounds fearsome!
>>
>>
>>
>> On Apr 14, 2014, at 5:04 PM, "Elken, Tom" <tom.elken_at_[hidden]> wrote:
>>
>> That’s OK. Many of us make that mistake, though often as a
>> typo.
>> One thing that helps is that the correct spelling of Open
>> MPI has a space in it, but OpenMP does not.
>>
>> If not aware what OpenMP is, here is a link:
>> http://openmp.org/wp/
>>
>> What makes it more confusing is that more and more apps
>> offer the option of running in a hybrid mode, such as WRF,
>> with OpenMP threads running over MPI ranks with the same executable.
>> And sometimes that MPI is Open MPI.
>>
>>
>> Cheers,
>> -Tom
>>
>> From: users [mailto:users-bounces_at_[hidden]] On Behalf Of Djordje
>> Romanic
>> Sent: Monday, April 14, 2014 1:28 PM
>> To: Open MPI Users
>> Subject: Re: [OMPI users] mpirun runs in serial even I set
>> np to several processors
>>
>> OK guys... Thanks for all this info. Frankly, I didn't know
>> these differences between OpenMP and OpenMPI. The commands:
>> which mpirun
>> which mpif90
>> which mpicc
>> give,
>> /usr/bin/mpirun
>> /usr/bin/mpif90
>> /usr/bin/mpicc
>> respectively.
>>
>> A tutorial on how to compile WRF
>> (http://www.mmm.ucar.edu/wrf/OnLineTutorial/compilation_tutorial.php)
>>
>> provides a test program to test MPI. I ran the program and
>> it gave me the output of successful run, which is:
>> -----------------------------------------------
>>
>> C function called by Fortran
>> Values are xx = 2.00 and ii = 1
>> status = 2
>> SUCCESS test 2 fortran + c + netcdf + mpi
>> -----------------------------------------------
>>
>> It uses mpif90 and mpicc for compiling. Below is the output
>> of 'ldd ./wrf.exe':
>>
>>
>> linux-vdso.so.1 => (0x00007fff584e7000)
>> libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f4d160ab000)
>> libgfortran.so.3 => /usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007f4d15d94000)
>> libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f4d15a97000)
>> libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f4d15881000)
>> libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f4d154c1000)
>> /lib64/ld-linux-x86-64.so.2 (0x00007f4d162e8000)
>> libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007f4d1528a000)
>>
>>
>>
>> On Mon, Apr 14, 2014 at 4:09 PM, Gus Correa
>> <gus_at_[hidden]> wrote:
>> Djordje
>>
>> Your WRF configure file seems to use mpif90 and mpicc (line
>> 115 & following).
>> In addition, it also seems to have DISABLED OpenMP (NO
>> TRAILING "I")
>> (lines 109-111, where OpenMP stuff is commented out).
>> So, it looks like to me your intent was to compile with MPI.
>>
>> Whether it is THIS MPI (OpenMPI) or another MPI (say MPICH,
>> or MVAPICH,
>> or Intel MPI, or Cray, or ...) only your environment can tell.
>>
>> What do you get from these commands:
>>
>> which mpirun
>> which mpif90
>> which mpicc
>>
>> I never built WRF here (but other people here use it).
>> Which input do you provide to the command that generates the
>> configure
>> script that you sent before?
>> Maybe the full command line will shed some light on the
>> problem.
>>
>>
>> I hope this helps,
>> Gus Correa
>>
>>
>> On 04/14/2014 03:11 PM, Djordje Romanic wrote:
>> to get help :)
>>
>>
>>
>> On Mon, Apr 14, 2014 at 3:11 PM, Djordje Romanic
>> <djordje8_at_[hidden]> wrote:
>>
>> Yes, but I was hoping to get. :)
>>
>>
>> On Mon, Apr 14, 2014 at 3:02 PM, Jeff Squyres (jsquyres)
>> <jsquyres_at_[hidden]> wrote:
>>
>> If you didn't use Open MPI, then this is the wrong
>> mailing list
>> for you. :-)
>>
>> (this is the Open MPI users' support mailing list)
>>
>>
>> On Apr 14, 2014, at 2:58 PM, Djordje Romanic
>> <djordje8_at_[hidden]> wrote:
>>
>> > I didn't use OpenMPI.
>> >
>> >
>> > On Mon, Apr 14, 2014 at 2:37 PM, Jeff Squyres (jsquyres)
>> <jsquyres_at_[hidden]> wrote:
>> > This can also happen when you compile your
>> application with
>> one MPI implementation (e.g., Open MPI), but then
>> mistakenly use
>> the "mpirun" (or "mpiexec") from a different MPI
>> implementation
>> (e.g., MPICH).
>> >
>> >
>> > On Apr 14, 2014, at 2:32 PM, Djordje Romanic
>> <djordje8_at_[hidden]> wrote:
>> >
>> > > I compiled it with: x86_64 Linux, gfortran
>> compiler with
>> gcc (dmpar). dmpar - distributed memory option.
>> > >
>> > > Attached is the self-generated configuration
>> file. The
>> architecture specification settings start at line
>> 107. I didn't
>> use Open MPI (shared memory option).
>> > >
>> > >
>> > > On Mon, Apr 14, 2014 at 1:23 PM, Dave Goodell
>> (dgoodell) <dgoodell_at_[hidden]> wrote:
>> > > On Apr 14, 2014, at 12:15 PM, Djordje Romanic
>> <djordje8_at_[hidden]> wrote:
>> > >
>> > > > When I start wrf with mpirun -np 4 ./wrf.exe, I get this:
>> > > > -------------------------------------------------
>> > > > starting wrf task 0 of 1
>> > > > starting wrf task 0 of 1
>> > > > starting wrf task 0 of 1
>> > > > starting wrf task 0 of 1
>> > > > -------------------------------------------------
>>
>> > > > This indicates that it is not using 4
>> processors, but 1.
>> > > >
>> > > > Any idea what might be the problem?
>> > >
>> > > It could be that you compiled WRF with a
>> different MPI
>> implementation than you are using to run it (e.g.,
>> MPICH vs.
>> Open MPI).
>> > >
>> > > -Dave
>> > >
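The failure mode Dave describes has a simple shape: launched by a foreign mpirun, the four wrf.exe processes get no shared job context, so each one initializes MPI as a singleton and reports itself as rank 0 of a world of size 1. A sketch of that degenerate behavior (fake_wrf below is a hypothetical stand-in, not WRF or any real MPI):

```shell
# Hypothetical stand-in for a wrf.exe built against a different MPI than
# the mpirun that launched it: finding no job environment to attach to,
# it falls back to a singleton MPI_Init and sees a world of size 1.
fake_wrf() {
    echo " starting wrf task 0 of 1"
}

# A mismatched "mpirun -np 4" degenerates into 4 independent launches:
for i in 1 2 3 4; do
    fake_wrf
done
```

Contrast this with the correct output at the top of the thread, where the four ranks report "task 0 of 4" through "task 3 of 4".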
>> > > _______________________________________________
>> > > users mailing list
>> > > users_at_[hidden]
>> > > http://www.open-mpi.org/mailman/listinfo.cgi/users
>> > >
>> > >
>> > > <configure.wrf>
>> >
>> >
>> > --
>> > Jeff Squyres
>> > jsquyres_at_[hidden]
>> > For corporate legal information go to:
>> > http://www.cisco.com/web/about/doing_business/legal/cri/
>>
>>
>> --
>> Jeff Squyres
>> jsquyres_at_[hidden]
>>
>> For corporate legal information go to:
>> http://www.cisco.com/web/about/doing_business/legal/cri/
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>