Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Why compiling in global paths (only) for configuration files?
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2008-09-17 08:28:54


On Sep 17, 2008, at 5:49 AM, Paul Kapinos wrote:

>>> But setting the environment variable OPAL_PREFIX to an
>>> appropriate value (assuming PATH and LD_LIBRARY_PATH are set
>>> too) is not enough to let Open MPI rock & roll from the new
>>> location.
>> Hmm. It should be.
>
> (update) It works with "plain" Open MPI, but it does *not* work with
> Sun Cluster Tools 8.0 (which is also an Open MPI). So it seems to be
> a Sun problem and not a general Open MPI problem. Sorry for falsely
> attributing the problem.

Ah, gotcha. I guess my Sun colleagues on this list will need to
address that. ;-)

> The only trouble we have now is error messages like
>
> --------------------------------------------------------------------------
> Sorry! You were supposed to get help about:
> no hca params found
> from the file:
> help-mpi-btl-openib.txt
> But I couldn't find any file matching that name. Sorry!
> --------------------------------------------------------------------------
>
> (the job still runs without problems! :o)
>
> when running Open MPI from the new location after the old location
> has been removed. (If the old location is still present, there is no
> error, so it seems to be an attempt to access a file on the old path.)

Doh; that's weird.
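One thing that might help narrow it down (just an untested guess on my
part): check which prefix the relocated installation reports about
itself, e.g.:

shell% ompi_info | grep -i prefix

If that still shows the old installation directory, then the help-file
lookup is presumably using the compiled-in path rather than honoring
OPAL_PREFIX.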

> Maybe we have to explicitly pass the OPAL_PREFIX environment
> variable to all processes?

Hmm. I don't need to do this in my 1.2.7 installation. I do
something like this (I assume you're using rsh/ssh as a launcher?):

# OMPI installed to /home/jsquyres/bogus, then mv'ed to /home/jsquyres/bogus/foo
tcsh% set path = (/home/jsquyres/bogus/foo/bin $path)
tcsh% setenv LD_LIBRARY_PATH /home/jsquyres/bogus/foo/lib:$LD_LIBRARY_PATH
tcsh% setenv OPAL_PREFIX /home/jsquyres/bogus/foo
tcsh% mpirun --hostfile whatever hostname
...works fine
tcsh% mpicc ring.c -o ring
tcsh% mpirun --hostfile whatever --mca btl openib,self ring
...works fine
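
The same thing in a Bourne-style shell (sh/bash) would look roughly
like this (untested sketch, same paths as above):

sh% export PATH=/home/jsquyres/bogus/foo/bin:$PATH
sh% export LD_LIBRARY_PATH=/home/jsquyres/bogus/foo/lib:$LD_LIBRARY_PATH
sh% export OPAL_PREFIX=/home/jsquyres/bogus/foo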

Is this different for you?
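
If it is, one thing you could try (just an untested sketch) is
explicitly exporting OPAL_PREFIX to the launched processes with
mpirun's -x option:

tcsh% mpirun -x OPAL_PREFIX --hostfile whatever --mca btl openib,self ring

But as noted above, I haven't needed to do that with my installation.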

>>> Note one of the configuration files contained in Sun ClusterMPI 8.0
>>> (see attached file). The paths are really hard-coded instead of
>>> using variables; this makes the package not relocatable without
>>> parsing the configuration files.
>
> Did you (or anyone reading this message) have any contact with Sun
> developers to point out this circumstance? *Why* do they use hard-
> coded paths? :o)

I don't know -- this sounds like an issue with the Sun CT 8 build
process. It could also be a by-product of using the combined 32/64
feature...? I haven't used that in forever and I don't remember the
restrictions. Terry/Rolf -- can you comment?

-- 
Jeff Squyres
Cisco Systems