Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Why compiling in global paths (only) for configuration files?
From: Paul Kapinos (kapinos_at_[hidden])
Date: 2008-09-15 11:22:03

Hi Jeff, hi all!

Jeff Squyres wrote:
> Short answer: yes, we do compile in the prefix path into OMPI. Check
> out this FAQ entry; I think it'll solve your problem:

Yes, reading man pages helps!
Thank you for providing useful help.

But setting the environment variable OPAL_PREFIX to an
appropriate value (assuming PATH and LD_LIBRARY_PATH are set
accordingly, too) is not enough to let OpenMPI rock & roll from the
new location.

The reason is that all the files containing settings for
opal_wrapper, which are located in share/openmpi/ and named e.g.
mpif77-wrapper-data.txt, also contain hard-coded paths (defined at
installation time by --prefix).
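For reference, the relocation attempt looked roughly like this; the
prefix below is just the example path from this thread, substitute your
actual new install location:

```shell
# Hypothetical new location of the moved OpenMPI installation
export OPAL_PREFIX=/my/love/path/for/openmpi/blupp
export PATH="$OPAL_PREFIX/bin:$PATH"
export LD_LIBRARY_PATH="$OPAL_PREFIX/lib:$LD_LIBRARY_PATH"
```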

I have fixed the problem by parsing all the files share/openmpi/*.txt
and replacing the old path with the new path. This nasty solution seems
to work.
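The workaround was essentially the following sketch (OLD and NEW are
the example paths from this thread; adapt them to your own setup, and
note it assumes GNU sed for the -i in-place option):

```shell
#!/bin/sh
# Nasty workaround: rewrite the hard-coded prefix in every wrapper
# data file of the moved installation.
OLD=/my/love/path/for/openmpi/tmp1
NEW=/my/love/path/for/openmpi/blupp
for f in "$NEW"/share/openmpi/*.txt; do
    # Use '|' as the sed delimiter because the paths contain slashes;
    # GNU sed -i edits each file in place.
    sed -i "s|$OLD|$NEW|g" "$f"
done
```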

But is there a more elegant way to do this correctly, maybe by
re-generating the config files in share/openmpi/?

And last but not least, the FAQ entry on the web site you provided (see
link above) does not contain any info on the need to modify the wrapper
configuration files. Maybe this section should be updated?

Best regards Paul Kapinos

> On Sep 8, 2008, at 5:33 AM, Paul Kapinos wrote:
>> Hi all!
>> We are using OpenMPI on a variety of machines (running Linux,
>> Solaris/Sparc and /Opteron) with a couple of compilers (GCC, Sun
>> Studio, Intel, PGI; 32 and 64 bit...), so we have at least 15 versions
>> of each release of OpenMPI (SUN Cluster Tools not included).
>> That is, we have to support a complete petting zoo of
>> OpenMPIs. Sometimes we may need to move things around.
>> When OpenMPI is configured, the install path may be provided using
>> the --prefix keyword, like so:
>> ./configure --prefix=/my/love/path/for/openmpi/tmp1
>> After "gmake all install", an installation of OpenMPI can be
>> found in ...tmp1.
>> Then, say, we need to *move* this version to another path, say
>> /my/love/path/for/openmpi/blupp
>> Of course we have to set $PATH and $LD_LIBRARY_PATH accordingly (we
>> can do that ;-)
>> But when we tried to use OpenMPI from the new location, we got an
>> error message like
>> $ ./mpicc
>> Cannot open configuration file
>> /my/love/path/for/openmpi/tmp1/share/openmpi/mpicc-wrapper-data.txt
>> Error parsing data file mpicc: Not found
>> (note the old installation path used)
>> It looks to me as if the install path provided with --prefix at the
>> configure step is compiled into the opal_wrapper executable, and
>> opal_wrapper only works if the set of configuration files is in that
>> path. But after moving the OpenMPI installation directory the
>> configuration files aren't there...
>> A side effect of this behaviour is that binary distributions of
>> OpenMPI (RPMs) are not relocatable. That's uncomfortable. (Actually,
>> this mail was prompted by the fact that the Sun ClusterTools RPMs
>> are not relocatable.)
>> So, does this behaviour have a deeper sense I cannot recognise, or is
>> configuring global paths maybe not needed at all?
>> What I mean is that the paths to the configuration files that
>> opal_wrapper needs could be set relative, like ../share/openmpi/***,
>> without affecting the integrity of OpenMPI. Maybe there are more
>> places where the use of relative paths would be needed to allow a
>> movable (relocatable) OpenMPI.
>> What do you think about this?
>> Best regards
>> Paul Kapinos