
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] Configure(?) problem building 1.5.3 on Scientific Linux 6.0
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2011-07-22 14:37:58


Your RUNME script is a *very* strange way to build Open MPI. It starts with a massive copy:

cp -r /home/pk224850/OpenMPI/openmpi-1.5.3/AUTHORS /home/pk224850/OpenMPI/openmpi-1.5.3/CMakeLists.txt <...much snipped...> .

Why are you doing this kind of copy? I suspect that the GNU Autotools timestamps get all out of whack when you copy the tree this way, and therefore when you run "configure", it tries to re-autogen itself.
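A quick way to see why the copy matters, assuming GNU coreutils `cp` and `stat` (the paths below are made-up demo paths, not anything from your RUNME script): a plain `cp -r` stamps every copied file with the current time, while `cp -rp` (or `cp -a`) preserves the original mtimes that the pre-generated configure/Makefile.in files rely on to look "newer" than configure.ac:

```shell
# Demo only: hypothetical paths, not from RUNME.sh.
rm -rf /tmp/ts_demo

# Create a "source tree" with an old timestamp, like a tarball would have.
mkdir -p /tmp/ts_demo/src
touch -t 202001010000 /tmp/ts_demo/src/configure.ac

# Plain recursive copy: the copy's mtime becomes "now", so generated files
# elsewhere in the tree can end up looking older than their inputs.
cp -r /tmp/ts_demo/src /tmp/ts_demo/copy_plain

# -p preserves mtimes, so the relative ordering the Autotools
# depend on survives the copy.
cp -rp /tmp/ts_demo/src /tmp/ts_demo/copy_keep

# Print the three mtimes (epoch seconds): src and copy_keep match,
# copy_plain does not.
stat -c %Y /tmp/ts_demo/src/configure.ac \
           /tmp/ts_demo/copy_plain/configure.ac \
           /tmp/ts_demo/copy_keep/configure.ac
```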

To be clear: when you expand OMPI from a tarball, you shouldn't need the GNU Autotools installed at all -- the tarball is pre-bootstrapped exactly to avoid you needing to use the Autotools (much less any specific version of the Autotools).

I suspect that if you do this:

-----
tar xf openmpi-1.5.3.tar.bz2
cd openmpi-1.5.3
./configure ....etc.
-----

everything will work just fine.
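And if the copies in RUNME are there so you can build several variants, note that Open MPI supports VPATH builds: configure in an empty directory per variant and leave the source tree untouched, no copying needed. A sketch, assuming the 1.5.3 tarball (the prefix and compiler flags are just illustrative):

```shell
tar xf openmpi-1.5.3.tar.bz2

# One empty build directory per variant; the expanded source tree
# is never modified, so its timestamps stay consistent.
mkdir build-gcc-64 && cd build-gcc-64
../openmpi-1.5.3/configure --prefix=$HOME/ompi-1.5.3-gcc-64 CC=gcc CXX=g++
make all install
```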

On Jul 22, 2011, at 11:12 AM, Paul Kapinos wrote:

> Dear Open MPI folks,
> currently I have a problem building version 1.5.3 of Open MPI on
> Scientific Linux 6.0 systems, which seems to me to be a configuration
> problem.
>
> After the configure run (which seems to terminate without an error
> code), the "gmake all" stage produces errors and exits.
>
> Typical is the output below.
>
> Curiously, the 1.4.3 version can be built on the same computer with no
> special trouble, and both the 1.4.3 and 1.5.3 versions can be built on
> another computer running CentOS 5.6.
>
> In each case I build 16 versions in total (4 compilers * 32 bit/64 bit *
> multithreading support ON/OFF). The same error arises in all 16 versions.
>
> Can someone give a hint about how to avoid this issue? Thanks!
>
> Best wishes,
>
> Paul
>
>
> Some logs and configure are downloadable here:
> https://gigamove.rz.rwth-aachen.de/d/id/2jM6MEa2nveJJD
>
> The configure line is in RUNME.sh, and the logs of the configure and
> build stages are in the log_* files; I also attached the config.log
> file and the configure script itself (the standard one from the 1.5.3
> release).
>
>
> ######################################################################
>
>
> CDPATH="${ZSH_VERSION+.}:" && cd . && /bin/sh
> /tmp/pk224850/linuxc2_11254/openmpi-1.5.3mt_linux64_gcc/config/missing
> --run aclocal-1.11 -I config
> sh: config/ompi_get_version.sh: No such file or directory
> /usr/bin/m4: esyscmd subprocess failed
>
> <last message repeated 12x>
>
> configure.ac:953: warning: OMPI_CONFIGURE_SETUP is m4_require'd but not
> m4_defun'd
> config/ompi_mca.m4:37: OMPI_MCA is expanded from...
> configure.ac:953: the top level
> configure.ac:953: warning: AC_COMPILE_IFELSE was called before
> AC_USE_SYSTEM_EXTENSIONS
> ../../lib/autoconf/specific.m4:386: AC_USE_SYSTEM_EXTENSIONS is expanded
> from...
> opal/mca/paffinity/hwloc/hwloc/config/hwloc.m4:152:
> HWLOC_SETUP_CORE_AFTER_C99 is expanded from...
> ../../lib/m4sugar/m4sh.m4:505: AS_IF is expanded from...
> opal/mca/paffinity/hwloc/hwloc/config/hwloc.m4:22: HWLOC_SETUP_CORE is
> expanded from...
> opal/mca/paffinity/hwloc/configure.m4:40: MCA_paffinity_hwloc_CONFIG is
> expanded from...
> config/ompi_mca.m4:540: MCA_CONFIGURE_M4_CONFIG_COMPONENT is expanded
> from...
> config/ompi_mca.m4:326: MCA_CONFIGURE_FRAMEWORK is expanded from...
> config/ompi_mca.m4:247: MCA_CONFIGURE_PROJECT is expanded from...
> configure.ac:953: warning: AC_RUN_IFELSE was called before
> AC_USE_SYSTEM_EXTENSIONS
>
>
>
>
> --
> Dipl.-Inform. Paul Kapinos - High Performance Computing,
> RWTH Aachen University, Center for Computing and Communication
> Seffenter Weg 23, D 52074 Aachen (Germany)
> Tel: +49 241/80-24915
>
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users

-- 
Jeff Squyres
jsquyres_at_[hidden]
For corporate legal information go to:
http://www.cisco.com/web/about/doing_business/legal/cri/