Open MPI User's Mailing List Archives

From: Brian Barrett (bbarrett_at_[hidden])
Date: 2007-03-06 10:32:53


Sure, we can add a FAQ entry on that :).

At present, configure decides whether Open MPI will be installed on a
case-sensitive filesystem based on the behavior of the filesystem it
is built on. That is far from perfect, but it covers 99.9% of the
cases. You happen to be in the 0.1%, but we do have an option for
you: you can pass --with-cs-fs or --without-cs-fs to specify whether
the installation filesystem is case sensitive or not (overriding the
auto-detection).
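
For example, for your setup it would be something like this (just a
sketch; the --prefix value is taken from your message below, and it's
worth double-checking the flag spelling against "./configure --help"
on your version):

  ./configure --prefix=/usr/local/openmpi-1.1.4_32bits --without-cs-fs
  make all install

i.e., build on the UFS partition, but tell Open MPI that the
filesystem it will actually be installed on (HFS+) is case
insensitive.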

Of course, I suppose I could add a sanity check during "make install"
to ensure that the installation filesystem really is case sensitive
if we expect it to be. mmm... I'll add that to the long term todo
list. For now, I think a FAQ entry will do.
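
If I were to sketch that check, it would be something along these
lines (purely hypothetical; nothing like this exists in the build
system today):

  # during "make install", after the install prefix has been created
  touch "$prefix/ompi_case_check"
  if test -f "$prefix/OMPI_CASE_CHECK"; then
    echo "WARNING: installation filesystem is case insensitive"
  fi
  rm -f "$prefix/ompi_case_check"

i.e., create a lower-case marker file in the install prefix and see
whether its upper-case name resolves to the same file.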

Brian
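
P.S. A quick way to check which compiler a given wrapper will actually
invoke, without compiling anything, is the wrappers' -showme option
(assuming your version supports it):

  /usr/local/openmpi-1.1.4_32bits/bin/mpicc -showme

On a correct installation this should print a gcc command line for
mpicc and a g++ command line for mpic++.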

On Mar 6, 2007, at 2:24 AM, Christian Simon wrote:

> Dear developers,
>
> I "switched" from Lam-MPI to Open MPI recently. I am using MacOS X
> server
> on small clusters, previously with XLF/XLC on G5, now gfortran/gcc
> with Intels.
>
> Since our users are used to Unix filesystems, and since most
> application/library builds are not aware of the case insensitivity
> of the HFS+ filesystem, I have installed a UFS-formatted disk on our
> new cluster.
>
> Being a careful administrator, I configured and compiled Open MPI as
> a user on the UFS partition.
> Then I installed it as root on an HFS+ system partition.
>
> When I tried to install ScaLAPACK, the BLACS compilation failed miserably:
>
> BI_EmergencyBuff.c: In function 'void BI_EmergencyBuff(int)':
> BI_EmergencyBuff.c:34: error: invalid conversion from 'void*' to
> 'char*'
> make[2]: *** [BI_EmergencyBuff.o] Error 1
> make[1]: *** [INTERN] Error 2
> make: *** [MPI] Error 2
>
> This is, I guess, due to confusion between the wrappers:
>
> $/usr/local/openmpi-1.1.4_32bits/bin/mpic++
> i686-apple-darwin8-g++-4.0.1: no input files
>
> seems ok, but:
>
> $ /usr/local/openmpi-1.1.4_32bits/bin/mpicc
> i686-apple-darwin8-g++-4.0.1: no input files
>
> is wrong...
> Re-compiling Open MPI on an HFS+ filesystem, I get:
>
> $ /usr/local/openmpi-1.1.4_32bits_hfs/bin/mpic++
> i686-apple-darwin8-g++-4.0.1: no input files
>
> and
>
> $ /usr/local/openmpi-1.1.4_32bits_hfs/bin/mpicc
> i686-apple-darwin8-gcc-4.0.1: no input files
>
> which is correct.
> Then BLACS, ScaLAPACK, and the others compile without trouble.
> (I have not tested execution yet!)
>
> Is my explanation right?
>
> If so, although the documentation is excellent and the FAQ is
> already well detailed, could you please add a caveat somewhere that
> Open MPI's configure is smarter than average: it is aware of
> filesystem case sensitivity.
>
> Anyway, many thanks for your great, great job!
> --
> Dr. Christian SIMON, Maître de Conférences
> Laboratoire LI2C-UMR7612, Bât. F74, pièce 757
> Université Pierre et Marie Curie   Tel: +33.1.44.27.32.65
> Case 51                            Fax: +33.1.44.27.32.28
> 4 Place Jussieu
> 75252 Paris Cedex 05
> France/Europe
>
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users