
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] Can't get a fully functional openmpi build on Mac OSX Mavericks
From: Ronald Cohen (rhcohen_at_[hidden])
Date: 2014-01-15 13:49:30


I just sent out another post with the c file attached.

If you can get that to work (and even if you can't), can you tell me what
configure options you used, and which version of Open MPI? Thanks.


On Wed, Jan 15, 2014 at 10:36 AM, Ralph Castain <rhc_at_[hidden]> wrote:

> BTW: could you send me your sample test code?
> On Wed, Jan 15, 2014 at 10:34 AM, Ralph Castain <rhc_at_[hidden]> wrote:
>> I regularly build on Mavericks and run without problem, though I haven't
>> tried a parallel IO app. I'll give yours a try later, when I get back to my
>> Mac.
>> On Wed, Jan 15, 2014 at 10:04 AM, Ronald Cohen <rhcohen_at_[hidden]> wrote:
>>> I have been struggling to get a usable build of openmpi on Mac
>>> OSX Mavericks (10.9.1). I can get openmpi to configure and build without
>>> error, but run into problems after that which depend on the openmpi version.
>>> With 1.6.5, make check fails the opal_datatype_test, ddt_test, and
>>> ddt_raw tests. The various atomic_* tests pass. See checklogs_1.6.5,
>>> attached as a .gz file.
>>> Following suggestions from openmpi discussions I tried openmpi version
>>> 1.7.4rc1. In this case make check indicates all tests passed. But when I
>>> proceeded to try to build a parallel code (parallel HDF5) it failed.
>>> Following an email exchange with the HDF5 support people, they suggested I
>>> try to compile and run the attached bit of simple code Sample_mpio.c (which
>>> they supplied) which does not use any HDF5, but just attempts a parallel
>>> write to a file and parallel read. That test failed when requesting more
>>> than 1 processor -- which they say indicates a failure of the openmpi
>>> installation. The error message was:
>>> MPI_INIT: argc 1
>>> MPI_INIT: argc 1
>>> Testing simple C MPIO program with 2 processes accessing file
>>> ./
>>> (Filename can be specified via program argument)
>>> Proc 0: hostname=Ron-Cohen-MBP.local
>>> Proc 1: hostname=Ron-Cohen-MBP.local
>>> Proc 0: MPI_File_open with MPI_MODE_EXCL failed (MPI_ERR_FILE: invalid
>>> file)
>>> MPI_ABORT[0]: comm MPI_COMM_WORLD errorcode 1
>>> MPI_BCAST[1]: buffer 7fff5a483048 count 1 datatype MPI_INT root 0 comm
>>> I then went back to my openmpi directories and tried running some of the
>>> individual tests in the test and examples directories. In particular in
>>> test/class I found one test that seems not to be run as part of make check
>>> and which failed, even with one processor: opal_bitmap. I am not sure whether
>>> this is because 1.7.4rc1 is incomplete, something is wrong with the
>>> installation, or it is perhaps a 32- vs. 64-bit issue. The error message was:
>>> mpirun detected that one or more processes exited with non-zero status,
>>> thus causing the job to be terminated. The first process to do so was:
>>> Process name: [[48805,1],0]
>>> Exit code: 255
>>> Any suggestions?
>>> More generally, has anyone out there gotten an openmpi build on Mavericks
>>> to work with sufficient success that they can get the attached
>>> Sample_mpio.c (or better yet, parallel HDF5) to build?
>>> Details: Running Mac OS X 10.9.1 on a mid-2009 Macbook pro with 4 GB
>>> memory; tried openmpi 1.6.5 and 1.7.4rc1. Built openmpi against the stock
>>> gcc that comes with XCode 5.0.2, and gfortran 4.9.0.
>>> Files attached: config.log.gz, openmpialllog.gz (output of running
>>> ompi_info --all), checklog2.gz (output of make check in top openmpi
>>> directory).
>>> I am not attaching logs of make and install since those seem to have
>>> been successful, but can generate those if that would be helpful.
>>> _______________________________________________
>>> users mailing list
>>> users_at_[hidden]