Open MPI User's Mailing List Archives

Subject: [OMPI users] MPIIO and EXT3 file systems
From: Tom Rosmond (rosmond_at_[hidden])
Date: 2011-08-18 11:46:46


We have a large Fortran application that can do its I/O either with MPI-IO
or with Fortran direct access. On a Linux workstation (16 AMD cores)
running Open MPI 1.5.3 and Intel Fortran 12.0 we are seeing random
failures with the MPI-IO option that do not occur with conventional
Fortran direct access. We are using ext3 file systems, and I have seen
some references hinting at similar problems with the ext3/MPI-IO
combination. The application runs flawlessly with the MPI-IO option on
Cray architectures with Lustre file systems, which makes us further
suspect the ext3/MPI-IO combination. Does anyone else have experience
with this combination who could shed some light on the problem, and
hopefully suggest some solutions?
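[Editor's note: as a troubleshooting sketch, not a confirmed fix for this report: ROMIO, the MPI-IO implementation used by Open MPI, reads extra hints from a file named by the `ROMIO_HINTS` environment variable, and disabling data sieving via those hints is a common first experiment when MPI-IO misbehaves on a particular local file system. The hint names below are standard ROMIO hints; the file path and the `mpirun` command line are illustrative only.]

```shell
# Write a ROMIO hints file that turns off data sieving for reads and writes.
# Whether this helps on ext3 is exactly what one would be testing here.
cat > romio_hints <<'EOF'
romio_ds_read disable
romio_ds_write disable
EOF

# Point ROMIO at the hints file before launching the application.
export ROMIO_HINTS=$PWD/romio_hints

# Hypothetical launch of the Fortran application with the hints in effect:
# mpirun -np 16 ./my_app
```

If the random failures disappear with data sieving disabled, that narrows the problem to ROMIO's sieving/locking path on ext3 rather than to the application's MPI-IO calls.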

T. Rosmond