Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] problem with mpirun
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2010-06-11 07:15:15


I'm afraid I don't know anything about OpenFoam, but it looks like it deliberately chose to abort due to some error (i.e., it detected a problem and then called MPI_ABORT).
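
To illustrate what that means, here's a minimal sketch (plain MPI, not OpenFoam's actual code, and the error check is just a stand-in) of an application calling MPI_Abort when it hits a fatal condition; this is what produces the "MPI_ABORT was invoked on rank 1" help message you see in your output:

    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char** argv)
    {
        MPI_Init(&argc, &argv);

        int rank = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        // Stand-in for whatever consistency check failed in your run.
        bool fatal_error_detected = (rank == 1);
        if (fatal_error_detected) {
            std::fprintf(stderr, "[%d] fatal error, aborting\n", rank);
            // MPI_Abort kills all processes in the communicator; other
            // ranks may or may not get to flush their output first.
            MPI_Abort(MPI_COMM_WORLD, 1);
        }

        MPI_Finalize();
        return 0;
    }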

I don't know what those stack traces mean; you will likely have better luck asking your question on the OpenFoam support list.

Good luck!

On Jun 11, 2010, at 5:03 AM, <asmae.elbahlouli_at_[hidden]> wrote:

> Hello,
> I'm working through an OpenFoam tutorial, but when I run it in parallel by typing "mpirun -np 30 foamProMesh -parallel | tee 2>&1 log/FPM.log",
> the terminal window iterates for a few seconds and then ends with the following:
>
>
> tta201_at_linux-qv31:/media/OpenFoam/Travaux/F1car_asmaetest> mpirun -np 30 -machinefile machinefile foamProMesh -parallel | tee 2>&1 log/FPM.log
> /*---------------------------------------------------------------------------*\
> | | |
> | F ield | FOAM: The Open Source CFD Toolbox |
> | O peration | Version: 1.5-2.2 |
> | A nd | Web: http://www.iconcfd.com |
> | M anipulation | |
> \*---------------------------------------------------------------------------*/
> Exec : foamProMesh -parallel
> Date : Jun 11 2010
> Time : 10:42:24
> Host : Foam1
> PID : 9789
> Case : /media/OpenFoam/Travaux/F1car_asmaetest
> nProcs : 30
> Slaves :
> 29
> (
> Foam1.9790
> Foam1.9791
> Foam1.9792
> Foam2.9224
> Foam2.9225
> Foam2.9226
> Foam2.9227
> Foam3.8925
> Foam3.8926
> Foam3.8927
> ......
>
> Added patches in = 0 s
>
> Selecting decompositionMethod hierarchical
>
> Overall mesh bounding box : (-5.60160988792 -5.00165616875 -0.259253998544) (9.39931715541 5.00165616875 5.74982363461)
> Relative tolerance : 1e-06
> Absolute matching distance : 1.90053435613e-05
>
>
> Determining initial surface intersections
> -----------------------------------------
>
> [1]
> [1]
> [1] Face 42832 is in zone 5, its coupled face is in zone -1
> #0 Foam::error::printStack(Foam::Ostream&) in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libFOAM.so"
> #1 Foam::error::abort() in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libFOAM.so"
> #2 Foam::meshRefinement::checkCoupledFaceZones(Foam::polyMesh const&) in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libautoMesh.so"
> #3 main in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/applications/bin/linux64GccDPOpt/foamProMesh"
> #4 __libc_start_main in "/lib64/libc.so.6"
> #5 __gxx_personality_v0 at /usr/src/packages/BUILD/glibc-2.3/csu/../sysdeps/x86_64/elf/start.S:116
> [1]
> [1]
> [1] From function meshRefinement::checkCoupledFaceZones(const polyMesh&)
> [1] in file autoHexMesh/meshRefinement/meshRefinement.C at line 1180.
> [1]
> FOAM parallel run aborting
> [1]
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> [11]
> [11]
> [11] Face 42862 is in zone -1, its coupled face is in zone 5
> #0 Foam::error::printStack(Foam::Ostream&) in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libFOAM.so"
> #1 Foam::error::abort() in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libFOAM.so"
> #2 Foam::meshRefinement::checkCoupledFaceZones(Foam::polyMesh const&) in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libautoMesh.so"
> #3 main in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/applications/bin/linux64GccDPOpt/foamProMesh"
> #4 __libc_start_main in "/lib64/libc.so.6"
> #5 __gxx_personality_v0 at /usr/src/packages/BUILD/glibc-2.3/csu/../sysdeps/x86_64/elf/start.S:116
> [11]
> [11]
> [11] From function meshRefinement::checkCoupledFaceZones(const polyMesh&)
> [11] in file autoHexMesh/meshRefinement/meshRefinement.C at line 1180.
> [11]
> FOAM parallel run aborting
> [11]
> [8]
> [8]
> [8] Face 41663 is in zone -1, its coupled face is in zone 5
> #0 Foam::error::printStack(Foam::Ostream&) in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libFOAM.so"
> #1 Foam::error::abort() in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libFOAM.so"
> #2 Foam::meshRefinement::checkCoupledFaceZones(Foam::polyMesh const&) in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/lib/linux64GccDPOpt/libautoMesh.so"
> #3 main in "/media/OpenFoam/FOAMpro/FOAMpro-1.5-2.2/FOAM-1.5-2.2/applications/bin/linux64GccDPOpt/foamProMesh"
> #4 __libc_start_main in "/lib64/libc.so.6"
> #5 __gxx_personality_v0 at /usr/src/packages/BUILD/glibc-2.3/csu/../sysdeps/x86_64/elf/start.S:116
> [8]
> [8]
> [8] From function meshRefinement::checkCoupledFaceZones(const polyMesh&)
> [8] in file autoHexMesh/meshRefinement/meshRefinement.C at line 1180.
> [8]
> FOAM parallel run aborting
> [8]
> --------------------------------------------------------------------------
> mpirun has exited due to process rank 1 with PID 9790 on
> node 10.0.0.1 exiting without calling "finalize". This may
> have caused other processes in the application to be
> terminated by signals sent by mpirun (as reported here).
> --------------------------------------------------------------------------
> [linux-qv31:16344] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
> [linux-qv31:16344] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users

-- 
Jeff Squyres
jsquyres_at_[hidden]
For corporate legal information go to:
http://www.cisco.com/web/about/doing_business/legal/cri/