Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] IMB Sendrecv hangs with OpenMPI 1.6.5 and XRC
From: Jeff Squyres (jsquyres) (jsquyres_at_[hidden])
Date: 2014-04-23 11:19:28


A few suggestions:

- Try using Open MPI 1.8.1. It's the newest release and has many improvements over the 1.6.x series. (A quick way to check which version you're actually running is shown after these suggestions.)

- Try using "--mca btl openib,sm,self" (in both v1.6.x and v1.8.x). This allows Open MPI to use shared memory to communicate between processes on the same server, which can be a significant performance improvement over TCP or even IB.
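
For example, a minimal invocation sketch (the process count, hostfile name, and IMB binary path are placeholders for your setup):

    mpirun --mca btl openib,sm,self -np 32 -hostfile myhosts ./IMB-MPI1 Sendrecv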

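To confirm which Open MPI installation you're actually picking up (assuming ompi_info comes from the same install as the mpirun on your PATH):

    ompi_info | grep "Open MPI:"
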
On Apr 23, 2014, at 11:10 AM, "Sasso, John (GE Power & Water, Non-GE)" <John1.Sasso_at_[hidden]> wrote:

> I am running IMB (the Intel MPI Benchmarks), specifically the MPI-1 benchmarks, built with the Intel 12.1 compiler suite and OpenMPI 1.6.5 (and run with OMPI 1.6.5). I decided to use the following MCA parameters:
>
> --mca btl openib,tcp,self --mca btl_openib_receive_queues X,9216,256,128,32:X,65536,256,128,32
>
> whereas before I had always used "--mca btl openib,tcp,self". This is for performance analysis. On the SendRecv benchmark at 32 processes, IMB hangs. I then tried:
>
> --mca btl_openib_receive_queues X,128,256,192,128:X,2048,256,128,32:X,12288,256,128,32:X,65536,256,128,32
>
> and IMB also hangs on the SendRecv benchmark, though at 64 processes.
>
> No errors have been recorded, not even in any system log files, but 'top' shows the MPI tasks still running. How can I go about troubleshooting this hang, as well as figuring out what (if any) XRC-related values I have to specify in btl_openib_receive_queues to get IMB running properly? I did verify that the IB cards are ConnectX.
>
> --john
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users
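
As for troubleshooting the hang itself: a generic first step (assuming gdb is available on the compute nodes; <pid> is a placeholder for a hung IMB process) is to attach a debugger and dump all thread stacks:

    gdb -batch -ex "thread apply all bt" -p <pid>

If every rank is blocked in the same MPI call, the hang is in the communication layer rather than in IMB itself.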

-- 
Jeff Squyres
jsquyres_at_[hidden]
For corporate legal information go to: http://www.cisco.com/web/about/doing_business/legal/cri/