Open MPI User's Mailing List Archives


From: George Bosilca (bosilca_at_[hidden])
Date: 2006-09-06 13:33:45

 From my perspective, some (let's say #1 and #2) of the most important
features of an application that has to last for a while are
readability and portability. And OMP code is far more readable than
pthread code. The loops look like loops, the critical sections are
obvious, and the sequential meaning of the program is preserved.

On Sep 5, 2006, at 7:52 PM, Durga Choudhury wrote:

> My opinion would be to use pthreads, for a couple of reasons:
> 1. You don't need an OMP aware compiler; any old compiler would do.

Compilers can be downloaded for free these days, and most of them
now have OMP support, on all operating systems (i.e. even the
free Microsoft compiler now has OMP support, and Windows was
definitely not the platform I expected to use for my OMP tasks).

> 2. The pthread library is better adapted and hence might be more
> optimized than the code emitted from an OMP compiler.

The pthread library adds a huge overhead to all operations. At this
level of granularity you quite often need atomic locks and operations,
not critical sections protected by mutexes. Unfortunately, there is
no portable library that gives you a common interface to atomic
operations (there was a BSD one at one point). Moreover, using
threads instead of OMP directives moves the burden onto the programmer.
Most people just cannot afford a one-year student who has to
first understand the code and then add the correct pthread calls inside.
And for what result ... you don't even know that you will get the
fastest version. On the other side, OMP compilers are getting smarter
and smarter every day. Today the results are quite impressive; just
imagine what will happen in a few years.

> If your operating system is Linux, you may use the clone() system
> call directly; this would add further optimization at the expense
> of portability.

It's always a trade-off between performance and portability. What do
you want to lose in order to get the 1% performance gain ... And in
this case the only performance gain you will get is when you start
the threads; otherwise you will not improve anything. Generally,
people prefer to use thread pools in order to avoid the overhead of
creating and destroying threads all the time.


> Durga
> On 9/5/06, George Bosilca <bosilca_at_[hidden]> wrote:
> On Sep 5, 2006, at 3:19 AM, Aidaros Dev wrote:
> > Nowadays we hear about the Intel dual-core processor. An Intel dual-core
> > processor consists of two complete execution cores in one physical
> > processor, both running at the same frequency. Both cores share the
> > same packaging and the same interface with the chipset/memory.
> > Can I use the MPI library to communicate between these processors? Can we
> > consider them as separate?
> Yes and yes. However, these architectures fit better with a different
> programming model. If you want to get the max performance out of
> them, an OMP approach (instead of MPI) is more suitable. Using
> processes on such an architecture is just a waste of performance. One
> should use a thread model, with locking to ensure the coordination
> between memory accesses. Or let the underlying libraries do their
> magic for you. As an example, most of the mathematical codes based on
> BLAS can use the GOTO BLAS (developed at TACC) to get multi-core (and
> multi-CPU) support for free, as this library will do all BLAS
> operations in parallel using multiple threads.
> george.
> _______________________________________________
> users mailing list
> users_at_[hidden]
> --
> Devil wanted omnipresence;
> He therefore created communists.

"Half of what I say is meaningless; but I say it so that the other
half may reach you"
                                   Kahlil Gibran