
Subject: [OMPI users] Hybrid MPI/Pthreads program behaves differently on two different machines with same hardware
From: 吕慧伟 (lvhuiwei_at_[hidden])
Date: 2011-10-24 02:37:02


Dear List,

I have a hybrid MPI/Pthreads program named "my_hybrid_app". The program is
memory-intensive and uses multi-threading to improve memory throughput. I run
"my_hybrid_app" on two machines that have the same hardware configuration but
different OS and GCC versions. The problem is: with one process, the two
machines behave the same, and performance improves as the number of threads
increases; with two or more processes, however, the first machine still gains
performance with more threads, while the second machine's performance degrades
as threads are added.
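
For context, the overall structure of the program is roughly as follows (a
simplified sketch with placeholder names, not the actual source; the real
memory kernel and inter-rank communication are omitted):

#include <mpi.h>
#include <pthread.h>
#include <stdlib.h>

static void *memory_kernel(void *arg)
{
    /* Placeholder for the memory-bound work each thread performs. */
    return NULL;
}

int main(int argc, char **argv)
{
    int provided, i, nthreads;
    pthread_t *tids;

    nthreads = (argc > 1) ? atoi(argv[1]) : 1;

    /* MPI calls stay on the main thread; the worker threads only compute,
       so MPI_THREAD_FUNNELED is sufficient in this sketch. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

    tids = malloc(nthreads * sizeof(pthread_t));
    for (i = 0; i < nthreads; i++)
        pthread_create(&tids[i], NULL, memory_kernel, NULL);
    for (i = 0; i < nthreads; i++)
        pthread_join(tids[i], NULL);

    /* ... results would be exchanged/reduced between ranks here ... */

    free(tids);
    MPI_Finalize();
    return 0;
}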

Since "my_hybrid_app" behaves correctly with a single process, I suspect there
is a problem with how it is linked against the MPI library. Would somebody
point me in the right direction? Thanks in advance.

Below are the command lines used, my machine information, and the linking
information.
p.s. 1: Command lines

Single process: ./my_hybrid_app <number of threads>
Multiple processes: mpirun -np 2 ./my_hybrid_app <number of threads>

p.s. 2: Machine information

The first machine is CentOS 5.3 with GCC 4.1.2:

Target: x86_64-redhat-linux

Configured with: ../configure --prefix=/usr --mandir=/usr/share/man
--infodir=/usr/share/info --enable-shared --enable-threads=posix
--enable-checking=release --with-system-zlib --enable-__cxa_atexit
--disable-libunwind-exceptions --enable-libgcj-multifile
--enable-languages=c,c++,objc,obj-c++,java,fortran,ada --enable-java-awt=gtk
--disable-dssi --enable-plugin
--with-java-home=/usr/lib/jvm/java-1.4.2-gcj-1.4.2.0/jre --with-cpu=generic
--host=x86_64-redhat-linux

Thread model: posix

gcc version 4.1.2 20080704 (Red Hat 4.1.2-44)

The second machine is SUSE Enterprise Server 11 with GCC 4.3.4:

Target: x86_64-suse-linux

Configured with: ../configure --prefix=/usr --infodir=/usr/share/info
--mandir=/usr/share/man --libdir=/usr/lib64 --libexecdir=/usr/lib64
--enable-languages=c,c++,objc,fortran,obj-c++,java,ada
--enable-checking=release --with-gxx-include-dir=/usr/include/c++/4.3
--enable-ssp --disable-libssp
--with-bugurl=http://bugs.opensuse.org/ --with-pkgversion='SUSE Linux'
--disable-libgcj --disable-libmudflap
--with-slibdir=/lib64 --with-system-zlib --enable-__cxa_atexit
--enable-libstdcxx-allocator=new --disable-libstdcxx-pch
--enable-version-specific-runtime-libs --program-suffix=-4.3
--enable-linux-futex --without-system-libunwind --with-cpu=generic
--build=x86_64-suse-linux

Thread model: posix

gcc version 4.3.4 [gcc-4_3-branch revision 152973] (SUSE Linux)

p.s. 3: ldd information

The first machine:
$ ldd my_hybrid_app
        libm.so.6 => /lib64/libm.so.6 (0x000000358d400000)
        libmpi.so.0 => /usr/local/openmpi/lib/libmpi.so.0 (0x00002af0d53a7000)
        libopen-rte.so.0 => /usr/local/openmpi/lib/libopen-rte.so.0 (0x00002af0d564a000)
        libopen-pal.so.0 => /usr/local/openmpi/lib/libopen-pal.so.0 (0x00002af0d5895000)
        libdl.so.2 => /lib64/libdl.so.2 (0x000000358d000000)
        libnsl.so.1 => /lib64/libnsl.so.1 (0x000000358f000000)
        libutil.so.1 => /lib64/libutil.so.1 (0x000000359a600000)
        libgomp.so.1 => /usr/lib64/libgomp.so.1 (0x00002af0d5b07000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x000000358d800000)
        libc.so.6 => /lib64/libc.so.6 (0x000000358cc00000)
        /lib64/ld-linux-x86-64.so.2 (0x000000358c800000)
        librt.so.1 => /lib64/librt.so.1 (0x000000358dc00000)
The second machine:
$ ldd my_hybrid_app
        linux-vdso.so.1 => (0x00007fff3eb5f000)
        libmpi.so.0 => /root/opt/openmpi/lib/libmpi.so.0 (0x00007f68627a1000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f686254b000)
        libopen-rte.so.0 => /root/opt/openmpi/lib/libopen-rte.so.0 (0x00007f68622fc000)
        libopen-pal.so.0 => /root/opt/openmpi/lib/libopen-pal.so.0 (0x00007f68620a5000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007f6861ea1000)
        libnsl.so.1 => /lib64/libnsl.so.1 (0x00007f6861c89000)
        libutil.so.1 => /lib64/libutil.so.1 (0x00007f6861a86000)
        libgomp.so.1 => /usr/lib64/libgomp.so.1 (0x00007f686187d000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f6861660000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f6861302000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f6862a58000)
        librt.so.1 => /lib64/librt.so.1 (0x00007f68610f9000)
On the second machine, I installed openmpi-1.4.2 into a local directory,
/root/opt/openmpi, and used "-L/root/opt/openmpi -Wl,-rpath,/root/opt/openmpi"
when linking.

-- 
Huiwei Lv
Ph.D. student at the Institute of Computing Technology,
Beijing, China
http://asg.ict.ac.cn/lhw