Open MPI Development Mailing List Archives


Subject: Re: [OMPI devel] [OMPI users] Error in openmpi-1.9a1r29179
From: Siegmar Gross (Siegmar.Gross_at_[hidden])
Date: 2013-09-18 06:18:03


Hello Josh,

thank you very much for your help. Unfortunately, I still have a
problem building Open MPI.

> I pushed a bunch of fixes, can you please try now.

I tried to build openmpi-1.9a1r29197 on my platforms, and now I get
the following error on all of them.

linpc1 openmpi-1.9a1r29197-Linux.x86_64.64_cc 117 tail -22 log.make.Linux.x86_64.64_cc
  CC base/memheap_base_alloc.lo
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/amd64/atomic.h", line 136: warning: parameter in inline asm statement unused: %3
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/amd64/atomic.h", line 182: warning: parameter in inline asm statement unused: %2
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/amd64/atomic.h", line 203: warning: parameter in inline asm statement unused: %2
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/amd64/atomic.h", line 224: warning: parameter in inline asm statement unused: %2
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/amd64/atomic.h", line 245: warning: parameter in inline asm statement unused: %2
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/atomic_impl.h", line 167: warning: statement not reached
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/atomic_impl.h", line 192: warning: statement not reached
"../../../../openmpi-1.9a1r29197/opal/include/opal/sys/atomic_impl.h", line 217: warning: statement not reached
"../../../../openmpi-1.9a1r29197/oshmem/mca/spml/spml.h", line 76: warning: anonymous union declaration
"../../../../openmpi-1.9a1r29197/oshmem/mca/memheap/base/memheap_base_alloc.c", line 112: warning: argument mismatch
"../../../../openmpi-1.9a1r29197/oshmem/mca/memheap/base/memheap_base_alloc.c", line 119: warning: argument mismatch
"../../../../openmpi-1.9a1r29197/oshmem/mca/memheap/base/memheap_base_alloc.c", line 124: warning: argument mismatch
"../../../../openmpi-1.9a1r29197/oshmem/mca/memheap/base/memheap_base_alloc.c", line 248: warning: pointer to void or function used in arithmetic
"../../../../openmpi-1.9a1r29197/oshmem/mca/memheap/base/memheap_base_alloc.c", line 286: syntax error before or at: |
"../../../../openmpi-1.9a1r29197/oshmem/mca/memheap/base/memheap_base_alloc.c", line 300: warning: pointer to void or function used in arithmetic
cc: acomp failed for ../../../../openmpi-1.9a1r29197/oshmem/mca/memheap/base/memheap_base_alloc.c
make[2]: *** [base/memheap_base_alloc.lo] Error 1
make[2]: Leaving directory `/export2/src/openmpi-1.9/openmpi-1.9a1r29197-Linux.x86_64.64_cc/oshmem/mca/memheap'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/export2/src/openmpi-1.9/openmpi-1.9a1r29197-Linux.x86_64.64_cc/oshmem'
make: *** [all-recursive] Error 1

Kind regards

Siegmar

> -----Original Message-----
> From: Jeff Squyres (jsquyres) [mailto:jsquyres_at_[hidden]]
> Sent: Tuesday, September 17, 2013 6:37 AM
> To: Siegmar Gross; Open MPI Developers List
> Cc: Joshua Ladd
> Subject: Re: [OMPI users] Error in openmpi-1.9a1r29179
>
> ...moving over to the devel list...
>
> Dave and I looked at this during a break in the EuroMPI conference, and noticed several things:
>
> 1. Some of the shmem interfaces are functions (i.e., return non-void) and some are subroutines (i.e., return void). They're currently all using a single macro
> to declare the interfaces, which assumes functions. So this macro is incorrect for subroutines -- you really need 2 macros.
>
> 2. The macro name is OMPI_GENERATE_FORTRAN_BINDINGS -- why isn't it SHMEM_GENERATE_FORTRAN_BINDINGS?
>
> 3. I notice that none of the Fortran interfaces are prototyped in shmem.fh. Why not? A shmem person here in Madrid mentioned that there is supposed to be a
> shmem.fh file and a shmem modulefile.
>
>
> On Sep 17, 2013, at 8:49 AM, Siegmar Gross <Siegmar.Gross_at_[hidden]> wrote:
>
> > Hi,
> >
> > I tried to install openmpi-1.9a1r29179 on "openSuSE Linux 12.1",
> > "Solaris 10 x86_64", and "Solaris 10 sparc" with "Sun C 5.12" in
> > 64-bit mode. Unfortunately "make" breaks with the same error on all
> > platforms.
> >
> > tail -15 log.make.Linux.x86_64.64_cc
> >
> > CCLD libshmem_c.la
> > make[3]: Leaving directory `/export2/src/openmpi-1.9/openmpi-1.9a1r29179-Linux.x86_64.64_cc/oshmem/shmem/c'
> > make[2]: Leaving directory `/export2/src/openmpi-1.9/openmpi-1.9a1r29179-Linux.x86_64.64_cc/oshmem/shmem/c'
> > Making all in shmem/fortran
> > make[2]: Entering directory `/export2/src/openmpi-1.9/openmpi-1.9a1r29179-Linux.x86_64.64_cc/oshmem/shmem/fortran'
> > CC start_pes_f.lo
> > "../../../../openmpi-1.9a1r29179/oshmem/shmem/fortran/start_pes_f.c", line 16: void function cannot return value
> > "../../../../openmpi-1.9a1r29179/oshmem/shmem/fortran/start_pes_f.c", line 16: void function cannot return value
> > "../../../../openmpi-1.9a1r29179/oshmem/shmem/fortran/start_pes_f.c", line 16: void function cannot return value
> > cc: acomp failed for ../../../../openmpi-1.9a1r29179/oshmem/shmem/fortran/start_pes_f.c
> > make[2]: *** [start_pes_f.lo] Error 1
> > make[2]: Leaving directory `/export2/src/openmpi-1.9/openmpi-1.9a1r29179-Linux.x86_64.64_cc/oshmem/shmem/fortran'
> > make[1]: *** [all-recursive] Error 1
> > make[1]: Leaving directory `/export2/src/openmpi-1.9/openmpi-1.9a1r29179-Linux.x86_64.64_cc/oshmem'
> > make: *** [all-recursive] Error 1
> >
> >
> > I configured with the following command.
> >
> > ../openmpi-1.9a1r29179/configure --prefix=/usr/local/openmpi-1.9_64_cc \
> >   --libdir=/usr/local/openmpi-1.9_64_cc/lib64 \
> >   --with-jdk-bindir=/usr/local/jdk1.7.0_07-64/bin \
> >   --with-jdk-headers=/usr/local/jdk1.7.0_07-64/include \
> >   JAVA_HOME=/usr/local/jdk1.7.0_07-64 \
> >   LDFLAGS="-m64" \
> >   CC="cc" CXX="CC" FC="f95" \
> >   CFLAGS="-m64" CXXFLAGS="-m64 -library=stlport4" FCFLAGS="-m64" \
> >   CPP="cpp" CXXCPP="cpp" \
> >   CPPFLAGS="" CXXCPPFLAGS="" \
> >   --enable-mpi-cxx \
> >   --enable-cxx-exceptions \
> >   --enable-mpi-java \
> >   --enable-heterogeneous \
> >   --enable-opal-multi-threads \
> >   --enable-mpi-thread-multiple \
> >   --with-threads=posix \
> >   --with-hwloc=internal \
> >   --without-verbs \
> >   --without-udapl \
> >   --without-sctp \
> >   --with-wrapper-cflags=-m64 \
> >   --enable-debug \
> >   |& tee log.configure.$SYSTEM_ENV.$MACHINE_ENV.64_cc
> >
> >
> >
> > I would be grateful if somebody can fix the bug. Thank you very much
> > for any help in advance.
> >
> >
> > Kind regards
> >
> > Siegmar
> >
> >
> > _______________________________________________
> > users mailing list
> > users_at_[hidden]
> > http://www.open-mpi.org/mailman/listinfo.cgi/users
>
>
> --
> Jeff Squyres
> jsquyres_at_[hidden]
> For corporate legal information go to: http://www.cisco.com/web/about/doing_business/legal/cri/
>
>