Open MPI Development Mailing List Archives

Subject: Re: [OMPI devel] OpenMPI and R
From: Ralph Castain (rhc_at_[hidden])
Date: 2012-04-03 01:39:24


Looks like you didn't set your LD_LIBRARY_PATH to point to where Open MPI was
installed, so the individual component libraries couldn't be loaded. From the
output below, it looks like you need to add /usr/local/lib to your
LD_LIBRARY_PATH.
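
For reference, a minimal sketch of the fix (assuming the default /usr/local
prefix, which matches the "Found libmpi in /usr/local/lib" line in your
configure output; adjust if you configured with a different --prefix):

```shell
# Assumed install prefix; the log below shows libmpi in /usr/local/lib.
PREFIX=/usr/local

# Prepend the install's lib directory so the dynamic loader can resolve
# the opal_* symbols needed by the MCA plugins in $PREFIX/lib/openmpi.
export LD_LIBRARY_PATH="$PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

echo "$LD_LIBRARY_PATH"
```

Put the export in your shell startup file (e.g. ~/.bashrc) so it is set when
R spawns its load-test subprocess, then reinstall Rmpi so the "testing if
installed package can be loaded" step runs in the corrected environment.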

On Mon, Apr 2, 2012 at 7:26 PM, Benedict Holland <
benedict.m.holland_at_[hidden]> wrote:

> Hi All,
>
> I am on Ubuntu 11.10, where the only OpenMPI package available is 1.4.3.
> Since the latest release is 1.5.5, I compiled and installed 1.5.5 myself and
> then tried to build the Rmpi R package against it, but failed. My R version
> is 2.15 and the OpenMPI is 1.5.5; the output from the Rmpi build is below.
> For now I have fallen back to the distribution's base packages because they
> work and I need to get some work done, but I can always recompile OpenMPI,
> install it, and build Rmpi against it. Any ideas what I need to do? Also, is
> there an Ubuntu or Debian package available with the latest and greatest,
> and if not, how can I make one?
>
> Thanks,
> ~Ben
>
> checking for gcc... gcc -std=gnu99
> checking for C compiler default output file name... a.out
> checking whether the C compiler works... yes
> checking whether we are cross compiling... no
> checking for suffix of executables...
> checking for suffix of object files... o
> checking whether we are using the GNU C compiler... yes
> checking whether gcc -std=gnu99 accepts -g... yes
> checking for gcc -std=gnu99 option to accept ISO C89... none needed
> I am here /usr/local and it is OpenMPI
> Trying to find mpi.h ...
> Found in /usr/local/include
> Trying to find libmpi.so or libmpich.a ...
> Found libmpi in /usr/local/lib
> checking for openpty in -lutil... yes
> checking for main in -lpthread... yes
> configure: creating ./config.status
> config.status: creating src/Makevars
> ** Creating default NAMESPACE file
> ** libs
> gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -DPACKAGE_NAME=\"\"
> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
> -DPACKAGE_BUGREPORT=\"\" -I/usr/local/include -DMPI2 -DOPENMPI -fpic
> -O3 -pipe -g -c RegQuery.c -o RegQuery.o
> gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -DPACKAGE_NAME=\"\"
> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
> -DPACKAGE_BUGREPORT=\"\" -I/usr/local/include -DMPI2 -DOPENMPI -fpic
> -O3 -pipe -g -c Rmpi.c -o Rmpi.o
> gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -DPACKAGE_NAME=\"\"
> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
> -DPACKAGE_BUGREPORT=\"\" -I/usr/local/include -DMPI2 -DOPENMPI -fpic
> -O3 -pipe -g -c conversion.c -o conversion.o
> gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -DPACKAGE_NAME=\"\"
> -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\"
> -DPACKAGE_BUGREPORT=\"\" -I/usr/local/include -DMPI2 -DOPENMPI -fpic
> -O3 -pipe -g -c internal.c -o internal.o
> gcc -std=gnu99 -shared -o Rmpi.so RegQuery.o Rmpi.o conversion.o
> internal.o -L/usr/local/lib -lmpi -lutil -lpthread -L/usr/lib/R/lib -lR
> installing to /usr/local/lib/R/site-library/Rmpi/libs
> ** R
> ** demo
> ** inst
> ** preparing package for lazy loading
> ** help
> *** installing help indices
> ** building package indices
> ** testing if installed package can be loaded
> [ben-Inspiron-1764:26048] mca: base: component_find: unable to open
> /usr/local/lib/openmpi/mca_paffinity_hwloc:
> /usr/local/lib/openmpi/mca_paffinity_hwloc.so: undefined symbol:
> opal_hwloc_topology (ignored)
> [ben-Inspiron-1764:26048] mca: base: component_find: unable to open
> /usr/local/lib/openmpi/mca_carto_auto_detect:
> /usr/local/lib/openmpi/mca_carto_auto_detect.so: undefined symbol:
> opal_carto_base_graph_get_host_graph_fn (ignored)
> [ben-Inspiron-1764:26048] mca: base: component_find: unable to open
> /usr/local/lib/openmpi/mca_carto_file:
> /usr/local/lib/openmpi/mca_carto_file.so: undefined symbol:
> opal_carto_base_graph_get_host_graph_fn (ignored)
> [ben-Inspiron-1764:26048] mca: base: component_find: unable to open
> /usr/local/lib/openmpi/mca_shmem_mmap:
> /usr/local/lib/openmpi/mca_shmem_mmap.so: undefined symbol: opal_show_help
> (ignored)
> [ben-Inspiron-1764:26048] mca: base: component_find: unable to open
> /usr/local/lib/openmpi/mca_shmem_posix:
> /usr/local/lib/openmpi/mca_shmem_posix.so: undefined symbol:
> opal_shmem_base_output (ignored)
> [ben-Inspiron-1764:26048] mca: base: component_find: unable to open
> /usr/local/lib/openmpi/mca_shmem_sysv:
> /usr/local/lib/openmpi/mca_shmem_sysv.so: undefined symbol: opal_show_help
> (ignored)
> --------------------------------------------------------------------------
> It looks like opal_init failed for some reason; your parallel process is
> likely to abort. There are many reasons that a parallel process can
> fail during opal_init; some of which are due to configuration or
> environment problems. This failure appears to be an internal failure;
> here's some additional information (which may only be relevant to an
> Open MPI developer):
>
> opal_shmem_base_select failed
> --> Returned value -1 instead of OPAL_SUCCESS
> --------------------------------------------------------------------------
> [ben-Inspiron-1764:26048] [[INVALID],INVALID] ORTE_ERROR_LOG: Error in
> file runtime/orte_init.c at line 79
> --------------------------------------------------------------------------
> It looks like MPI_INIT failed for some reason; your parallel process is
> likely to abort. There are many reasons that a parallel process can
> fail during MPI_INIT; some of which are due to configuration or environment
> problems. This failure appears to be an internal failure; here's some
> additional information (which may only be relevant to an Open MPI
> developer):
>
> ompi_mpi_init: orte_init failed
> --> Returned "Error" (-1) instead of "Success" (0)
> --------------------------------------------------------------------------
> *** An error occurred in MPI_Init
> *** on a NULL communicator
> *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
> [ben-Inspiron-1764:26048] Local abort before MPI_INIT completed
> successfully; not able to aggregate error messages, and not able to
> guarantee that all other processes were killed!
>
>
> _______________________________________________
> devel mailing list
> devel_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/devel
>