Subject: [OMPI users] hello_f90.f90(17): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_INIT]
From: 钟成 (ggdhzdx_at_[hidden])
Date: 2012-09-09 00:01:26


Hi,

I want to install a 64-bit version of Open MPI on a cluster. The cluster already has a 32-bit openmpi-1.4.3 installed, and I am not the administrator, so I cannot uninstall the previous version. Instead I installed the 64-bit Open MPI in my home directory.
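For reference, the build sequence was roughly the following (a sketch; the exact configure lines I tried are listed near the end of this mail):
****************************************************************
tar xjf openmpi-1.4.5.tar.bz2
cd openmpi-1.4.5
./configure --prefix=/dawnfs/users/zhongc/openmpi-1.4.5 \
            CXX=icpc CC=icc F77=ifort FC=ifort FFLAGS=-i8 FCFLAGS=-i8
make all
make install
****************************************************************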
The configure and make steps seemed to finish without errors, but when I set the environment variables and use the newly installed Open MPI (versions 1.4.5, 1.6.1, and 1.5.5 tested) to compile and run the example files, I get the following error:




**********************************************************************************************
zhongc_at_node100:~/openmpi-1.4.5-install/examples> make
mpicc -g hello_c.c -o hello_c
mpicc -g ring_c.c -o ring_c
mpicc -g connectivity_c.c -o connectivity_c
make[1]: Entering directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
mpic++ -g hello_cxx.cc -o hello_cxx
mpic++ -g ring_cxx.cc -o ring_cxx
make[1]: Leaving directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
make[1]: Entering directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
mpif77 -g hello_f77.f -o hello_f77
mpif77 -g ring_f77.f -o ring_f77
make[1]: Leaving directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
make[1]: Entering directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
mpif90 -g hello_f90.f90 -o hello_f90
hello_f90.f90(17): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_INIT]
    call MPI_INIT(ierr)
---------^
hello_f90.f90(18): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_COMM_RANK]
    call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
---------^
hello_f90.f90(19): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_COMM_SIZE]
    call MPI_COMM_SIZE(MPI_COMM_WORLD, size, ierr)
---------^
hello_f90.f90(21): error #6285: There is no matching specific subroutine for this generic subroutine call. [MPI_FINALIZE]
    call MPI_FINALIZE(ierr)
---------^
compilation aborted for hello_f90.f90 (code 1)
make[1]: *** [hello_f90] Error 1
make[1]: Leaving directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
make: *** [all] Error 2



*****************************************************************************************************




And if I try to run the compiled programs, I get the following result:
****************************************************************************************************
zhongc_at_node100:~/openmpi-1.4.5-install/examples> mpirun -np 4 hello_f77
 Hello, world, I am 0 of 4
 Hello, world, I am 0 of 4
 Hello, world, I am 0 of 4
 Hello, world, I am 0 of 4
 zhongc_at_node100:~/openmpi-1.4.5-install/examples> mpirun -np 4 ring_c
Process 0 sending 10 to 1, tag 201 (4 processes in ring)
Process 0 sent to 1
Process 0 decremented value: 9
Process 0 decremented value: 8
Process 0 decremented value: 7
Process 0 decremented value: 6
Process 0 decremented value: 5
Process 0 decremented value: 4
Process 0 decremented value: 3
Process 0 decremented value: 2
Process 0 decremented value: 1
Process 0 decremented value: 0
Process 0 exiting
Process 1 exiting
Process 2 exiting
Process 3 exiting
*************************************************************************************************
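Note that all four ranks report "I am 0 of 4", as if each process ran on its own. A quick way to check which installation is actually being picked up is something like this (a sketch; --showme is the Open MPI wrapper option that prints the underlying compile line):
****************************************************************
which mpirun mpif90          # should point into my home-directory install
mpif90 --showme              # print the real compiler and flags the wrapper uses
ldd hello_f77 | grep mpi     # confirm which libmpi the binary is linked against
****************************************************************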


The following environment variables are set in my ~/.bashrc:
****************************************************************
export MPI_ROOT=/dawnfs/users/zhongc/openmpi-1.4.5
export LD_LIBRARY_PATH=/dawnfs/users/zhongc/openmpi-1.4.5/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/dawnfs/users/zhongc/openmpi-1.4.5:$LD_LIBRARY_PATH
export OMPI_MPIF77="ifort"
export OMPI_MPIFC="ifort"
export OMPI_MPICC="icc"
export OMPI_MPICXX="icc"
export PATH=$MPI_ROOT/bin:$PATH

******************************************************************
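A quick sanity check that these settings actually take effect in a new shell (a sketch):
****************************************************************
echo $PATH | tr ':' '\n' | head -3              # my install's bin should come first
echo $LD_LIBRARY_PATH | tr ':' '\n' | head -3   # and its lib directory likewise
****************************************************************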


If I use the Open MPI that was originally installed on this cluster, everything is fine:
***********************************************************************************
zhongc_at_node100:~/openmpi-1.4.5-install/examples> which mpirun
/dawnfs/software/mpi/openmpi1.4.3-intel/bin/mpirun
zhongc_at_node100:~/openmpi-1.4.5-install/examples> make clean
rm -f hello_c hello_cxx hello_f77 hello_f90 ring_c ring_cxx ring_f77 ring_f90 connectivity_c *~ *.o
zhongc_at_node100:~/openmpi-1.4.5-install/examples> make
mpicc -g hello_c.c -o hello_c
mpicc -g ring_c.c -o ring_c
mpicc -g connectivity_c.c -o connectivity_c
make[1]: Entering directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
mpic++ -g hello_cxx.cc -o hello_cxx
mpic++ -g ring_cxx.cc -o ring_cxx
make[1]: Leaving directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
make[1]: Entering directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
mpif77 -g hello_f77.f -o hello_f77
mpif77 -g ring_f77.f -o ring_f77
make[1]: Leaving directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
make[1]: Entering directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
mpif90 -g hello_f90.f90 -o hello_f90
mpif90 -g ring_f90.f90 -o ring_f90
make[1]: Leaving directory `/dawnfs/users/zhongc/openmpi-1.4.5-install/examples'
zhongc_at_node100:~/openmpi-1.4.5-install/examples> mpirun -np 4 hello_f77
 Hello, world, I am 1 of 4
 Hello, world, I am 0 of 4
 Hello, world, I am 2 of 4
 Hello, world, I am 3 of 4
zhongc_at_node100:~/openmpi-1.4.5-install/examples> mpirun -np 4 ring_c
Process 0 sending 10 to 1, tag 201 (4 processes in ring)
Process 0 sent to 1
Process 1 exiting
Process 2 exiting
Process 3 exiting
Process 0 decremented value: 9
Process 0 decremented value: 8
Process 0 decremented value: 7
Process 0 decremented value: 6
Process 0 decremented value: 5
Process 0 decremented value: 4
Process 0 decremented value: 3
Process 0 decremented value: 2
Process 0 decremented value: 1
Process 0 decremented value: 0
Process 0 exiting

********************************************************************************


Versions I have tried: 1.4.5, 1.6.1, 1.5.5.
Here are the configure lines I have tried:
**************************************************
--prefix=/dawnfs/users/zhongc/openmpi-1.4.5 CXX=icpc CC=icc F77=ifort FC=ifort FFLAGS=-i8 FCFLAGS=-i8
--prefix=/dawnfs/users/zhongc/openmpi-1.4.5 CXX=icpc CC=icc F77=ifort FC=ifort FFLAGS=-i8 FCFLAGS=-i8 --without-tm --without-lsf
--prefix=/dawnfs/users/zhongc/openmpi-1.4.5 CXX=icpc CC=icc F77=ifort FC=ifort FFLAGS=-i8 FCFLAGS=-i8 --enable-static --disable-shared --with-mpi-f90-size=large
--prefix=/dawnfs/users/zhongc/openmpi-1.4.5 CXX=g++ CC=gcc F77=gfortran FC=gfortran FFLAGS="-m64 -fdefault-integer-8" FCFLAGS="-m64 -fdefault-integer-8" CFLAGS=-m64 CXXFLAGS=-m64

*********************************************************
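One thing I am unsure about: since the library was configured with FFLAGS=-i8 and FCFLAGS=-i8, the interfaces in the Fortran mpi module presumably expect 8-byte default integers, so perhaps the f90 example has to be compiled with the same flag (just a guess on my part):
****************************************************************
mpif90 -g -i8 hello_f90.f90 -o hello_f90
****************************************************************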


The gcc and icc versions I have used:
**********************************************************************************
zhongc_at_node100:~/openmpi-1.4.5-install> gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/dawnfs/users/zhongc/gcc-4.7.0/libexec/gcc/x86_64-unknown-linux-gnu/4.7.0/lto-wrapper
Target: x86_64-unknown-linux-gnu
Configured with: ../configure --prefix=/dawnfs/users/zhongc/gcc-4.7.0
Thread model: posix
gcc version 4.7.0 (GCC)
zhongc_at_node100:~/openmpi-1.4.5-install> icc -V
Intel(R) C Intel(R) 64 Compiler XE for applications running on Intel(R) 64, Version 12.0.3.174 Build 20110309
Copyright (C) 1985-2011 Intel Corporation. All rights reserved.
FOR NON-COMMERCIAL USE ONLY
******************************************************************************



The ompi_info --all output of the system Open MPI (saved as old-ompi-info), the output from my Open MPI (saved as new-ompi-info), and config.log are in the attachment.


This has tortured me for several days. Can anybody help me?
