
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] Windows CMake build problems ...
From: Shiqing Fan (fan_at_[hidden])
Date: 2010-01-26 03:57:35


Hi Charlie,

Did you run the mpicc command from a Visual Studio Command Prompt? Can
you call cl.exe directly from there? It sounds like the VS compiler was
not found.
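
For example, from a "Visual Studio Command Prompt" (or a plain cmd.exe
after loading the VS build environment by hand), something along these
lines should work; the vcvarsall.bat path below is only illustrative and
depends on your Visual Studio version and install location:

   C:\> "C:\Program Files\Microsoft Visual Studio 9.0\VC\vcvarsall.bat" x86
   C:\> cl
   C:\> cd C:\prog\mon\examples
   C:\prog\mon\examples> mpicc hello_c.c

If the bare "cl" command prints the compiler banner, the wrapper should
be able to find the compiler as well.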

Regards,
Shiqing

cjohnson_at_[hidden] wrote:
> The mpicc, mpic++ and mpicxx wrapper compilers apparently don't work,
> even though the other commands (ompi_info, mpirun) do:
>
>
> C:\prog\mon\examples>ompi_info
> Package: Open MPI Charles Johnson_at_WORK Distribution
> Open MPI: 1.4
> Open MPI SVN revision: r22285
> Open MPI release date: Dec 08, 2009
> Open RTE: 1.4
> Open RTE SVN revision: r22285
> Open RTE release date: Dec 08, 2009
> OPAL: 1.4
> OPAL SVN revision: r22285
> OPAL release date: Dec 08, 2009
> Ident string: 1.4
> Prefix: C:\Program Files\OpenMPI_v1.4-win32
> Configured architecture: x86 Windows-6.1
> Configure host: WORK
> Configured by: Charles Johnson
> Configured on: 02:27 AM Sun 01/24/2010
> Configure host: WORK
> Built by: Charles Johnson
> Built on: 02:27 AM Sun 01/24/2010
> Built host: WORK
> C bindings: yes
> C++ bindings: yes
> Fortran77 bindings: no
> Fortran90 bindings: no
> Fortran90 bindings size: na
> C compiler: cl
> C compiler absolute: cl
> C++ compiler: cl
> C++ compiler absolute: cl
> Fortran77 compiler: CMAKE_Fortran_COMPILER-NOTFOUND
> Fortran77 compiler abs: none
> Fortran90 compiler:
> Fortran90 compiler abs: none
> C profiling: yes
> C++ profiling: yes
> Fortran77 profiling: no
> Fortran90 profiling: no
> C++ exceptions: no
> Thread support: no
> Sparse Groups: no
> Internal debug support: no
> MPI parameter check: runtime
> Memory profiling support: no
> Memory debugging support: no
> libltdl support: no
> Heterogeneous support: no
> mpirun default --prefix: yes
> MPI I/O support: yes
> MPI_WTIME support: gettimeofday
> Symbol visibility support: yes
> FT Checkpoint support: yes (checkpoint thread: no)
> MCA backtrace: none (MCA v2.0, API v2.0, Component v1.4)
> MCA paffinity: windows (MCA v2.0, API v2.0, Component v1.4)
> MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.4)
> MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.4)
> MCA timer: windows (MCA v2.0, API v2.0, Component v1.4)
> MCA installdirs: windows (MCA v2.0, API v2.0, Component v1.4)
> MCA installdirs: env (MCA v2.0, API v2.0, Component v1.4)
> MCA installdirs: config (MCA v2.0, API v2.0, Component v1.4)
> MCA crs: none (MCA v2.0, API v2.0, Component v1.4)
> MCA dpm: orte (MCA v2.0, API v2.0, Component v1.4)
> MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.4)
> MCA allocator: basic (MCA v2.0, API v2.0, Component v1.4)
> MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.4)
> MCA coll: basic (MCA v2.0, API v2.0, Component v1.4)
> MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.4)
> MCA coll: self (MCA v2.0, API v2.0, Component v1.4)
> MCA coll: sm (MCA v2.0, API v2.0, Component v1.4)
> MCA coll: sync (MCA v2.0, API v2.0, Component v1.4)
> MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.4)
> MCA mpool: sm (MCA v2.0, API v2.0, Component v1.4)
> MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.4)
> MCA bml: r2 (MCA v2.0, API v2.0, Component v1.4)
> MCA btl: self (MCA v2.0, API v2.0, Component v1.4)
> MCA btl: sm (MCA v2.0, API v2.0, Component v1.4)
> MCA btl: tcp (MCA v2.0, API v2.0, Component v1.4)
> MCA topo: unity (MCA v2.0, API v2.0, Component v1.4)
> MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.4)
> MCA osc: rdma (MCA v2.0, API v2.0, Component v1.4)
> MCA iof: hnp (MCA v2.0, API v2.0, Component v1.4)
> MCA iof: orted (MCA v2.0, API v2.0, Component v1.4)
> MCA iof: tool (MCA v2.0, API v2.0, Component v1.4)
> MCA oob: tcp (MCA v2.0, API v2.0, Component v1.4)
> MCA odls: process (MCA v2.0, API v2.0, Component v1.4)
> MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.4)
> MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.4)
> MCA rml: ftrm (MCA v2.0, API v2.0, Component v1.4)
> MCA rml: oob (MCA v2.0, API v2.0, Component v1.4)
> MCA routed: binomial (MCA v2.0, API v2.0, Component v1.4)
> MCA routed: linear (MCA v2.0, API v2.0, Component v1.4)
> MCA plm: process (MCA v2.0, API v2.0, Component v1.4)
> MCA errmgr: default (MCA v2.0, API v2.0, Component v1.4)
> MCA ess: env (MCA v2.0, API v2.0, Component v1.4)
> MCA ess: hnp (MCA v2.0, API v2.0, Component v1.4)
> MCA ess: singleton (MCA v2.0, API v2.0, Component v1.4)
> MCA grpcomm: basic (MCA v2.0, API v2.0, Component v1.4)
>
> C:\prog\mon\examples>mpirun --help
> mpirun (Open MPI) 1.4
>
> Usage: mpirun [OPTION]... [PROGRAM]...
> Start the given program using Open RTE
>
> -am <arg0> Aggregate MCA parameter set file list
> --app <arg0> Provide an appfile; ignore all other command line
> options
> -bind-to-board|--bind-to-board
> Whether to bind processes to specific boards
> (meaningless on 1 board/node)
> -bind-to-core|--bind-to-core
> Whether to bind processes to specific cores (the
> default)
> -bind-to-none|--bind-to-none
> Do not bind processes to cores or sockets
> -bind-to-socket|--bind-to-socket
> Whether to bind processes to sockets
> -byboard|--byboard Whether to assign processes round-robin by board
> (equivalent to bynode if only 1 board/node)
> -bycore|--bycore Alias for byslot
> -bynode|--bynode Whether to assign processes round-robin by node
> -byslot|--byslot Whether to assign processes round-robin by slot
> (the default)
> -bysocket|--bysocket Whether to assign processes round-robin by socket
> -c|-np|--np <arg0> Number of processes to run
> -cf|--cartofile <arg0>
> Provide a cartography file ..........
>
> ... and so on. But when I run mpicc, nothing good happens:
>
> C:\prog\mon\examples>mpicc hello_c.c
> --------------------------------------------------------------------------
> Sorry! You were supposed to get help about:
> no-compiler-found
> But I couldn't open the help file:
> C:\Program Files\OpenMPI_v1.4-win32\share\openmpi\help-opal-wrapper.txt:
> No such file or directory. Sorry!
> --------------------------------------------------------------------------
>
> Looks like the wiring to the VC++ compiler is disconnected.
>
> Charlie ...
>

-- 
--------------------------------------------------------------
Shiqing Fan                          http://www.hlrs.de/people/fan
High Performance Computing           Tel.: +49 711 685 87234
  Center Stuttgart (HLRS)            Fax.: +49 711 685 65832
Address: Allmandring 30              email: fan_at_[hidden]
70569 Stuttgart