
Open MPI User's Mailing List Archives


From: Brock Palen (brockp_at_[hidden])
Date: 2007-04-10 11:18:52


I remember trying to build HPL my first time as a student :-) It's
much simpler than you think, and it's also a great way to learn about
linking and makefiles (well, in my opinion).

See my comments below; from a quick look, the main thing is that you
should change g77 and gcc to mpicc.
Yes, mpicc calls gcc; you can see this by running

mpicc -showme

Thus you can omit the MPdir, MPinc, and MPlib settings (there is no
libmpich.a with Open MPI!). You can see the one I use with ACML on
our cluster; note that we use the PGI compilers.
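
For example, a minimal sketch of the MPI-related part of a Make.<arch>
file when the Open MPI wrapper compiler is used (variable names are
from the stock HPL template; adjust for your own setup):

```make
# With mpicc as the compiler and linker, the wrapper supplies the
# MPI include and library flags itself, so these can stay empty.
MPdir  =
MPinc  =
MPlib  =

CC     = mpicc
LINKER = mpicc
```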


Brock Palen
Center for Advanced Computing
brockp_at_[hidden]
(734)936-1985

On Apr 10, 2007, at 12:10 AM, snj78_at_[hidden] wrote:

> Hello all,
>
> I am trying to configure HPL on a beowulf cluster that I have put
> together for a senior project at Weber State University, and I am
> having a little bit of trouble. First of all, about the cluster:
>
> 4-node diskless cluster
> Fedora Core 6 - 64 bit version
> Intel Pentium D dual core processors
> MSI 965 motherboards
>
> Right now I have one node doing a net boot with syslinux and would
> like to do a benchmark before I put the rest of the nodes together,
> so I can have a performance comparison with subsequent nodes. I
> have installed the following packages on my system for HPL:
>
> openmpi-1.1-7.fc6.x86_64.rpm
> openmpi-devel-1.1-7.fc6.x86_64.rpm
> openmpi-libs-1.1-7.fc6.x86_64.rpm
> lapack-3.1.0-4.fc6.x86_64.rpm
> blas-3.1.0-4.fc6.x86_64.rpm
> atlas-4.6.0-11.fc6.x86_64.rpm
> cblas.tgz
> hpl.tgz
>
> I may have installed more packages than necessary but I didn't
> think it would hurt. Everything has installed successfully, but I
> can't get the Make.<arch> file right. I simply don't understand
> enough of it to build it correctly. I just keep getting 'Make.inc'
> errors. The makefile that I have attempted is below, called
> Make.Beowulf. I just used a generic makefile from the setups
> directory and attempted to supply some paths to the libraries, but
> to no avail. I have tried to find documentation explaining more
> clearly how everything should be set up, but nothing in layman's
> terms, hence the errors. A few questions:
>
> What should my arch be? Does that even matter? Does it have to be
> x86_64?
> I realize I have to supply paths to the BLAS and MPI headers and
> libraries but exactly which libraries and header files?
> The compiler I am using is mpicc which is just linked to gcc, but
> shouldn't that compiler supply the links to the correct libraries
> and header files?
> The MPlib parameter points to libmpich.a so I installed mpich2 but
> that didn't give me a libmpich.a directory so what should I use there?
> Also, I am not using a network file system, so am I correct in
> assuming that all of the libraries need to be on each of the
> nodes? If so, I need to know exactly where to put them, and again,
> I believe they would need to be put in the exact same location,
> so the problem is, which libraries and header files exactly? (as
> to save precious RAM on each of the nodes).
>
> I realize I may be asking a lot but the end of the semester is just
> around the corner. I appreciate any help that you may give me
> ahead of time. Thanks.
>
> Stephen Jenkins
> snj78_at_[hidden]
>
>
> Make.Beowulf
>
> SHELL = /bin/sh
>
> #
>
> CD = cd
>
> CP = cp
>
> LN_S = ln -s
>
> MKDIR = mkdir
>
> RM = /bin/rm -f
>
> TOUCH = touch
>
> # - Platform identifier ------------------------------------------------
>
> ARCH = Linux_x86_64
>
> # - HPL Directory Structure / HPL library ------------------------------
>
> TOPdir = $(HOME)/hpl
>
> INCdir = $(TOPdir)/include
>
> BINdir = $(TOPdir)/bin/$(ARCH)
>
> LIBdir = $(TOPdir)/lib/$(ARCH)
>
> #
>
> HPLlib = $(LIBdir)/libhpl.a
>
> # - Message Passing library (MPI) --------------------------------------
>
> MPdir = /usr/include/openmpi
>
> MPinc = -I$/usr/include/include
>
> MPlib = $(MPdir)/lib/libmpich.a
>
> # - Linear Algebra library (BLAS or VSIPL) -----------------------------
>
> LAdir = $(HOME)/netlib/ARCHIVES/Linux_PII
>
> LAinc =
>
> LAlib = $(LAdir)/libcblas.a $(LAdir)/libatlas.a
>
> # - F77 / C interface --------------------------------------------------
>
> F2CDEFS =
>
> # - HPL includes / libraries / specifics -------------------------------
>
> HPL_INCLUDES = -I$(INCdir) -I$(INCdir)/$(ARCH) $(LAinc) $(MPinc)
>
> HPL_LIBS = $(HPLlib) $(LAlib) $(MPlib)
>
> # - Compile time options -----------------------------------------------
>
> HPL_OPTS = -DHPL_CALL_CBLAS
>
> # ----------------------------------------------------------------------
>
> HPL_DEFS = $(F2CDEFS) $(HPL_OPTS) $(HPL_INCLUDES)
>
> # - Compilers / linkers - Optimization flags ---------------------------
>
> CC = /usr/bin/gcc
Change this to mpicc
> CCNOOPT = $(HPL_DEFS)
>
> CCFLAGS = $(HPL_DEFS) -fomit-frame-pointer -O3 -funroll-loops
>
> # On some platforms, it is necessary to use the Fortran linker to find
>
> # the Fortran internals used in the BLAS library.
>
> LINKER = /usr/bin/g77
Why g77? Also make this mpicc
> LINKFLAGS = $(CCFLAGS)
>
> #
>
> ARCHIVER = ar
>
> ARFLAGS = r
>
> RANLIB = echo
>
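
Once the Make file is sorted out, the build and a quick smoke test
look roughly like this (a sketch only; the arch suffix is whatever you
named your file, and the mpirun process count depends on your setup):

```shell
# Build HPL against Make.Linux_x86_64 in the top-level hpl directory
cd ~/hpl
make arch=Linux_x86_64

# Run the benchmark with the default HPL.dat placed in bin/<arch>
cd bin/Linux_x86_64
mpirun -np 2 ./xhpl
```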
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users