Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Install BLACS and ScaLAPACK on Leopard
From: Gregory John Orris (gregory.orris_at_[hidden])
Date: 2008-05-07 18:39:48


All,
I took this discussion offline since it's no longer relevant to Open MPI
directly, but since I started it: to get a truly 64-bit ATLAS lib under
Leopard, you need to configure it with
  -b 64 -Fa ac "-L/usr/local/lib/x86_64 -m64"
Doing otherwise will not produce a 64-bit library.
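
For concreteness, a minimal sketch of such a build (the directory names
are illustrative, not from the original mails):

  mkdir ATLAS/build && cd ATLAS/build
  ../configure -b 64 -Fa ac "-L/usr/local/lib/x86_64 -m64"
  make build
  # check that the archive really came out 64-bit:
  lipo -info lib/libatlas.a
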
'Nuf Said!
Greg

On May 7, 2008, at 6:20 PM, Linwei Wang wrote:

> Yeah, I built it... but I don't know why it is of the wrong
> architecture....
>
>
> On May 7, 2008, at 6:07 PM, Doug Reeder wrote:
>
>> Linwei,
>>
>> Did you build the liblapack.a file? It is of the wrong architecture.
>>
>> Doug Reeder
>> On May 7, 2008, at 2:58 PM, Linwei Wang wrote:
>>
>>> Hi, Doug
>>>
>>> I've checked the makefiles and made sure that the -m64 flag is used
>>> for all of the compilation steps,
>>> but the error still exists...
>>>
>>>
>>> Linwei
>>>
>>> On May 7, 2008, at 5:33 PM, Doug Reeder wrote:
>>>
>>>> Linwei,
>>>>
>>>> It looks like you are getting a mix of 32- and 64-bit code (hence
>>>> the 'file is not of required architecture' error). Are you using the
>>>> command-line flag -m64 for some parts of the build and not for
>>>> others? You need to use either -m32 or -m64 for all of the builds.
>>>>
>>>> Doug Reeder
>>>> On May 7, 2008, at 2:25 PM, Linwei Wang wrote:
>>>>
>>>>> Dear sir,
>>>>>
>>>>> Thanks very much for your detailed guideline~
>>>>> I'm now trying to follow it~
>>>>> I've installed gcc 4.3 & Open MPI~
>>>>> When compiling CLAPACK, I'm trying to use the optimized BLAS
>>>>> library from ATLAS, so I set BLASLIB in make.inc as:
>>>>> BLASLIB = ../../libcblaswr.a -lcblas -latlas
>>>>> and then built the libraries (before that, I had also built the f2c
>>>>> library, following the guideline on netlib).
>>>>> That went well, but when I tried to build the BLAS testing code, it
>>>>> generated "undefined symbols" errors.
>>>>> It looks like those symbols should be in the f2c library, but I
>>>>> already built it....
>>>>> "gcc sblat2.o \
>>>>> ../../F2CLIBS/libf2c.a -lm -o ../xblat2s
>>>>> Undefined symbols:
>>>>> "_f2c_ssbmv", referenced from:
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schk2_ in sblat2.o
>>>>> "_f2c_sgbmv", referenced from:
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schke_ in sblat2.o
>>>>> _schk1_ in sblat2.o
>>>>> ......."
>>>>>
>>>>> On the other hand, when compiling ATLAS, I did the configure as
>>>>> you said, and "make build" went well.
>>>>> But when I tried "make check" for testing, it again gave
>>>>> "undefined symbols" errors...
>>>>>
>>>>> "d: warning in /Users/maomaowlw/ATLAS/build/lib/liblapack.a, file
>>>>> is
>>>>> not of required architecture
>>>>> Undefined symbols:
>>>>> "_ATL_slauum", referenced from:
>>>>> _test_inv in sinvtst.o
>>>>> "_ATL_strtri", referenced from:
>>>>> _test_inv in sinvtst.o
>>>>> "_ATL_spotrf", referenced from:
>>>>> _test_inv in sinvtst.o
>>>>> "_ATL_sgetrf", referenced from:
>>>>> _test_inv in sinvtst.o
>>>>> "_ATL_sgetri", referenced from:
>>>>> _test_inv in sinvtst.o
>>>>> "
>>>>>
>>>>> I'm not sure where the problem is. Can you provide any help?
>>>>>
>>>>> Thanks again!
>>>>>
>>>>> Linwei
>>>>>
>>>>>
>>>>> On May 6, 2008, at 11:11 AM, Gregory John Orris wrote:
>>>>>
>>>>>> Points to clarify, if I may, having gone through this relatively
>>>>>> recently:
>>>>>> g77 and gfortran are NOT one and the same.
>>>>>> gfortran from SourceForge works well, but it is based on GNU GCC 4.3
>>>>>> and not on the GCC 4.0.1 that comes with Leopard.
>>>>>> Your best bet is to download the ENTIRE GCC package from SourceForge
>>>>>> and install it into /usr/local. This includes gcc, g++, and gfortran.
>>>>>>
>>>>>> Then you will need to do a number of things to actually get a
>>>>>> reliable set of packages all compiled with the same version of
>>>>>> GCC 4.3. Why? Because 4.3 seems to be notably faster, and I had a
>>>>>> lot of problems integrating the 4.0.1 libs with the 4.3 libs
>>>>>> without errors.
>>>>>> 1. Download CLAPACK-3.1.1 from netlib and compile it.
>>>>>> 2. Download ATLAS 3.8 from SourceForge (netlib is a little behind
>>>>>> here) and configure it with --with-netlib-lapack= pointing at the
>>>>>> LAPACK you just compiled from CLAPACK (see the sketch after this
>>>>>> list).
>>>>>> 3. Download Open MPI 1.2.6 and install it as well, so that Open MPI
>>>>>> uses the new gfortran rather than anything that shipped with
>>>>>> Leopard (which includes no Fortran compiler).
>>>>>> 4. NOW you can compile BLACS and ScaLAPACK.
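>>>>>>
>>>>>> (A sketch of what step 2's configure can look like; treat the paths
>>>>>> as illustrative, and note that the name of the CLAPACK archive
>>>>>> depends on the PLAT setting in your make.inc:
>>>>>>
>>>>>> mkdir ATLAS/build && cd ATLAS/build
>>>>>> ../configure -b 64 -Fa ac "-m64 -mtune=core2" \
>>>>>>     --with-netlib-lapack=/path/to/CLAPACK/lapack_LINUX.a
>>>>>> make build
>>>>>> make check)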
>>>>>>
>>>>>> In all of this you will need to do a couple of additional things,
>>>>>> like setting these environment variables:
>>>>>> setenv LDFLAGS "-L/usr/local/lib/x86_64"
>>>>>> setenv DYLD_LIBRARY_PATH "your openmpi path"
>>>>>> setenv LD_LIBRARY_PATH "your openmpi path"
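>>>>>>
>>>>>> (Those are csh-style; if your shell is bash, the equivalent is
>>>>>> export LDFLAGS="-L/usr/local/lib/x86_64", and likewise for the two
>>>>>> library-path variables.)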
>>>>>>
>>>>>> Do all this right, and make sure you compile with the
>>>>>> -m64 -mtune=core2 flags, and you will be golden.
>>>>>>
>>>>>> So what will you have?
>>>>>> A new cblas, atlas, lapack, openmpi, fortran, c, c++, blacs, and
>>>>>> scalapack,
>>>>>> all built with the same version of GCC.
>>>>>>
>>>>>> Alternatively, you can buy and use the Intel compiler. It is
>>>>>> significantly faster than gfortran, but it has a host of other
>>>>>> problems associated with it.
>>>>>> But if you follow the outline above, you will be left with the best
>>>>>> that's freely available. I have lots more info on this, but time is
>>>>>> short.
>>>>>>
>>>>>> FINALLY, and this is important: DO NOT FORGET ABOUT THE small
>>>>>> default STACK size on Macs when using gfortran. It's so small that
>>>>>> it's useless for large parallel jobs.
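>>>>>>
>>>>>> (One common workaround, sketched here rather than taken from this
>>>>>> thread: the main thread's stack size on Mac OS X can be raised at
>>>>>> link time with Apple's ld, e.g.
>>>>>>
>>>>>> gfortran -m64 -Wl,-stack_size,0x40000000 myprog.f -o myprog
>>>>>>
>>>>>> which requests a 1 GB stack; the value must be a hex multiple of
>>>>>> the page size. Raising the shell's stacksize limit helps too, up
>>>>>> to the hard limit.)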
>>>>>>
>>>>>>
>>>>>> On May 6, 2008, at 10:09 AM, Jeff Squyres wrote:
>>>>>>
>>>>>>> FWIW, I'm not a fortran expert, but if you built your Fortran
>>>>>>> libraries with g77 and then tried to link against them with
>>>>>>> gfortran,
>>>>>>> you might run into problems.
>>>>>>>
>>>>>>> My advice would be to use a single fortran compiler for building
>>>>>>> everything: Open MPI, your libraries, your apps. I prefer
>>>>>>> gfortran
>>>>>>> because it's more modern, but I have not done any performance
>>>>>>> evaluations of gfortran vs. g77 -- I have heard [unverified]
>>>>>>> anecdotes
>>>>>>> that gfortran is "slower" than g77 -- google around and see what
>>>>>>> the
>>>>>>> recent buzz is.
>>>>>>>
>>>>>>> FWIW: I tend to use the GNU suite from http://hpc.sourceforge.net/
>>>>>>> -- it contains pre-built gcc/g++/gfortran binaries and libraries
>>>>>>> for Leopard.
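>>>>>>>
>>>>>>> (Those tarballs are meant to be unpacked from the filesystem root
>>>>>>> so that everything lands under /usr/local; the archive name below
>>>>>>> is illustrative:
>>>>>>>
>>>>>>> sudo tar -xzf gcc-bin.tar.gz -C /
>>>>>>>
>>>>>>> Then make sure /usr/local/bin comes before /usr/bin on your PATH.)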
>>>>>>>
>>>>>>>
>>>>>>> On May 5, 2008, at 2:59 PM, Linwei Wang wrote:
>>>>>>>
>>>>>>>> Dear Reeder,
>>>>>>>>
>>>>>>>> It does not work. I do think they are from the Fortran programs
>>>>>>>> I'm using (they are files included in the BLACS installation
>>>>>>>> package, not written by me).
>>>>>>>>
>>>>>>>> The thing is, last time, when I was using g77, this caused no
>>>>>>>> problem...
>>>>>>>>
>>>>>>>> thanks for your help.
>>>>>>>>
>>>>>>>> Linwei.
>>>>>>>>
>>>>>>>> On May 5, 2008, at 2:33 PM, Doug Reeder wrote:
>>>>>>>>
>>>>>>>>> _s_wsle, _e_wsle
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Jeff Squyres
>>>>>>> Cisco Systems
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>