You might want to look through the FAQ for our recommended ways to
build BLACS, etc. If you have a mismatch such that those libraries
did not build properly -- or perhaps built against a different MPI
implementation -- Bad Things can happen.
Are there any test applications with those libraries to verify that
they compiled / installed / can run properly?
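For what it is worth, a minimal BLACS "hello world" is often enough to check that the libraries and the MPI underneath them agree. The sketch below assumes the standard Fortran BLACS interface (blacs_pinfo, blacs_get, blacs_gridinit, etc.); the compile line in the comments is only a placeholder and depends on how ScaLAPACK/BLACS were actually installed. If this links against the same Open MPI and runs under mpirun without dying, the BLACS/MPI pairing is at least basically sane.

    ! Minimal BLACS smoke test (a sketch, assuming the standard Fortran
    ! BLACS interface).  Compile with the SAME mpif90 wrapper that was
    ! used to build BLACS/ScaLAPACK, e.g. (paths/names are placeholders):
    !   mpif90 blacs_hello.f90 -o blacs_hello \
    !       /path/to/blacsF77init_MPI-LINUX-0.a \
    !       /path/to/blacs_MPI-LINUX-0.a \
    !       /path/to/blacsF77init_MPI-LINUX-0.a
    !   mpirun -np 4 ./blacs_hello
    program blacs_hello
      implicit none
      integer :: iam, nprocs, ictxt, nprow, npcol, myrow, mycol

      ! Who am I, and how many processes does BLACS see?
      call blacs_pinfo(iam, nprocs)

      ! Get the default system context and build a 1 x nprocs grid.
      call blacs_get(-1, 0, ictxt)
      nprow = 1
      npcol = nprocs
      call blacs_gridinit(ictxt, 'Row', nprow, npcol)
      call blacs_gridinfo(ictxt, nprow, npcol, myrow, mycol)

      print *, 'Hello from process', iam, 'of', nprocs, &
               'at grid position (', myrow, ',', mycol, ')'

      ! Tear down the grid and shut BLACS (and MPI) down cleanly.
      call blacs_gridexit(ictxt)
      call blacs_exit(0)
    end program blacs_hello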
On Feb 7, 2009, at 1:01 PM, Hana Milani wrote:
> > Are you able to run *any* MPI applications (especially those
> > with Fortran) in parallel? E.g., the hello world and the ring
> > programs in the examples/ subdirectory in the OMPI distribution?
> I am running another code, which does not need ScaLAPACK and BLACS,
> directly with Open MPI; it is written in Fortran as well. That
> parallel run is going along happily.
> The "hello world" and ring programs in the examples are also
> working fine!
> As you can see in my previous email, the code has an "arch.make"
> file in which I have to specify the MPI path plus the ScaLAPACK,
> BLACS, LAPACK, and BLAS library paths. Could this "killing"
> originate from that file? The ScaLAPACK installer completed
> correctly, its tests run fine, and Open MPI is functioning.
> I had enclosed arch.make in my previous email.
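For reference, the part of such an arch.make that matters here usually boils down to something like the fragment below. Every path and library name in it is a placeholder, not the poster's actual values; the point it illustrates is that the mpif90 wrapper and the ScaLAPACK/BLACS libraries must all refer to the same Open MPI installation, otherwise the resulting binary can be killed at startup even though each piece tested fine on its own.

    # Hypothetical arch.make fragment -- all paths/names are placeholders,
    # shown only to illustrate the consistency requirement.
    # FC must be the mpif90 from the SAME Open MPI used to build
    # ScaLAPACK and BLACS.
    FC       = /opt/openmpi/bin/mpif90
    FFLAGS   = -O2

    BLAS_LIBS      = -L/opt/libs -lblas
    LAPACK_LIBS    = -L/opt/libs -llapack
    BLACS_LIBS     = /opt/libs/blacsF77init_MPI-LINUX-0.a \
                     /opt/libs/blacs_MPI-LINUX-0.a \
                     /opt/libs/blacsF77init_MPI-LINUX-0.a
    SCALAPACK_LIBS = /opt/libs/libscalapack.a

    # Conventional link order: ScaLAPACK, then BLACS, then LAPACK/BLAS.
    LIBS = $(SCALAPACK_LIBS) $(BLACS_LIBS) $(LAPACK_LIBS) $(BLAS_LIBS)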