Look around a little in those areas - I can't pretend to understand where you put them, or if there are copy/paste errors into this thread. But obviously OMPI -thinks- the libs are somewhere in there.
On Apr 6, 2011, at 2:18 PM, Nehemiah Dacres wrote:
[jian@therock lib]$ ls lib64/*.a
lib64/libotf.a  lib64/libvt.fmpi.a  lib64/libvt.omp.a
lib64/libvt.a   lib64/libvt.mpi.a   lib64/libvt.ompi.a

last time I linked one of those files it told me they were in the wrong format. These are in archive format; what format should they be in?
because that doesn't exist.

/opt/SUNWhpc-O/HPC8.2.1c/sun/lib/libotf.a
/opt/SUNWhpc-O/HPC8.2.1c/sun/lib/libvt.fmpi.a
/opt/SUNWhpc-O/HPC8.2.1c/sun/lib/libvt.omp.a
/opt/SUNWhpc-O/HPC8.2.1c/sun/lib/libvt.a
/opt/SUNWhpc-O/HPC8.2.1c/sun/lib/libvt.mpi.a
/opt/SUNWhpc-O/HPC8.2.1c/sun/lib/libvt.ompi.a
is what I have for listing *.a in the lib directory. None of those are equivalent because they are all linked with VampirTrace, if I am reading the names right. I've already tried putting /opt/SUNWhpc-O/HPC8.2.1c/sun/lib/libvt.mpi.a for this and it didn't work, giving errors like
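(Not part of the original thread: one way to diagnose "wrong format" link errors, assuming GNU binutils are installed. `file` confirms the file is a static archive, and `objdump -a` reports the object format of each member, which exposes the usual causes such as a 32-bit vs 64-bit or wrong-architecture mismatch. The libvt.mpi.a path is simply the one from the listing above.)

```shell
# Confirm the library is a static archive; expect "current ar archive".
file lib64/libvt.mpi.a

# Show the file format of each member (e.g. elf64-x86-64). A linker
# "wrong format" error usually means this does not match the format
# the compiler is targeting.
objdump -a lib64/libvt.mpi.a | head -20
```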
Something looks fishy about your numbers. The first two sets of numbers look the same, and the last set does look better for the most part. Your mpirun command line looks weird to me with the "-mca orte_base_help_aggregate btl,openib,self," -- did something get chopped off with the text copy? You should have had a "-mca btl openib,self". Can you do a run with "-mca btl tcp,self", it should
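(For reference, a sketch of what the two suggested invocations would look like; the machinefile name "list" and the binary name "sunMpiStress" are carried over from the original post, and the exact flags may need adjusting for your installation.)

```shell
# Select the BTLs explicitly with the "btl" MCA parameter
# (not "orte_base_help_aggregate"):
mpirun -mca btl openib,self -machinefile list sunMpiStress   # over InfiniBand
mpirun -mca btl tcp,self    -machinefile list sunMpiStress   # over TCP, for comparison
```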
I really wouldn't have expected another compiler over IB to be that
dramatically lower performing.
On 04/06/2011 12:40 PM, Nehemiah Dacres wrote:
Also, I'm not sure if I'm reading the results right. According to the last run, did using the Sun compilers (update 1) result in higher performance with sunct?
some tests I did. I hope this isn't an abuse of the list; please tell me if it is, but thanks to all those who helped me. This goes to show that the Sun MPI works with programs not compiled with Sun's compilers. This first test was run as a base case to see if MPI works; the second run is to see the speedup that using OpenIB provides.

[jian@therock ~]$ mpirun /opt/iba/src/mpi_apps/mpi_stress/mpi_stress
Start mpi_stress at Wed Apr 6
Iteration 0 : errors = 0, total = 0 (511 secs, Wed Apr 6 11:15:37 2011)
After 1 iteration(s), 8 mins and 31 secs, total errors = 0

compiled with the Sun compilers, I think:

[jian@therock ~]$ mpirun -mca orte_base_help_aggregate btl,openib,self, -machinefile list sunMpiStress
Start mpi_stress at Wed Apr 6