
Open MPI Development Mailing List Archives


From: George Bosilca (bosilca_at_[hidden])
Date: 2006-01-15 00:42:03


I get the same error on a 32-bit architecture if I use the tuned
collectives.

mpirun -np 4 -mca btl tcp,self -mca coll tuned,basic src/MPI_Allreduce_user_c

dumps tons of errors like:

MPITEST error (0): i=64, int value=4, expected 1
...

If I disable the tuned module:

> mpirun -np 4 -mca btl tcp,self -mca coll basic src/MPI_Allreduce_user_c
MPITEST info (0): Starting MPI_Allreduce_user() test
MPITEST_results: MPI_Allreduce_user() all tests PASSED (7076)

the test passes without errors.

    george.

On Sat, 14 Jan 2006, David Daniel wrote:

> Hi Graham,
>
> On Jan 14, 2006, at 2:07 PM, Graham E Fagg wrote:
> > Hi all,
> > whatever this fixed/changed, I no longer get corrupted memory in the
> > tuned data segment hung off each communicator... ! I'm still testing
> > to see if I get TimPs error. G
> >
> > On Sat, 14 Jan 2006 bosilca_at_[hidden] wrote:
> >
> >> Author: bosilca
> >> Date: 2006-01-14 15:21:44 -0500 (Sat, 14 Jan 2006)
> >> New Revision: 8692
> >>
> >> Modified:
> >> trunk/ompi/mca/btl/tcp/btl_tcp_endpoint.c
> >> trunk/ompi/mca/btl/tcp/btl_tcp_endpoint.h
> >> trunk/ompi/mca/btl/tcp/btl_tcp_frag.c
> >> trunk/ompi/mca/btl/tcp/btl_tcp_frag.h
> >> Log:
> >> A better implementation for the TCP endpoint cache + few comments.
>
>
> On a 64-bit bproc/myrinet system I'm seeing Tim P's problem with the
> current head of the trunk. See attached output.
>
> David
>
>
>
>
> $ ompi_info | head
> Open MPI: 1.1a1svn01142006
> Open MPI SVN revision: svn01142006
> Open RTE: 1.1a1svn01142006
> Open RTE SVN revision: svn01142006
> OPAL: 1.1a1svn01142006
> OPAL SVN revision: svn01142006
> Prefix: /scratch/modules/opt/openmpi-trunk-nofortran-bproc64
> Configured architecture: x86_64-unknown-linux-gnu
> Configured by: ddd
> Configured on: Sat Jan 14 17:22:16 MST 2006
>
> $ make MPIRUN='mpirun -mca coll basic' MPI_Allreduce_user_c
> (cd src ; make MPI_Allreduce_user_c)
> make[1]: Entering directory `/home/ddd/intel_tests/src'
> mpicc -g -Isrc -c -o libmpitest.o libmpitest.c
> mpicc -g -Isrc -o MPI_Allreduce_user_c MPI_Allreduce_user_c.c libmpitest.o -lm
> make[1]: Leaving directory `/home/ddd/intel_tests/src'
> mpirun -mca coll basic -n 4 -- `pwd`/src/MPI_Allreduce_user_c
> MPITEST info (0): Starting MPI_Allreduce_user() test
> MPITEST_results: MPI_Allreduce_user() all tests PASSED (7076)
>
> $ make MPIRUN='mpirun' MPI_Allreduce_user_c
> (cd src ; make MPI_Allreduce_user_c)
> make[1]: Entering directory `/home/ddd/intel_tests/src'
> make[1]: `MPI_Allreduce_user_c' is up to date.
> make[1]: Leaving directory `/home/ddd/intel_tests/src'
> mpirun -n 4 -- `pwd`/src/MPI_Allreduce_user_c
> MPITEST info (0): Starting MPI_Allreduce_user() test
> MPITEST error (0): i=0, int value=4, expected 1
> MPITEST error (0): i=1, int value=4, expected 1
> MPITEST error (0): i=2, int value=4, expected 1
> MPITEST error (0): i=3, int value=4, expected 1
>
> ...
>
>
> _______________________________________________
> devel mailing list
> devel_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/devel
>

"We must accept finite disappointment, but we must never lose infinite
hope."
                                  Martin Luther King