Hehe... don't we all love it when a problem "fixes" itself. I was missing a line in my type creation to realign the elements correctly:
// Displacement is RELATIVE to its first structure element!
for(i=2; i >= 0; i--) Displacement[i] -= Displacement[0];
I'm attaching the functional code so that others can maybe see this one as an example ;)
On Wednesday, 4 April 2007 at 11:47, Eric Thibodeau wrote:
> Hello all,
> First off, please excuse the attached code as I may be naïve in my attempts to implement my own MPI_OP.
> I am attempting to create my own MPI_OP to use with MPI_Allreduce. I have been able to find very few examples on the net of creating MPI_OPs. My present references are "MPI: The Complete Reference, Volume 1, 2nd edition" and some rather good slides I found at http://www.mpi-hd.mpg.de/personalhomes/stiff/MPI/ . I am attaching my "proof of concept" code, which fails with:
> [kyron:14074] *** Process received signal ***
> [kyron:14074] Signal: Segmentation fault (11)
> [kyron:14074] Signal code: Address not mapped (1)
> [kyron:14074] Failing at address: 0x801da600
> [kyron:14074] [ 0] [0x6ffa6440]
> [kyron:14074] [ 1] /home/kyron/openmpi_i686/lib/openmpi/mca_coll_tuned.so(ompi_coll_tuned_allreduce_intra_recursivedoubling+0x700) [0x6fbb0dd0]
> [kyron:14074] [ 2] /home/kyron/openmpi_i686/lib/openmpi/mca_coll_tuned.so(ompi_coll_tuned_allreduce_intra_dec_fixed+0xb2) [0x6fbae9a2]
> [kyron:14074] [ 3] /home/kyron/openmpi_i686/lib/libmpi.so.0(PMPI_Allreduce+0x1a6) [0x6ff61e86]
> [kyron:14074] [ 4] AllReduceTest(main+0x180) [0x8048ee8]
> [kyron:14074] [ 5] /lib/libc.so.6(__libc_start_main+0xe3) [0x6fcbd823]
> [kyron:14074] *** End of error message ***
> Eric Thibodeau
Neural Bucket Solutions Inc.