
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] "self scheduled" work & mpi receive???
From: Mikael Lavoie (mikael.lavoie_at_[hidden])
Date: 2010-09-23 17:07:19


Hi Ambrose,

I'm interested in your work; I have an app of my own to convert, and I don't
know the MPI structure and syntax well enough to do it...

So if you want to share your app, I'd be interested in taking a look at it!

Thanks and have a nice day!!

Mikael Lavoie
2010/9/23 Lewis, Ambrose J. <AMBROSE.J.LEWIS_at_[hidden]>

> Hi All:
>
> I’ve written an Open MPI program that “self schedules” the work.
>
> The master task is in a loop chunking up an input stream and handing off
> jobs to worker tasks. At first the master gives the next job to the next
> highest rank. After all ranks have their first job, the master waits via an
> MPI receive call for the next free worker. The master parses out the rank
> from the MPI receive and sends the next job to this node. The jobs aren’t
> all identical, so they run for slightly different durations based on the
> input data.
>
>
>
> When I plot a histogram of the number of jobs each worker performed, the
> lower mpi ranks are doing much more work than the higher ranks. For
> example, in a 120 process run, rank 1 did 32 jobs while rank 119 only did 2.
> My guess is that Open MPI returns the lowest rank from the MPI_Recv when
> I’ve got MPI_ANY_SOURCE set and multiple sends have happened since the last
> call.
>
>
>
> Is there a different Recv call to make that will spread out the data
> better?
>
>
>
> THANKS!
>
> amb
>
>
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users
>
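
For reference, here is a minimal sketch of the self-scheduling master/worker
pattern described in the quoted message: the master primes each worker with one
job, then uses MPI_Recv with MPI_ANY_SOURCE and reads status.MPI_SOURCE to find
out which worker just finished. The tag names, the integer "job" payload, and
the termination handshake are assumptions for illustration only (not the
original poster's code), and the sketch assumes there are at least as many jobs
as workers.

/* Minimal self-scheduling master/worker sketch.  Tags, payload, and
 * termination convention are illustrative assumptions. */
#include <mpi.h>

#define TAG_WORK 1
#define TAG_DONE 2
#define TAG_STOP 3

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {                      /* master */
        int njobs = 1000, next = 0, result;
        MPI_Status status;

        /* Prime every worker with its first job (assumes njobs >= size-1). */
        for (int w = 1; w < size && next < njobs; w++, next++)
            MPI_Send(&next, 1, MPI_INT, w, TAG_WORK, MPI_COMM_WORLD);

        /* Hand remaining jobs to whichever worker reports back first;
         * the worker's rank comes from status.MPI_SOURCE. */
        for (; next < njobs; next++) {
            MPI_Recv(&result, 1, MPI_INT, MPI_ANY_SOURCE, TAG_DONE,
                     MPI_COMM_WORLD, &status);
            MPI_Send(&next, 1, MPI_INT, status.MPI_SOURCE, TAG_WORK,
                     MPI_COMM_WORLD);
        }

        /* Drain the final results and tell every worker to stop. */
        for (int w = 1; w < size; w++) {
            MPI_Recv(&result, 1, MPI_INT, MPI_ANY_SOURCE, TAG_DONE,
                     MPI_COMM_WORLD, &status);
            MPI_Send(&next, 1, MPI_INT, status.MPI_SOURCE, TAG_STOP,
                     MPI_COMM_WORLD);
        }
    } else {                              /* worker */
        int job, result;
        MPI_Status status;
        while (1) {
            MPI_Recv(&job, 1, MPI_INT, 0, MPI_ANY_TAG,
                     MPI_COMM_WORLD, &status);
            if (status.MPI_TAG == TAG_STOP)
                break;
            result = job;                 /* stand-in for the real work */
            MPI_Send(&result, 1, MPI_INT, 0, TAG_DONE, MPI_COMM_WORLD);
        }
    }

    MPI_Finalize();
    return 0;
}

Note that in this pattern a worker is only handed its next job after it has
reported a result, so any bias in which queued result the MPI_ANY_SOURCE
receive matches first changes the per-rank job counts but should not leave a
worker idle while jobs remain.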