
Open MPI User's Mailing List Archives



Subject: [OMPI users] Using open-mpi app on a normal network
From: Antoine Monmayrant (antoine.monmayrant_at_[hidden])
Date: 2008-01-18 11:54:36

Hi everyone,

I am new to open-mpi and parallel computing so I hope I won't
bore/offend you with obvious/off-topic questions.
We are running scientific simulations (using meep from MIT) on small
dual-processor PCs, and to fully use both processors on each machine we
had to compile an MPI version of the software.
Compiling and running the app (meep-mpi) with mpirun were both fine.
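For reference, we imagine a multi-machine launch would look roughly like the sketch below; the hostnames, slot counts, and input file are placeholders, not our actual setup:

```shell
# Hypothetical hostfile listing lab machines and their CPU slots
# (hostnames and slot counts are placeholders)
cat > hostfile <<'EOF'
labpc1 slots=2
labpc2 slots=2
EOF

# Launch one meep-mpi job spanning all listed machines
mpirun --hostfile hostfile -np 4 meep-mpi simulation.ctl
```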
Now, we wonder if we can do a bit more by exploiting the unused
computing power available on our lab network at night.
The problem is that even though our network is more than decent, it is
nowhere near what you can find in a cluster. What's more, the various
computers we could use are quite different (processor, RAM, overall
performance).
Taking this into account, do you think we can use open-mpi over such a
network:
 a) for one long simulation shared across the different "nodes"?
 b) for embarrassingly parallel simulations, that is, for N independent
simulations that we want to "spread" over the network, for example
running one simulation on each available node?
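To make case (b) concrete, here is the naive scheme we had in mind, assuming each machine is reachable by mpirun; again, the hostnames and input files are placeholders:

```shell
# Hypothetical sketch of case (b): one independent simulation per node.
# Hostnames and .ctl input files are placeholders.
for host in labpc1 labpc2 labpc3; do
    # Each job uses only the two local processors of its node
    mpirun --host "$host" -np 2 meep-mpi "sim_${host}.ctl" &
done
wait   # block until all background jobs finish
```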

What kind of gain/limitations can we expect for both cases?
If open-mpi is not the way forward, do you have an alternative to propose?
Thanks in advance for your help,



 Antoine Monmayrant
 7 avenue du Colonel Roche
 31077 TOULOUSE Cedex4
 Tel:+33 5 61 33 64 59
 email : antoine.monmayrant_at_[hidden]
 permanent email : antoine.monmayrant_at_[hidden]