The common bashrc suggestion was meant for the case where /home is network
mounted, so ignore that, I guess.
Have you tried adding
. $HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc
to your ~/.bashrc on the nodes? This will source the configuration you need
from the bashrc file located inside that directory.
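A minimal sketch of that suggestion (the OpenFOAM-1.5.x path is the one from this thread; adjust it for your own installation):

```shell
# Add to ~/.bashrc on every node so that shells started remotely
# (e.g. by mpirun over ssh) also pick up the OpenFOAM environment.
# The guard avoids errors on nodes where the path differs.
if [ -f "$HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc" ]; then
    . "$HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc"
fi
```

Note that bash only reads ~/.bashrc for certain shell invocations; if the variables still don't appear on a remote node, check with something like `ssh node2 'echo $FOAM_RUN'` whether the file is being sourced at all.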
On Sat, Aug 1, 2009 at 11:09 PM, Tomislav Maric <tomislav.maric_at_[hidden]>wrote:
> Prasadcse Perera wrote:
> > Hi,
> > One workaround is to define PATH and LD_LIBRARY_PATH in your common
> > .bashrc and have matching installation paths on the two nodes. This
> > works nicely for me with my three-node installation :).
> Thank you very much for the advice. Actually I'm running OpenFOAM (read:
> a program parallelized to run with Open MPI) from SLAX Live DVD, so the
> installation paths are identical, as well as everything else.
> I've added commands that set environment variables in .bashrc on both
> nodes, but you mention a "common .bashrc". Common in what way? I'm sorry
> for the newbish question; again, I'm supposed to be a Mechanical Engineer.
> The OpenFOAM toolkit carries a separate directory for third-party support
> software. This directory contains programs for postprocessing simulation
> results and analyzing data, as well as Open MPI. Therefore, in my case,
> Open MPI is built in a separate directory and the build is automated.
> After both programs are built, there is a special bashrc that sets all
> the variables needed to use OpenFOAM, such as FOAM_TUTORIALS (where the
> tutorials are), FOAM_RUN (where the working dir is), WM_COMPILER (which
> compiler to use), etc. This bashrc also sets LD_LIBRARY_PATH and PATH so
> that the locally installed Open MPI can be found.
> I've tried this installation on the Live DVD on my laptop with two
> cores, decomposed the case, and ran the simulation in parallel without a
> problem.
> I hope this information is more helpful.
> Best regards,