Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] mpirun/exec requires ssh?
From: Ralph Castain (rhc_at_[hidden])
Date: 2009-03-24 19:50:06


For those of you in this situation, you can apply the attached patch
to your OMPI 1.3.1 source code and rebuild it. The patch has been
tested by the original reporter and solved this particular problem.
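
The rebuild itself is the standard sequence; here is a rough sketch
(the patch filename below is only a placeholder for the attachment,
and the configure options should match whatever you used for your
original build):

  cd openmpi-1.3.1
  patch -p1 < ompi-1.3.1-ubuntu.patch   # placeholder name; use -p0 or -p1
                                        # depending on how the paths in
                                        # the patch are rooted
  ./configure --prefix=/your/prefix     # plus your usual options
  make all install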

Ralph


On Mar 24, 2009, at 5:44 PM, Divya Navaneetha Krishna wrote:

> Hi,
>
> I ran into the same problem yesterday and was wondering what was
> wrong. I couldn't get it fixed even after reading the users archives,
> so thanks for this response. So there is no way, as of now, to get
> Open MPI running on a laptop with Ubuntu?
>
> Thanks,
> Divya
>
> On Tue, Mar 24, 2009 at 7:33 PM, Ralph Castain <rhc_at_[hidden]> wrote:
>> Yeah, there is something funny about the way Ubuntu defines its
>> Ethernet interfaces that is causing this. I have a patch that fixes
>> the problem and will be in 1.3.2.
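>>
>> Until 1.3.2 is out, one thing you could try (just a sketch, untested
>> on Ubuntu) is to tell Open MPI explicitly which interface to use via
>> MCA parameters, e.g.:
>>
>>   mpirun --mca oob_tcp_if_include eth0 --mca btl_tcp_if_include eth0 --np 2 ls
>>
>> Whether that sidesteps the interface issue depends on what the patch
>> actually changes, so treat it as a guess rather than a fix.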
>>
>>
>> On Mar 24, 2009, at 5:24 PM, Simone Pellegrini wrote:
>>
>>> Hello everyone,
>>> I have the same problem when I try to install Open MPI 1.3.1 on my
>>> laptop (Ubuntu 8.10 running on a dual-core machine).
>>>
>>> I did the same installation on Ubuntu 8.04 and everything works, but
>>> here, no matter what I do, every time I type mpirun the system
>>> prompts for the password.
>>>
>>> Actually, when I install Open MPI as superuser and try to run mpirun
>>> (or mpicc) I get the following errors:
>>> @eNerd:~$ mpicc
>>> Cannot open configuration file /usr/share/openmpi/mpicc-wrapper-data.txt
>>> Error parsing data file mpicc: Not found
>>> @eNerd:~$ mpirun --np 2 ls
>>> mpirun: symbol lookup error: mpirun: undefined symbol: orted_cmd_line
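>>>
>>> Those two errors usually point to a mix of installations being
>>> picked up: the wrapper data path refers to the Ubuntu-packaged
>>> Open MPI under /usr, while the undefined symbol suggests mpirun and
>>> its libraries come from different versions. A quick sanity check,
>>> purely as an illustration, of what gets found first in the PATH:
>>>
>>> @eNerd:~$ which mpicc mpirun mpiexec
>>> @eNerd:~$ ompi_info | grep -i prefix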
>>>
>>> mpiexec, on the other hand, works but asks for a password:
>>>
>>> @eNerd:~/Desktop/openmpi-1.3.1$ mpiexec --np 2 ls
>>> @enerd's password:
>>> acinclude.m4 config.log Doxyfile Makefile ompi VERSION
>>> aclocal.m4 config.status examples Makefile.am opal
>>> ...
>>>
>>> cheers, Simone
>>>
>>> Ralph Castain wrote:
>>>>
>>>> One thing you might want to try is blowing away that prefix dir and
>>>> reinstalling OMPI 1.3.1. I'm not confident that "make uninstall" does
>>>> an adequate job of cleaning things out. The problem is that there are
>>>> major differences between 1.2.x and 1.3.x, and the uninstall may well
>>>> miss some things as a result.
>>>>
>>>> Easy place to start, at least. ;-)
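>>>>
>>>> Something along these lines (a sketch only, assuming the
>>>> $HOME/software prefix from your configure line below):
>>>>
>>>>   rm -rf $HOME/software     # wipe the old install completely
>>>>   cd openmpi-1.3.1
>>>>   make clean
>>>>   ./configure --prefix=$HOME/software --enable-shared --enable-static
>>>>   make all install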
>>>>
>>>>
>>>> On Mar 23, 2009, at 11:51 AM, Olaf Lenz wrote:
>>>>
>>>>> Hi!
>>>>>
>>>>> Ralph Castain wrote:
>>>>>>
>>>>>> I regularly run jobs like that on 1.3.1; it has no desire to use
>>>>>> ssh to start anything. On the local host, which is all this command
>>>>>> uses, mpiexec does nothing more than fork/exec the procs.
>>>>>
>>>>> That sounds strange. I'm just going back and forth between
>>>>> OpenMPI 1.2.9 and OpenMPI 1.3.1 by using make uninstall/make
>>>>> install, and I can always reproduce the behavior.
>>>>>
>>>>>> It sounds like something strange is going on in your environment
>>>>>> that makes OMPI think it is launching on a remote host. The most
>>>>>> likely cause is something in your Ethernet configuration. Can you
>>>>>> send us the output of ifconfig (or whatever your equivalent is)?
>>>>>
>>>>> Ok, here is some information on my system:
>>>>> * Kubuntu 9.04 (Jaunty) alpha 6
>>>>> * Core Duo CPU
>>>>> * I have compiled both OpenMPI versions (1.2.9 and 1.3.1) myself,
>>>>>   using
>>>>>
>>>>> configure --prefix=$HOME/software --enable-shared --enable-static
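>>>>>
>>>>> and the environment points at that prefix, roughly:
>>>>>
>>>>>   export PATH=$HOME/software/bin:$PATH
>>>>>   export LD_LIBRARY_PATH=$HOME/software/lib:$LD_LIBRARY_PATH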
>>>>>
>>>>> Output of ifconfig:
>>>>>> ifconfig
>>>>> eth0      Link encap:Ethernet  HWaddr 00:1e:37:15:1b:70
>>>>>           inet addr:192.168.1.1  Bcast:192.168.1.255  Mask:255.255.255.0
>>>>>           inet6 addr: fe80::21e:37ff:fe15:1b70/64 Scope:Link
>>>>>           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
>>>>>           RX packets:34185 errors:0 dropped:0 overruns:0 frame:0
>>>>>           TX packets:29386 errors:0 dropped:0 overruns:0 carrier:0
>>>>>           collisions:0 txqueuelen:100
>>>>>           RX bytes:25645492 (25.6 MB)  TX bytes:3921545 (3.9 MB)
>>>>>           Memory:fe000000-fe020000
>>>>>
>>>>> lo        Link encap:Local Loopback
>>>>>           inet addr:127.0.0.1  Mask:255.0.0.0
>>>>>           inet6 addr: ::1/128 Scope:Host
>>>>>           UP LOOPBACK RUNNING  MTU:16436  Metric:1
>>>>>           RX packets:5372 errors:0 dropped:0 overruns:0 frame:0
>>>>>           TX packets:5372 errors:0 dropped:0 overruns:0 carrier:0
>>>>>           collisions:0 txqueuelen:0
>>>>>           RX bytes:661715 (661.7 KB)  TX bytes:661715 (661.7 KB)
>>>>>
>>>>> tun0      Link encap:UNSPEC  HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00
>>>>>           inet addr:10.20.143.6  P-t-P:10.20.143.5  Mask:255.255.255.255
>>>>>           UP POINTOPOINT RUNNING NOARP MULTICAST  MTU:1500  Metric:1
>>>>>           RX packets:2234 errors:0 dropped:0 overruns:0 frame:0
>>>>>           TX packets:3158 errors:0 dropped:0 overruns:0 carrier:0
>>>>>           collisions:0 txqueuelen:100
>>>>>           RX bytes:940549 (940.5 KB)  TX bytes:207091 (207.0 KB)
>>>>>
>>>>> You need anything else?
>>>>>
>>>>>
>>>>> Olaf