
Open MPI User's Mailing List Archives


Subject: Re: [OMPI users] MPI_Comm_accept()/connect() errors
From: Blesson Varghese (b.varghese_at_[hidden])
Date: 2009-10-08 05:33:45


The PATH variable contains:
/home/hx019035/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/maui/bin/:

/home/hx019035/bin contains the local installation of OMPI 1.3.3

 

The LD_LIBRARY_PATH variable contains /home/hx019035/lib:

 

These variables are being set in the .profile file on the hpcc00 node.

 

Would these need to be set anywhere else?
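
A quick way to verify what a non-interactive shell on hpcc00 actually picks
up (Open MPI typically launches its daemons over ssh, which does not read
.profile) is something like:

ssh hpcc00 which mpirun
ssh hpcc00 'echo $PATH; echo $LD_LIBRARY_PATH'

If these still show the system-wide 1.2.x installation, the variables would
need to go in a file that non-interactive shells read, e.g. ~/.bashrc.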

 

 

From: Ralph Castain [mailto:rhc.openmpi_at_[hidden]] On Behalf Of Ralph
Castain
Sent: 07 October 2009 13:32
To: Blesson Varghese
Subject: Re: [OMPI users] MPI_Comm_accept()/connect() errors

 

Yes, it does. But the error message indicates a 1.2 version is running on
hpcc00.

 

On Oct 7, 2009, at 5:46 AM, Blesson Varghese wrote:

 

Just a quick question: would mpirun -version give me the version of the
mpirun being executed? It reports 1.3.3.

 

From: Ralph Castain [mailto:rhc.openmpi_at_[hidden]] On Behalf Of Ralph
Castain
Sent: 07 October 2009 11:58
To: Blesson Varghese
Subject: Re: [OMPI users] MPI_Comm_accept()/connect() errors

 

Hate to tell you this, but your output clearly indicates you are NOT running
1.3.3 - that is output from a 1.2.x version of OMPI.

 

Check your path and ld_library_path - you're still picking up the 1.2.5
version somewhere.

 

 

On Oct 7, 2009, at 4:05 AM, Blesson Varghese wrote:

Hi,

 

Please refer to the emails below.

 

I have upgraded to Open MPI 1.3.3 as suggested, and the necessary
environment variables have all been set. I am attaching the output of
ompi_info --all. However, the errors persist.

 

[hpcc00:31864] [0,0,0] ORTE_ERROR_LOG: Not found in file dss/dss_unpack.c at line 209

[hpcc00:31864] [0,0,0] ORTE_ERROR_LOG: Not found in file communicator/comm_dyn.c at line 186

[hpcc00:31864] *** An error occurred in MPI_Comm_connect

[hpcc00:31864] *** on communicator MPI_COMM_WORLD

[hpcc00:31864] *** MPI_ERR_INTERN: internal error

[hpcc00:31864] *** MPI_ERRORS_ARE_FATAL (goodbye)

 

 

The server program is as follows:

 

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main( int argc, char **argv )
{
      MPI_Comm client;
      MPI_Status status;
      char port_name[MPI_MAX_PORT_NAME];
      int buf;
      int size;

      MPI_Init( &argc, &argv );
      MPI_Comm_size( MPI_COMM_WORLD, &size );

      /* open a port and advertise it so the client can connect */
      MPI_Open_port( MPI_INFO_NULL, port_name );
      printf( "server available at %s\n", port_name );

      /* wait for the client, receive one int from it, then disconnect */
      MPI_Comm_accept( port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &client );
      MPI_Recv( &buf, 1, MPI_INT, MPI_ANY_SOURCE, MPI_ANY_TAG, client, &status );
      MPI_Comm_disconnect( &client );

      MPI_Close_port( port_name );
      MPI_Finalize();
      return 0;
}
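
For reference, assuming the 1.3.3 wrappers are the ones found first in the
PATH, the server can be built and started with something like:

mpicc server.c -o server
mpirun -np 1 server

which prints the port string that the client needs.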

 

The client program is as follows:

 

#include <mpi.h>
#include <stdlib.h>
#include <stdio.h>
#include <string.h>

int main( int argc, char **argv )
{
    MPI_Comm server;
    int buf = 8;
    char port_name[MPI_MAX_PORT_NAME];

    MPI_Init( &argc, &argv );

    /* The port name is hardcoded since 0.0.0:2000 is generated by the
       server program */
    strcpy( port_name, "0.0.0:2000" );

    MPI_Comm_connect( port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &server );
    MPI_Send( &buf, 1, MPI_INT, 0, 1, server );
    MPI_Comm_disconnect( &server );
    MPI_Finalize();
    return 0;
}
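
Hardcoding the port string is fragile, since the server can report a
different port on every run. A minimal variant of the client that takes the
port name on the command line instead (as the MPI forum example does) might
look like this; error handling is kept to a bare minimum:

#include <mpi.h>
#include <stdio.h>
#include <string.h>

int main( int argc, char **argv )
{
    MPI_Comm server;
    int buf = 8;
    char port_name[MPI_MAX_PORT_NAME];

    MPI_Init( &argc, &argv );

    /* expect the port string printed by the server as the first argument,
       e.g.: mpirun -np 1 client "0.1.0:2000" */
    if (argc < 2) {
        fprintf( stderr, "usage: client <port_name>\n" );
        MPI_Abort( MPI_COMM_WORLD, 1 );
    }
    strncpy( port_name, argv[1], MPI_MAX_PORT_NAME - 1 );
    port_name[MPI_MAX_PORT_NAME - 1] = '\0';

    MPI_Comm_connect( port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &server );
    MPI_Send( &buf, 1, MPI_INT, 0, 1, server );
    MPI_Comm_disconnect( &server );
    MPI_Finalize();
    return 0;
}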

 

Would you please advise?

 

Regards,

Blesson.

 

 

-----Original Message-----
From: Blesson Varghese [mailto:hx019035_at_[hidden]]
Sent: 03 October 2009 12:20
To: 'Jeff Squyres'
Subject: RE: [OMPI users] MPI_Comm_accept()/connect() errors

 

Thank you. I shall try the upgrade very soon.

 

-----Original Message-----

From: Jeff Squyres [mailto:jsquyres_at_[hidden]]

Sent: 03 October 2009 12:18

To: Blesson Varghese

Subject: Re: [OMPI users] MPI_Comm_accept()/connect() errors

 

On Oct 3, 2009, at 7:14 AM, Blesson Varghese wrote:

 

> Thanks for your reply, Jeff. Since it is a teaching cluster of the
> University, I am quite unsure if I would be able to upgrade it very
> soon.
>
> Do you reckon that the error is due to the Open MPI version?

 

You can always install your own version of Open MPI under your $HOME
or somesuch -- there is no requirement that Open MPI be installed by
root in a central location.

That being said, you might want to check with your administrator to
ensure that this is ok with local policies -- see if they did any
special setup for Open MPI, etc.
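
A per-user build is usually just something like (the prefix is arbitrary):

./configure --prefix=$HOME/openmpi-1.3.3
make all install

followed by putting $HOME/openmpi-1.3.3/bin at the front of your PATH and
$HOME/openmpi-1.3.3/lib in your LD_LIBRARY_PATH.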

 

But yes, we made a bunch of COMM_SPAWN improvements since the 1.2 series.

 

--
Jeff Squyres
jsquyres_at_[hidden]
 
 
From: Blesson Varghese [mailto:hx019035_at_[hidden]] 
Sent: 01 October 2009 12:01
To: 'Open MPI Users'; 'Ralph Castain'
Subject: RE: [OMPI users] MPI_Comm_accept()/connect() errors
 
The following is the information regarding the error. I am running Open MPI
1.2.5 on Ubuntu 4.2.4, kernel version 2.6.24
 
I ran the server program as mpirun -np 1 server. This program gave me the
output port as 0.1.0:2000. I used this port name value as the command line
argument for the client program: mpirun -np 1 client 0.1.1:2000.
 
- The output of "ompi_info --all" is attached to the email
- PATH variable:
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/maui/bin/:
- LD_LIBRARY_PATH variable was empty
- The following is the output of ifconfig on hpcc00 from where the error has
been generated:
eth0      Link encap:Ethernet  HWaddr 00:12:3f:4c:2d:78
          inet addr:134.225.200.100  Bcast:134.225.200.255  Mask:255.255.255.0
          inet6 addr: fe80::212:3fff:fe4c:2d78/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:15912728 errors:0 dropped:0 overruns:0 frame:0
          TX packets:15312376 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:2951880321 (2.7 GB)  TX bytes:2788249498 (2.5 GB)
          Interrupt:16
 
lo        Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:16436  Metric:1
          RX packets:3507489 errors:0 dropped:0 overruns:0 frame:0
          TX packets:3507489 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:1794266658 (1.6 GB)  TX bytes:1794266658 (1.6 GB)
 
Regards,
Blesson.
 
From: users-bounces_at_[hidden] [mailto:users-bounces_at_[hidden]] On
Behalf Of Ralph Castain
Sent: 29 September 2009 23:59
To: Open MPI Users
Subject: Re: [OMPI users] MPI_Comm_accept()/connect() errors
 
I will ask the obvious - what version of Open MPI are you running? In what
environment? What was your command line?
 
:-)
 
On Sep 29, 2009, at 3:50 PM, Blesson Varghese wrote:
 
Hi,
 
I have been trying to execute the server.c and client.c programs provided at
http://www.mpi-forum.org/docs/mpi21-report/node213.htm#Node213, using the
MPI_Comm_accept() and MPI_Comm_connect() functions. However, the following
errors are generated.
 
[hpcc00:16522] *** An error occurred in MPI_Comm_connect
[hpcc00:16522] *** on communicator MPI_COMM_WORLD
[hpcc00:16522] *** MPI_ERR_INTERN: internal error
[hpcc00:16522] *** MPI_ERRORS_ARE_FATAL (goodbye)
 
Could anybody please help me?
 
Many thanks,
Blesson.
_______________________________________________
users mailing list
users_at_[hidden]
http://www.open-mpi.org/mailman/listinfo.cgi/users
 
<outputompi.txt>