
Open MPI User's Mailing List Archives


From: Mike Houston (mhouston_at_[hidden])
Date: 2005-10-31 10:46:44


We can't seem to run across TCP. We did a default 'configure'. Shared
memory seems to work, but trying TCP gives us:

[0,1,1][btl_tcp_endpoint.c:557:mca_btl_tcp_endpoint_complete_connect]
connect() failed with errno=113
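For reference, errno 113 on Linux is EHOSTUNREACH ("No route to host"), which typically points at routing or firewall problems between the nodes rather than at Open MPI itself. It can be decoded from Python's standard library:

```python
import errno
import os

# On Linux, errno 113 is EHOSTUNREACH ("No route to host")
print(errno.errorcode[113])  # EHOSTUNREACH
print(os.strerror(113))      # No route to host
```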

I'm assuming that the TCP backend is the most thoroughly tested, so I
thought I'd ask in case we're doing something silly. The error above
occurs when running the OSU NBCL mpi_bandwidth test.
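One thing worth trying when the TCP BTL reports "No route to host" is to restrict it to an interface that is actually routable between the nodes, since by default it may pick an interface (e.g. a management or virtual NIC) the peer can't reach. A sketch using Open MPI's MCA parameters; the interface name `eth0`, the hostnames, and the test binary path are assumptions for illustration:

```shell
# Limit the TCP BTL to a known-good interface (eth0 here is an assumption)
mpirun --mca btl tcp,self \
       --mca btl_tcp_if_include eth0 \
       -np 2 -host node1,node2 ./mpi_bandwidth
```

If that succeeds, the original failure was likely the BTL choosing an unroutable interface, or a firewall blocking the ephemeral ports it uses.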

Thanks!

-Mike