
Open MPI User's Mailing List Archives


This web mail archive is frozen; no new mails have been added to it since July of 2016.

From: Manal Helal (manalorama_at_[hidden])
Date: 2006-11-09 23:34:05


Hi

I am trying to run the following command:

  mpirun -np XX -d xterm -e gdb <myprog> <myargs>

and I am receiving these errors:

*****************
  [leo01:02141] [0,0,0] setting up session dir with
[leo01:02141] universe default-universe
[leo01:02141] user mhelal
[leo01:02141] host leo01
[leo01:02141] jobid 0
[leo01:02141] procid 0
[leo01:02141] procdir:
/tmp/openmpi-sessions-mhelal_at_leo01_0/default-universe/0/0
[leo01:02141] jobdir:
/tmp/openmpi-sessions-mhelal_at_leo01_0/default-universe/0
[leo01:02141] unidir:
/tmp/openmpi-sessions-mhelal_at_leo01_0/default-universe
[leo01:02141] top: openmpi-sessions-mhelal_at_leo01_0
[leo01:02141] tmp: /tmp
[leo01:02141] [0,0,0] contact_file
/tmp/openmpi-sessions-mhelal_at_leo01_0/default-universe/universe-setup.txt
[leo01:02141] [0,0,0] wrote setup file
[leo01:02141] pls:rsh: local csh: 0, local bash: 1
[leo01:02141] pls:rsh: assuming same remote shell as local shell
[leo01:02141] pls:rsh: remote csh: 0, remote bash: 1
[leo01:02141] pls:rsh: final template argv:
[leo01:02141] pls:rsh: /usr/bin/ssh <template> orted --debug
--bootproxy 1 --name <template> --num_procs 2 --vpid_start 0 --nodename
<template> --universe mhelal_at_leo01:default-universe --nsreplica
"0.0.0;tcp://129.94.242.77:40738" --gprreplica
"0.0.0;tcp://129.94.242.77:40738" --mpi-call-yield 0
[leo01:02141] pls:rsh: launching on node localhost
[leo01:02141] pls:rsh: oversubscribed -- setting mpi_yield_when_idle to 1
(1 4)
[leo01:02141] pls:rsh: localhost is a LOCAL node
[leo01:02141] pls:rsh: changing to directory /import/eno/1/mhelal
[leo01:02141] pls:rsh: executing: orted --debug --bootproxy 1 --name 0.0.1
--num_procs 2 --vpid_start 0 --nodename localhost --universe
mhelal_at_leo01:default-universe --nsreplica
"0.0.0;tcp://129.94.242.77:40738" --gprreplica
"0.0.0;tcp://129.94.242.77:40738" --mpi-call-yield 1
[leo01:02143] [0,0,1] setting up session dir with
[leo01:02143] universe default-universe
[leo01:02143] user mhelal
[leo01:02143] host localhost
[leo01:02143] jobid 0
[leo01:02143] procid 1
[leo01:02143] procdir:
/tmp/openmpi-sessions-mhelal_at_localhost_0/default-universe/0/1
[leo01:02143] jobdir:
/tmp/openmpi-sessions-mhelal_at_localhost_0/default-universe/0
[leo01:02143] unidir:
/tmp/openmpi-sessions-mhelal_at_localhost_0/default-universe
[leo01:02143] top: openmpi-sessions-mhelal_at_localhost_0
[leo01:02143] tmp: /tmp
[leo01:02143] sess_dir_finalize: proc session dir not empty - leaving
[leo01:02143] sess_dir_finalize: proc session dir not empty - leaving
[leo01:02143] sess_dir_finalize: proc session dir not empty - leaving
[leo01:02143] sess_dir_finalize: proc session dir not empty - leaving
[leo01:02143] orted: job_state_callback(jobid = 1, state =
ORTE_PROC_STATE_TERMINATED)
[leo01:02143] sess_dir_finalize: job session dir not empty - leaving
[leo01:02143] sess_dir_finalize: found proc session dir empty - deleting
[leo01:02143] sess_dir_finalize: found job session dir empty - deleting
[leo01:02143] sess_dir_finalize: found univ session dir empty - deleting
[leo01:02143] sess_dir_finalize: found top session dir empty - deleting

****************

Could you please have a look and advise, if possible, where I could change
these paths? When I checked, the paths were not there at all.
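For reference, the invocation I was trying to follow is the one from the Open
MPI debugging FAQ, if I have understood it correctly (here XX and myprog are
placeholders for my actual process count and program):

```shell
# Launch each of the XX MPI ranks in its own xterm window running gdb
# (the form shown in the Open MPI debugging FAQ, as I understand it;
# XX and myprog are placeholders for my actual values)
mpirun -np XX xterm -e gdb ./myprog
```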

Best Regards,

Manal