Subject: Re: [MTT users] MTT fail to require MTT::Test::Specify::Simple
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2008-05-16 10:39:31


I don't see these results in the OMPI MTT database at all. The only
results I see from IBM in the past 7 days are from platform ibm_ia32
and some "undef" platform:

     http://www.open-mpi.org/mtt/index.php?do_redir=656

Are you sure that you submitted them?

On May 15, 2008, at 2:02 PM, Wen Hao Wang wrote:

> Hello Jeff:
>
> I just finished rerunning MTT without the --trial option so you can
> check the results. I suppose you can now see the reports in the
> database. If that's not the case, please correct me. My platform name
> is ibm_beijing_rhel5.2.
>
> My server cannot be accessed from outside IBM. If you need any
> information to check those errors, for example the txt files
> containing the errors, please let me know. I will send them to your
> mailbox.
>
> Thanks a lot!
>
> Wen Hao Wang
> Jeff Squyres <jsquyres_at_[hidden]>
> Sent by: mtt-users-bounces_at_[hidden]
> 2008-05-15 19:05
> Please respond to: General user list for the MPI Testing Tool <mtt-users_at_[hidden]>
> To: General user list for the MPI Testing Tool <mtt-users_at_[hidden]>
> Cc: Brad Benton <brad.benton_at_[hidden]>, mtt-users-bounces_at_[hidden]
> Subject: Re: [MTT users] MTT fail to require MTT::Test::Specify::Simple
>
> On May 15, 2008, at 1:56 AM, Wen Hao Wang wrote:
>
> > Sorry for my delay. I deleted the line "specify_module = Simple" in
> > the ini file, and the require failure disappeared. Thanks for your
> > support.
> >
>
> Great! I just filed https://svn.open-mpi.org/trac/mtt/ticket/356
> because it seems like this is a simple enough error that we should
> really be able to detect it pretty easily and print out a reasonable
> error message.
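>
> For reference, here is a minimal sketch of the kind of ini section
> involved (the section and field names below are assumptions based on
> the template that ships with MTT; yours may differ):
>
> ```ini
> [Test run: trivial]
> test_get = trivial
> # Deleting this line worked around the failure to
> # require MTT::Test::Specify::Simple:
> specify_module = Simple
> ```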
>
> > Here are my MTT arguments this time:
> >
> >   client/mtt -d --force -f samples/ompi-core-template.ini --trial \
> >     --no-section intel --no-section 1.1
> >
> > The MTT result contains 22 failures in total. I am not sure which of
> > them are already known issues. I suppose the failed cases are Open
> > MPI issues rather than MTT issues. If that's the case, please correct
> > me. I do not know which bug in https://svn.open-mpi.org/trac/ompi/report/6
> > matches my MTT failure in running "mpirun -np 2 --mca btl tcp,self
> > --prefix /LTC/MTT/..." listed below.
> >
> > Is there any method for me to probe these failures? Or could
> > anyone check all my failed MTT cases? I need a way to find out
> > which failed cases are already known and tracked.
> >
>
> Unfortunately we do not have such an automated mechanism -- most of
> the existing known issues/failures are just known by the developers.
> That being said, if anyone has any clue how to implement such an
> automated system, I'm all ears. We just could never figure out how to
> do it reliably because the stdout/stderr of known issue/failure X may
> be slightly different on any given machine. :-\
>
> If you do a run in trial mode and send your results to the main OMPI
> MTT DB, send us a permalink for your results and I'd be happy to look
> at them and help you classify the errors.
>
> --
> Jeff Squyres
> Cisco Systems
>
> _______________________________________________
> mtt-users mailing list
> mtt-users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/mtt-users
>

-- 
Jeff Squyres
Cisco Systems