On 07/10/06 17:48, Jeff Squyres (jsquyres) wrote:
>>From: Joachim Worringen [mailto:joachim_at_[hidden]]
>>Sent: Monday, July 10, 2006 7:41 AM
>>Subject: Re: [Fwd: [perfbase-users] Submitting a run in
>>The current version of perfbase no longer creates a new table for
>>each run, but only a new index. Indexes are cheap, but queries might
>>become somewhat slower if a large list of run indexes has to be
>>processed. Note that this really reads *might*, as I don't know how
>>well PostgreSQL handles/optimizes such SQL constructs.
> Good to know.
>>It's an important characteristic of perfbase that runs cannot be
>>modified once they have been created. Thus, the thing you thought of
>>won't be possible.
>>However, it is possible to create a single run from a number of
>>input files by using the "--join" option for the input command. I
>>guess that's the way to go. Additionally, if you cannot avoid
>>creating multiple perfbase runs for a single test suite run, you
>>should group these runs via a dedicated parameter value, or use the
>>synopsis for this.
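As a rough sketch, the "--join" import Joachim describes might look something like the following. The filenames, and the exact argument layout of "perfbase input", are assumptions for illustration only; check the local perfbase documentation before using this:

```shell
#!/bin/sh
# Hypothetical partial result files from one test suite run
# (names are made up for this sketch).
files="ring_c.out hello_c.out hello_f77.out"

# Build the import command; --join is the option Joachim mentions
# for merging multiple input files into a single perfbase run.
cmd="perfbase input --join $files"

# Echoed here rather than executed, since the surrounding syntax
# is unverified; run it by hand once it matches your setup.
echo "$cmd"
```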
> I think the latter is what we'll likely do; our goal is to see partial
> results (e.g., not have to wait 10 hours to see that every single test
> failed) rather than be able to submit lots of results at once.
To see partial results, I have this in my .ini file:
[Reporter: IU database]
module = Perfbase
perfbase_realm = OMPI
perfbase_username = postgres
... [snip] ...
perfbase_debug_filename = pb_debug
I can then look at the pb_debug* files to see the results as they're happening. They're not pretty
results in tabular or graphical format, but could these raw results suffice for most situations?
E.g., you can get a pretty good idea of how the tests are going by just doing:
$ grep -E 'test_pass|test_name' pb_debug*
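Building on that grep, a tiny summary script can turn the raw debug files into pass/fail counts. The `test_pass: 1` / `test_pass: 0` line format below is a made-up stand-in for whatever the reporter actually writes to pb_debug*, so adjust the patterns to the real output:

```shell
#!/bin/sh
# Summarize pass/fail counts from perfbase debug files.
# A fake pb_debug file is generated here so the sketch is self-contained;
# its line format is an assumption, not the reporter's verified output.
tmpdir=$(mktemp -d)
cat > "$tmpdir/pb_debug.1" <<'EOF'
test_name: ring_c
test_pass: 1
test_name: hello_f77
test_pass: 0
EOF

# Count passing and failing result lines across all debug files.
passed=$(grep -c 'test_pass: 1' "$tmpdir"/pb_debug.*)
failed=$(grep -c 'test_pass: 0' "$tmpdir"/pb_debug.*)
echo "passed=$passed failed=$failed"

rm -r "$tmpdir"
```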
... [snip] ...
I think, though, that the Perfbase.pm code needs to be adjusted to allow results to go to both the
perfbase debug files _and_ to IU's perfbase simultaneously (right now there's an if-else preventing
that).