Summary

Create simple reports and graphs of data stored in the dashboard that are useful for a wide range of tests.

Rationale

For many tests, the results are fairly generic and predictable. For example, many performance tests store a single measurement for a static set of test cases, and many qualitative tests store a single pass/fail result per test case. We should make it easy for the user to produce a report that shows a comparison across all the results in a stream.
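
As a rough illustration, a minimal sketch (in Python) of the two generic result shapes this implies is shown below; every field name here is a hypothetical stand-in, not the dashboard's actual schema:

    # Hypothetical records; the field names are illustrative only and do
    # not reflect the dashboard's real schema.
    qualitative_result = {
        "test_id": "ltp",
        "test_case_id": "mmap01",
        "result": "pass",               # or "fail"
        "date": "2011-03-28T10:15:00Z",
    }

    performance_result = {
        "test_id": "bootchart",
        "test_case_id": "boot-time",
        "measurement": 14.2,            # a single numeric value
        "units": "seconds",
        "date": "2011-03-28T10:15:00Z",
    }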

User stories

  • Andy has introduced some new kernel patches and would like to see whether they introduced any LTP regressions. He selects the simple qualitative comparison report type and provides the two result IDs in the dashboard, and a chart is generated showing all the LTP results of those two test runs, with the total number of passes/fails.
  • Rick would like to see the fluctuation of test failures across all the results in a test stream. He selects the report type and result stream, and chooses to see totals rather than an individual test_id. A graph is generated showing the total failures over all test_ids in each test result in the stream, sorted by the date/time stamp of when it was run (a sketch of this aggregation follows this list).
  • Mike would like to see how his performance test has been doing over the last month. He selects the report type, his stream, and the test_id, and provides the starting and ending dates to restrict the results to that range. A graph is generated showing the measurement value of every result for that test_id in the specified date range.
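
The aggregations behind Rick's and Mike's stories could be sketched as follows, reusing the hypothetical records above plus an assumed "run_id" field identifying the test run each result belongs to (again, nothing here is the dashboard's real API):

    from collections import defaultdict

    def failure_totals(results):
        """One (date, total failures) point per test run, in date order.

        Sketch for Rick's story; assumes each hypothetical record carries
        "run_id", "result" and "date" fields. A real report would use the
        run's own timestamp rather than that of its last result.
        """
        totals = defaultdict(int)
        run_dates = {}
        for r in results:
            run = r["run_id"]
            run_dates[run] = r["date"]
            if r["result"] == "fail":
                totals[run] += 1
            else:
                totals[run] += 0  # ensure runs with no failures still appear
        # ISO-8601 date strings sort chronologically, so plain sorting works.
        return sorted((run_dates[run], count) for run, count in totals.items())

    def measurements_in_range(results, test_id, start, end):
        """(date, measurement) points for one test_id within a date range.

        Sketch for Mike's story; start/end are ISO-8601 strings, compared
        lexicographically, which matches chronological order for this format.
        """
        return sorted(
            (r["date"], r["measurement"])
            for r in results
            if r["test_id"] == test_id and start <= r["date"] <= end
        )

A charting layer would then plot the returned (date, value) pairs directly.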

Assumptions

  • Results for a single test suite/machine combination are stored in their own stream.
  • Results do not carry arbitrary data outside the usual result/measurement fields that would need to be handled on a per-test-suite basis.

Design

You can have subsections that better describe specific parts of the issue.

Implementation

This section should describe a plan of action (the "how") to implement the changes discussed. It could include subsections such as:

UI Changes

This should cover changes required to the UI, or any specific UI that is required to implement this feature.

Code Changes

Code changes should include an overview of what needs to change, and in some cases even the specific details.

Migration

Include:

  • data migration, if any
  • redirects from old URLs to new ones, if any
  • how users will be pointed to the new way of doing things, if necessary.

Test/Demo Plan

It's important that we are able to test new features and demonstrate them to users. Use this section to describe a short plan that anybody can follow to demonstrate that the feature is working. This can then be used during testing, and to show the feature off after release. Please add an entry to http://testcases.qa.ubuntu.com/Coverage/NewFeatures for tracking test coverage.

This need not be added or completed until the specification is nearing beta.

Unresolved issues

This should highlight any issues that should be addressed in further specifications, and not problems with the specification itself, since any specification with problems cannot be approved.

BoF agenda and discussion

Use this section to take notes during the BoF; if you keep it in the approved spec, use it to summarise what was discussed and to note any options that were rejected.


CategorySpec CategoryTemplate
