Summary

This specification describes changes to Launch Control during the Ubuntu-N cycle that aim to improve the display of test results.

Rationale

The current user interface is minimalistic. To make the tool useful and productive for the intended audience we need to develop specialized views, reports or charts/graphs that extract and present key information relevant to the developers using the system. By building a generic reporting framework we can simplify and streamline the development of actual reports by providing common elements in one package. Doing so can also enable third-party developers to work on their reports without having to do development in the core/trunk.

Use Cases

  • A stakeholder in the Linaro infrastructure wants to use Launch Control to look at the progress of some specific endeavor. None of the existing capabilities are suitable, so after going through either the stakeholder process or the Linaro@UDS process they draft new functionality that extends the base application with a new reporting capability. The application is somewhat easier to implement and better integrated with the core because it uses several generic elements of the reporting framework. It also gets several features for free, such as a programmatic API to any new data sources ("produced data") that it can generate.
  • A community member wants to use Launch Control for something totally unrelated to Linaro (such as the Ubuntu QA process) and uses the core Launch Control application with the reporting framework to quickly write a custom solution that builds on various existing features (mostly the data storage and transfer infrastructure).

Scope

All changes here will apply to the Launch Control project. Some changes may spill over into several of the sub-projects that were spun off into separate projects/branches for better isolation and maintainability.

Design

The website would get a new section "Reports", as visible on the mock-up below. This section would serve as a hub for both browsing available reports and creating new ones. Each report would be a dynamic view that generates some sort of chart/table/document based on the (stored) configuration and available data. Adding new data to the system might change existing reports. Storing a "snapshot" of a report would be an optional feature. Initially it would be best to simply allow the user to save the image/document locally. In the future we might consider storing snapshots inside the system (as test data cannot be altered, only appended, and we store the import timestamp, it would be simple to add global time constraints to generate the report as it appeared at a particular time).

  • reports-view-small.jpg

The sidebar would list available stored reports grouped by sharing type (a minimal sketch of the grouping logic follows the list). The following values would be supported:

  • Private reports. Only the author can access these.
  • Team reports. There would be one section per team that the user is a member of. Teams that don't have any reports would not be listed.
  • Public reports. This would be the only section shown if the user is not signed in.
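A minimal sketch of the sidebar grouping logic, assuming the SavedReport model described under Models below; the field names (owner, group) and the app path are assumptions, not a decided schema:

# Hypothetical sketch of grouping saved reports for the sidebar.
from dashboard_reports.models import SavedReport  # hypothetical app path

def get_sidebar_reports(user):
    """Return saved reports grouped by sharing type for the sidebar."""
    groups = {"public": SavedReport.objects.filter(owner=None, group=None)}
    if user.is_authenticated():
        groups["private"] = SavedReport.objects.filter(owner=user)
        for team in user.groups.all():
            team_reports = SavedReport.objects.filter(group=team)
            if team_reports.exists():
                # Teams without any reports are simply absent from the result.
                groups.setdefault("team", {})[team.name] = team_reports
    return groups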

Clicking any element in the sidebar would take the user directly to that report (the URL would be constant so it can be easily shared with other users). The user should also be able to remove reports directly from that page. Each report can be removed by the report owner or by a team member (for team reports). The content area would have a simple wizard (perhaps the first out of several pages/steps) to create a new report. A detailed mockup for the wizard is shown below.

  • new-report-wizard-initial-page.jpg

The first page is kept as simple as possible; subsequent pages will feature more specialized views (adding more detailed configuration options) and previews (where technically possible).

The first page would allow the user to select just two pieces of data (a form sketch follows the list):

  • Type of report to make. Initially this would just be a benchmark report that looks at measurement values of various test results from a particular test case (more about that later). The name should be either very intuitive or be accompanied by a caption explaining what the report does.
  • Test to select data from. Initially a simple combo box; this might be upgraded to an AJAX-based autocomplete text field if we start to get lots of different tests.
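A hedged sketch of the first wizard page as a Django form; the ReportClass model is described under Models below, the Test model and both app paths are assumptions made for illustration:

# Hypothetical first-page form for the new report wizard.
from django import forms
from dashboard_reports.models import ReportClass  # hypothetical app path
from dashboard_app.models import Test             # hypothetical app path

class NewReportForm(forms.Form):
    # Type of report to make; the help text doubles as the explanatory caption.
    report_class = forms.ModelChoiceField(
        queryset=ReportClass.objects.all(),
        label="Type of report",
        help_text="Benchmark: chart measurement values from a test case")
    # Simple combo box for now; could become an AJAX autocomplete later.
    test = forms.ModelChoiceField(
        queryset=Test.objects.all(),
        label="Test to select data from")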

The continue button would take the user to the dedicated wizard which is discussed below.

In general each subsequent step/page would conform to this template:

  • new-report-wizard-steps.jpg

Each wizard page would have three major parts:

  • The top part contains a list of steps the user has already completed. It should be possible to click on any of them to adjust values, but this might be difficult if the change would alter subsequent steps heavily. This is something that we might not implement initially, or it could be implemented as a simple always-discard policy (with an appropriate notification that the user must confirm).
  • The central part contains widgets/forms/manipulators relevant to the current step. This is mostly free-form, but we might have some common theme if it makes sense. In particular the position of the preview (if any) and the continue buttons should be consistent with other steps/reports.
  • The bottom part contains future steps so that the user is more aware of how long the process is likely to take.

Implementation

Implementation should start as a branch of launch-control. Additional features that need to alter the core should be carefully managed; the rest should be developed as a new application (a separate directory still tracked in the same branch for simplicity). Before inclusion the app should be separated into a standalone branch (hopefully with bzr-split) and the changes to the core should land in trunk.

Each new report class would be a dedicated application. This could allow administrators to deploy only the apps their users care about and, most importantly, allow unrelated developers to contribute to the core system by writing vertical solutions based on common platform APIs. Each application would need to conform to some standard entry point/registration mechanics so that it gets picked up by the dashboard application (one possible approach is sketched below).
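One possible registration mechanism (an assumption, not a decided design) is setuptools entry points: each report application advertises its IReportEntryPoint implementation under an agreed group name and the dashboard discovers all installed apps at startup. The group name "launch_control.reports" used here is illustrative only.

# Hypothetical discovery of report applications via setuptools entry points.
import pkg_resources

def discover_report_classes():
    """Yield (name, entry point class) pairs for all installed report apps."""
    for entry_point in pkg_resources.iter_entry_points("launch_control.reports"):
        yield entry_point.name, entry_point.load()

# In each report application's setup.py (illustrative):
#   entry_points={
#       "launch_control.reports": [
#           "benchmark = benchmark_report.entry:BenchmarkReport",
#       ],
#   }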

Models

  • models.jpg

The following models are needed (a rough Django sketch follows the list):

  • model for storing saved reports called SavedReport (configuration document, access rights (owner/group), name, etc)

  • model for storing registered report apps and their configuration schema called ReportClass (schema for configuration document, name, python entry point)

  • model for storing registered data sources (report class (app name), name, python entry point)
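A rough sketch of these models in Django ORM terms; the field names, field types and the DataSource model name are assumptions made for illustration, not a final schema:

# Hypothetical Django models; names and fields are illustrative.
from django.contrib.auth.models import Group, User
from django.db import models

class ReportClass(models.Model):
    """A registered report application."""
    name = models.CharField(max_length=100, unique=True)
    entry_point = models.CharField(max_length=200)  # python entry point
    configuration_schema = models.TextField()       # schema for the config document

class DataSource(models.Model):
    """A registered data source exported by a report application."""
    report_class = models.ForeignKey(ReportClass)
    name = models.CharField(max_length=100)
    entry_point = models.CharField(max_length=200)  # python entry point

class SavedReport(models.Model):
    """A stored report: configuration document plus access rights."""
    name = models.CharField(max_length=100)
    report_class = models.ForeignKey(ReportClass)
    configuration = models.TextField()  # JSON configuration document
    owner = models.ForeignKey(User, null=True, blank=True)
    group = models.ForeignKey(Group, null=True, blank=True)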

Saved Reports

Each stored report (the configuration parameters for generating a report against live data) would be a JSON document stored in the system. This way adding a new report type does not require altering the database schema, nor does it require some elaborate intermediate data format that is expensive to compute. Each report class would contain a schema that the configuration document must conform to.
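For illustration only, a benchmark report's configuration document might look like this; the keys are assumptions, the actual schema belongs to the report class:

import json

# Hypothetical configuration document for a benchmark report.
configuration_document = json.dumps({
    "report_class": "benchmark",
    "test": "stream",
    "test_case": "copy-bandwidth",
    "chart": {"type": "line", "title": "STREAM copy bandwidth over time"},
})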

Views

The following views are needed:

  • view implementing the sidebar with saved reports (taking into account user identity and group membership)
  • views and forms implementing new report wizard mechanics (for the main page/new report)
  • views implementing the data source adaptation layers (for reuse in apps); a sketch of the CSV variant follows the list:
    • view implementing AJAX access to any data source and configuration document
    • view implementing comma-separated-values access to any data source and configuration document
    • view implementing an HTML table for any data source and configuration document
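A hedged sketch of the comma-separated-values adaptation mentioned above, assuming a data source already prepared through IDataSourceEntryPoint.prepare() (see the interfaces at the end of this document); URL routing and permission checks are omitted:

# Hypothetical CSV export helper for any prepared data source.
import csv
from django.http import HttpResponse

def data_source_csv(request, data_source):
    """Render every series of a prepared data source as CSV rows."""
    response = HttpResponse(content_type="text/csv")
    response["Content-Disposition"] = "attachment; filename=report.csv"
    writer = csv.writer(response)
    for series_name in data_source.get_data_series():
        for row in data_source.get_series_data(series_name):
            writer.writerow([series_name] + list(row))
    return response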

The following XML-RPC extensions are needed:

  • set of XML-RPC methods (exported with linaro-django-xmlrpc) implementing data access to any data source and configuration document (for scripting)

Exporting data

Each report application must register data sources. Data sources would be generalized methods that take a configuration object and return a collection of results. All of that data would then be exported by the main reporting application as an HTML table, as a chart (based on the app-specific view and AJAX access to the data source) or over XML-RPC (for scripting). Details need to be specified later. A minimal data source sketch is shown below.
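To make the contract concrete, here is a hedged sketch of what a benchmark report's data source might look like, following the IDataSourceEntryPoint and IDataSource interfaces specified below; the tuple layout, configuration keys and query helper are assumptions:

# Hypothetical data source exported by a benchmark report application.
class BenchmarkDataSourceEntryPoint(object):
    """IDataSourceEntryPoint implementation for benchmark measurements."""

    def prepare(self, configuration_object, data_since=None, data_to=None):
        return BenchmarkDataSource(configuration_object, data_since, data_to)

class BenchmarkDataSource(object):
    """IDataSource implementation returning (timestamp, measurement) tuples."""

    def __init__(self, configuration_object, data_since, data_to):
        self.config = configuration_object
        self.data_since = data_since
        self.data_to = data_to

    def get_data_series(self):
        # A single series named after the configured test case (illustrative).
        return [self.config["test_case"]]

    def get_series_data(self, name):
        # Query stored test results; query_test_results() is a hypothetical helper.
        results = query_test_results(
            test=self.config["test"], test_case=name,
            since=self.data_since, until=self.data_to)
        return [(result.timestamp, result.measurement) for result in results]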

Rendering Charts

Unless there is a reason to change this, each "chart" should be built from server-side data (an AJAX request to one of the exported data sources) and a client-side renderer (Open Flash Chart 2: http://teethgrinder.co.uk/open-flash-chart/).
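A minimal sketch of the server side of such a chart, returning the series of a prepared data source as JSON; the payload layout here is generic, and the exact document structure expected by Open Flash Chart 2 would need to be confirmed against its documentation:

# Hypothetical AJAX view serving chart data as JSON to the client renderer.
import json
from django.http import HttpResponse

def chart_data(request, data_source):
    """Serialize every series of a prepared data source for charting."""
    payload = {
        "series": [
            {"name": name, "values": list(data_source.get_series_data(name))}
            for name in data_source.get_data_series()
        ],
    }
    return HttpResponse(json.dumps(payload), content_type="application/json")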

Interfaces for sub-applications

class IReportEntryPoint(object):
    """
    Entry point for calling into a third party report application.
    """

    def render_saved_report(self, request, configuration_object, data_since=None, data_to=None):
        """
        Ask the application to render a saved report using the specified configuration object.
        Optionally limit the considered data using the specified timestamps.

        The report view should use the proper base template and redefine appropriate blocks to retain the common UI theme.

        Returns an HTTP response.
        """

    def render_new_report_wizard(self, request):
        """
        Ask the application to render a new saved report wizard.

        The wizard should use the proper base template and redefine appropriate blocks to retain the common UI theme.

        Returns an HTTP response.
        """


class IDataSourceEntryPoint(object):
    """
    Entry point for calling into third party data sources.
    """

    def prepare(self, configuration_object, data_since=None, data_to=None):
        """
        Prepare for working with the specified configuration object.
        Optionally limit the considered data using the specified timestamps.

        Returns a new DataSource instance initialized with those parameters.
        """


class IDataSource(object):
    """
    Interface for talking to an instantiated data source.
    A data source is a collection of data series, each being a tuple of ...
    """

    def get_data_series(self):
        """
        Get a list of data series names.
        """

    def get_series_data(self, name):
        """
        Return all data for the specified data series as an iterable of tuples.
        """

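To illustrate how the dashboard application might drive these interfaces together, a minimal sketch of the view that renders a saved report; the SavedReport fields, the app path and the load_entry_point() helper are assumptions carried over from the earlier sketches:

# Hypothetical glue code in the dashboard application.
import json

from dashboard_reports.models import SavedReport  # hypothetical app path

def render_saved_report_view(request, report_id):
    """Resolve a SavedReport and delegate rendering to its report application."""
    saved_report = SavedReport.objects.get(pk=report_id)
    # load_entry_point() is a hypothetical helper resolving the stored
    # python entry point string into the IReportEntryPoint implementation.
    report = load_entry_point(saved_report.report_class.entry_point)()
    configuration_object = json.loads(saved_report.configuration)
    return report.render_saved_report(request, configuration_object)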
