Long-term plans for validation in Linaro: the purpose of LAVA, validation of the new features Linaro works on, and so on. The aims here are to:
- Find what Linaro should concentrate on (versus our members and OEMs)
- Understand what the LAVA 'product' is: its scope, its license, whether members can add tests to it, etc.
- Understand what the process is for testing in Linaro (including manual tests and LAVA automated tests)
- Decide where we focus our validation efforts, and what type of validation we do
- Decide what type of distribution LAVA should have, and what the LAVA product is
- Work out how we gather ideas and requirements from each member, and which similar products we should look at
Orienting our validation efforts
There's a desire from some members to associate a test plan / a set of tests / a testing report with the delivery of new work. We also need to think about the format in which tests or test results are delivered. What should we do:
- Do we release test cases?
- Do we produce some kind of report of what was tested and how it was tested?
- In which format?
- Are there examples from members we can follow?
Types and amount of validation
Not every card needs the same amount of validation; the appropriate level of testing could be part of the debate when discussing and approving new cards. It's very much a case-by-case decision. Questions:
- How much manual/human testing should we do, and do we report on it?
- Automated tests are obviously preferable, but they take longer to develop and aren't necessarily worth it for every new piece of work; sometimes the developer just boots their kernel and tests the feature manually, and that may well be good enough for small developments.
What to validate?
- What do members do? We should avoid duplicating existing effort.
- Focus on a few key areas; suggestions:
- big.LITTLE validation
- power management (deliverable results)
- We need to consider hard tests aimed at breaking systems: for example, automatically connecting to and disconnecting from a wifi network 1000 times, or suspending/resuming in a loop until something breaks.
- Are the boards Linaro works on representative enough of real products to be used for this? The wifi chip shipped in a phone might not be the same one selected for a development board. It's also questionable whether they provide adequate performance characteristics for load testing, benchmarks and performance measurement, or for power management measures.
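The loop-until-failure idea above can be sketched as a small harness. This is a minimal illustration, not an existing Linaro tool; `do_suspend_resume` is a hypothetical stand-in for whatever action (wifi reconnect, suspend/resume cycle) a real test would drive:

```python
def stress_loop(action, iterations=1000):
    """Run `action` repeatedly until it fails or `iterations` is reached.

    Returns (completed, failed): how many iterations succeeded, and
    whether the loop stopped because of a failure. That is enough to
    report "broke after N cycles" in a test result.
    """
    for i in range(iterations):
        if not action():
            return i, True
    return iterations, False

def do_suspend_resume():
    # Hypothetical action: a real test might shell out to something like
    # rtcwake to suspend and resume via an RTC alarm, returning True on
    # a clean resume. Here it is a placeholder that always passes.
    return True

if __name__ == "__main__":
    completed, failed = stress_loop(do_suspend_resume, iterations=1000)
    print(f"{completed} iterations completed, failure seen: {failed}")
```

The same harness works for any of the "break it" scenarios: only the `action` callable changes per test.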
LAVA as a product
So far in LAVA we have concentrated on continuous integration. Running tests every cycle brings a lot of value to centralizing some classes of tests in LAVA: it lets us identify regressions quickly and see improvements over time. However, there are some product questions that need answering:
- Should think about how we allow members to test these features within their SoC bring up / Landing Teams
- Could tests be exported into another test environment?
- Should members be able to add their tests into LAVA, at least in their Landing Teams?
- Should LAVA continue to be open source and run by Linaro as a service to members?
- Should LAVA be a separately installable 'product'?
- Do members expect to integrate it into their environment?
- What about ODMs/OEMs/ISVs/IHVs?
- Are we trying to create our own project or join in with other projects?
- Is it worthwhile to research what competing products are doing?
- Wind River has its own test farm; what features do they have that we don't?
- What kind of tests do they run that we don't?
- What do they plan to add?
- What kind of requirements do Linaro members have internally for testing of their own developments?
- Are there specific standards that they comply with?
- Is it worthwhile to organise one-to-one meetings with each member's validation team(s)?
NOTE: Keep in mind that a risk for Linaro is delivering less because we set the bar too high on the quality and standards compliance of our tests.
TSC/2012-02-22/ValidationFutures (last modified 2012-02-22 11:31:34)