Zephyr networking testing in LAVA, was: Re: Network forum agenda
Paul Sokolovsky
Hello,
On Mon, 6 Apr 2020 21:44:27 +0300 "Paul Sokolovsky via lists.zephyrproject.org" <paul.sokolovsky=linaro.org@lists.zephyrproject.org> wrote:

[]

> If there is time, I'd like to share some progress on setting up CI

I appreciate being able to present my work quickly, and the discussion of testing matters which followed. As it was just a quick spoken presentation, I'd like to share a few links showing more details, with the idea of keeping the wider community in the loop on testing efforts around Zephyr.

So, in this work the Linaro LITE team uses the LAVA system (Linaro Automation and Validation Architecture), which is an open source project at https://www.lavasoftware.org/ (we run a particular deployment at https://lite.validation.linaro.org/).

How it works is that we build Zephyr tests/samples in Jenkins (using the standard Zephyr "sanitycheck" tool), then submit the binaries to LAVA, accompanied by a "test job definition", which is a YAML file like https://lite.validation.linaro.org/scheduler/job/960800/multinode_definition#defline1 . The job is then run, with the log of interaction recorded and analyzed for success/failure.
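For those who haven't worked with LAVA multinode jobs, the overall shape of such a definition is roughly as below. This is only an illustrative sketch (the job name, role names, counts, IP address and comments are mine, and I've elided the deploy/boot details); the linked job definition above is the real thing:

job_name: zephyr-net-ping-frdm-k64f
visibility: public
priority: medium
timeouts:
  job:
    minutes: 15

protocols:
  lava-multinode:
    roles:
      dut:
        device_type: frdm-k64f
        count: 1
      host:
        device_type: docker
        count: 1

actions:
  # deploy and boot actions for the "dut" role (flashing the binary built
  # by sanitycheck) and for the "host" role (starting the container) go
  # here; see the linked definition for the exact methods used.
  - test:
      role:
        - host
      definitions:
        - from: inline
          name: ping-dut
          path: inline/ping-dut.yaml
          repository:
            metadata:
              format: Lava-Test Test Definition 1.0
              name: ping-dut
              description: ping the Zephyr DUT from the host container
            run:
              steps:
                - lava-test-case ping --shell ping -c 5 192.168.1.50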
In this case it's a networking test which involves 2 "nodes": the DUT (device under test) proper, an FRDM-K64F board: https://lite.validation.linaro.org/scheduler/job/960800.0 , and a docker container representing "a host": https://lite.validation.linaro.org/scheduler/job/960800.1#L56 .

Here, the actual test interaction happens on the host, which starts with a simple ping of the device, then pings it again with full Ethernet frames, then does a "poor man's flood ping" of pinging 1000 times with full packets at a 10 ms interval. All these actions are encoded in the YAML definition and are easily reconfigurable.
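As a sketch of what that looks like in the test definition's run steps (again, the address and test-case names here are illustrative; the real job may obtain the device's address via the multinode API rather than hard-coding it):

run:
  steps:
    # basic reachability check
    - lava-test-case ping-basic --shell ping -c 5 192.168.1.50
    # payload sized so each request fills a full Ethernet frame
    # (1472 data + 8 ICMP + 20 IPv4 header bytes = 1500-byte MTU)
    - lava-test-case ping-full --shell ping -c 5 -s 1472 192.168.1.50
    # "poor man's flood ping": 1000 full-size packets at a 10 ms interval
    # (intervals below 200 ms require root with the stock iputils ping)
    - lava-test-case ping-flood --shell ping -c 1000 -i 0.01 -s 1472 192.168.1.50

Each lava-test-case records pass/fail from the exit status of its shell command, which is what feeds the per-action results LAVA records.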
LAVA checks that the outcome of each action satisfies its success criteria and records the overall results, e.g. https://lite.validation.linaro.org/results/960801/0_ping .

The biggest value of such a system would come from early notification of failures and the ability to compare results over time. The best ways to achieve that are still under investigation (the whole work is largely a prototype at this stage).

As discussed yesterday, we should all by now be aware that the "Zephyr testing" bastion is being stormed by multiple stakeholders in different ways, and I just wanted to share Linaro's approach and progress with the wider community. While the primary drivers for this work are the requirements of our members interested in Zephyr, who have already adopted the LAVA system, the work itself is open source, the results are public, and hopefully useful for the wider Zephyr community. (And different teams working on testing definitely should reuse the results of each other's work, and further the best practices for making Zephyr more testable and quality-assured.)

Thanks,
Paul

Linaro.org | Open source software for ARM SoCs
Follow Linaro: http://www.facebook.com/pages/Linaro
http://twitter.com/#!/linaroorg - http://www.linaro.org/linaro-blog