
Test Strategy

This is a draft version - please send your suggestions to ada.cabrales(at)intel.com

General

  • STX will have multiple releases per year
  • Testing is split into layers.
  • Because different partners will be contributing features to STX, each partner is responsible for testing the features it contributes in order to maintain the stability of the product.
  • Due to the nature of open source projects, testing will be heavily reliant on automation.
  • Test assets will be contributed to a sub-project repository that will be accessible to the community (creation in progress)
  • The [framework] automation framework will be used (definition in progress).

Partner Responsibilities

Feature Contributor / Developer

  • Create a feature test plan and share it with the community for feedback
  • Identify regression test cases for the feature in the test plan. These tests will become part of the Feature Regression Test Suite
  • Fully test the feature
  • Execute system sanity before submitting the code
  • Contribute unit tests related to the feature in Zuul
  • Create component-level automated tests and contribute them in Zuul
  • Contribute automated feature tests in the test repository
  • Contribution of test cases will be part of the feature acceptance criteria
  • Participate in integration and regression testing: in addition to feature testing, each partner will be asked to take part in regression testing

Testing Team

  • Enable the testing platforms and frameworks to be used (Zuul, functional testing framework, etc.)
  • Provide guidelines for test creation
  • Automate functional tests
  • Collaborate with developers on feature test automation
  • Maintain the test management system and its content
  • Build the release plan
  • Continuously test the product and report bugs

Layer Descriptions

Layer-1: Unit Testing

  • Unit testing for features will be conducted by the partner contributing the feature
  • The partner contributing a feature is expected to create unit tests for the new feature and make them part of the Zuul test runs (see the sketch after this list)
  • The Zuul framework is currently available and unit tests are running; coverage needs to increase
  • High-risk areas lacking sufficient unit test coverage need to be identified so that coverage can be improved
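
As a minimal sketch of the kind of unit test that could run under Zuul, consider the following pytest example. The module and function shown here are hypothetical illustrations, not actual StarlingX code.

    # test_severity.py - minimal pytest-style unit test sketch.
    # normalize_severity() is a hypothetical function, inlined here so the
    # example is self-contained; it is not actual StarlingX code.

    import pytest

    def normalize_severity(severity: str) -> str:
        """Map free-form severity strings to a canonical form."""
        canonical = {"crit": "critical", "maj": "major", "min": "minor"}
        key = severity.strip().lower()
        return canonical.get(key, key)

    @pytest.mark.parametrize("raw,expected", [
        ("CRIT", "critical"),
        (" maj ", "major"),
        ("warning", "warning"),  # unknown values pass through unchanged
    ])
    def test_normalize_severity(raw, expected):
        assert normalize_severity(raw) == expected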

Layer-2: Component Testing

  • Component testing can be done by bringing up services in DevStack and running internal API tests (see the sketch after this list)
  • It is the responsibility of the partner contributing the feature to create these tests and make them part of the Zuul test runs
  • It is the responsibility of the service owner to add coverage for the existing code
  • The Zuul framework is available for running component-level tests; there is currently no coverage at this level
  • All tests should be visible and available to the community
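
A component-level test at this layer might look like the following sketch, run against a service brought up in DevStack. The SERVICE_URL variable, the /v1/health endpoint, and the JSON shape asserted here are illustrative assumptions, not the actual StarlingX service API.

    # test_service_api.py - sketch of a component-level API test.
    # Endpoint and response shape are assumptions for illustration only.

    import os

    import requests

    SERVICE_URL = os.environ.get("SERVICE_URL", "http://127.0.0.1:8080")

    def test_service_reports_healthy():
        # Hypothetical health endpoint; adapt to the service under test.
        resp = requests.get(SERVICE_URL + "/v1/health", timeout=10)
        assert resp.status_code == 200
        assert resp.json().get("status") == "ok"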

Layer-3: Functional Testing (L – 3.0: Feature Testing)

  • Functional testing comprises feature testing, system sanity, regression, and system regression (manual and automated) testing
  • Each partner is required to create a test plan for its feature and share it with the community for feedback. A template for writing feature test plans will be created and shared with the community
  • Core test cases for the feature that will become part of the regression test suite need to be identified (see the sketch after this list)
  • All automatable test cases should be automated and contributed to the test repository so that they are executed on a regular basis
  • Every feature should be thoroughly tested by the contributing partner
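
Core test cases identified for regression could be tagged so that scheduled runs select only them. A minimal pytest sketch follows; the "regression" marker name and the test body are illustrative, and the marker would need to be registered in pytest.ini to avoid warnings.

    # Sketch of tagging a feature's core test case for the regression suite.

    import pytest

    @pytest.mark.regression  # scheduled runs select these via: pytest -m regression
    def test_feature_survives_service_restart():
        # Placeholder standing in for real feature verification logic.
        assert True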

Layer-3: Functional Testing (L – 3.1: System Sanity)

  • System sanity consists of automated test cases that exercise core functionality
  • System Sanity test cases should be automated and made available to the community
  • It is the responsibility of each partner to make sure its changes do not break system sanity. The system sanity suite can be downloaded and executed locally
  • Shared infrastructure is not currently available for running system sanity. As a next step, infrastructure is needed that executes system sanity for every successful ISO generated on CENGN
  • Generate a Docker image as part of the build containing the system sanity tests, with all Robot Framework and pytest code packaged and ready for execution. Inputs would be the address of the installed target load and the group of tests to execute; the output would be a report of test results, ready for uploading to the test result dashboard (see the sketch after this list)
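
A minimal sketch of an entrypoint for such a container is shown below. The TARGET_ADDRESS and TEST_GROUP environment variable names, the report path, and the custom --target option are assumptions for illustration, not an agreed StarlingX interface.

    # run_sanity.py - illustrative entrypoint for a sanity-test container.

    import os
    import subprocess
    import sys

    def main() -> int:
        target = os.environ.get("TARGET_ADDRESS")       # installed load to test
        group = os.environ.get("TEST_GROUP", "sanity")  # subset of tests to run
        if not target:
            print("TARGET_ADDRESS must be set", file=sys.stderr)
            return 2

        # Run the pytest-based tests for the requested group and emit a
        # JUnit-style XML report that a results dashboard could ingest.
        return subprocess.call([
            "pytest", "-m", group,
            "--junitxml=/reports/results.xml",
            "--target=" + target,  # hypothetical option registered in conftest.py
        ])

    if __name__ == "__main__":
        sys.exit(main())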

Layer-3: Functional Testing (L – 3.2: Regression)

  • Regression consists of manual and automated test cases identified from each feature
  • The manual regression suite will be shared with the community before regression starts
  • The manual regression suite will be divided among partners for execution
  • Manual regression will be done on the release branch build; the release branch will be created approximately one month before the release
  • Automated regression will run on a regular cadence

Defect Reporting

  • All defects will be reported using LaunchPad
  • Defects should be created using the template proposed on the StarlingX wiki [1]
  • All defects will be screened and an appropriate release tag will be assigned

Next Steps

The next steps are:

  • Present STX Test Strategy to the community for feedback and acceptance
  • Select the automation framework – community backing is needed on a single automation framework
  • Create a test repository for gathering test assets that can be shared with the community
  • Create automated sanity suite that can be executed by community members
  • Create regression test plan (L – 3.2) and share it with community
  • Create an environment that executes system sanity every time there is a successful ISO creation on CENGN - Done
  • Generate a Docker image as part of the build containing all automated regression tests, with all Robot Framework and pytest code packaged and ready for execution. Inputs would be the address of the installed target load and the group of tests to execute; the output would be a report of test results, ready for uploading to the test result dashboard
  • Provide documentation for the framework and the available APIs
  • Investigate Tempest coverage for existing services and look for a way to include STX services
  • Determine the priority of Tempest work (question for the TSC)