RefStack/UseCases
Use cases are a powerful way of identifying requirements for a software project. In the initial phase of RefStack definition, these cases were used to drive design and architecture; they are also currently used to drive development milestones. (… indicates a first-pass feature.)
RefStack Use Cases
Ops: A Private Cloud Operator
I want to:
- preview results before they are public
- compare my test runs over time
- compare my test runs against clouds with similar characteristics (tagging)
- be able to ignore tests that are not important
- have tests grouped into capabilities so I am not overwhelmed
- highlight tests that are critical for operations
- be able to upload my Tempest results to RefStack for analysis
- be able to remove results
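To make the upload use case above concrete, here is a minimal sketch of packaging a Tempest run for submission. The field names (`cpid`, `duration_seconds`, `results`) and the endpoint URL follow the RefStack v1 results API as commonly documented, but should be treated as assumptions and verified against the deployed API before use.

```python
import json

# Assumed endpoint of the public RefStack instance; verify before submitting.
REFSTACK_URL = "https://refstack.openstack.org/api/v1/results"


def build_result_payload(cpid, duration_seconds, passed_tests):
    """Package a Tempest run as a RefStack-style results payload.

    cpid             -- cloud provider ID identifying the tested cloud
    duration_seconds -- total wall-clock duration of the test run
    passed_tests     -- iterable of fully qualified passing test names
    """
    return {
        "cpid": cpid,
        "duration_seconds": duration_seconds,
        "results": [{"name": name} for name in passed_tests],
    }


if __name__ == "__main__":
    payload = build_result_payload(
        "example-cloud-id",  # hypothetical cloud ID
        4200,
        ["tempest.api.compute.servers.test_list_servers"],
    )
    print(json.dumps(payload, indent=2))
    # To actually submit (requires network access, and a signing keypair
    # for results that should be attributable to a validated user):
    # import urllib.request
    # req = urllib.request.Request(
    #     REFSTACK_URL,
    #     data=json.dumps(payload).encode(),
    #     headers={"Content-Type": "application/json"})
    # urllib.request.urlopen(req)
```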
User: An OpenStack User
I want to:
- make sure that the capabilities I require are available
- Ops use cases 3, 4, 5 and 6 above
- be able to post results from my own tests
- (Foundation 7)
Prospect: a Prospective OpenStack User
I want to:
- compare results without uploading tests
- not be forced to create a Launchpad account to get data
- not see results that are not accurate (approved results only)
Vendor / Organization: an OpenStack Vendor
I want to:
- control the tags that can be applied to test runs against my cloud
- contest results that I disagree with
- be able to mark a result as official
- know that test submissions are coming from validated users
- be able to exclude tests for capabilities we have not implemented (Ops 4)
- be able to give customers information about my compatibility (ops 3)
- have results that I don’t control be statistically relevant
- expose aggregate results to prospects (if they are favorable)
Foundation: the OpenStack Foundation
I need/want:
- a way to determine which tests are passing in my install base
- a way for the community to indicate important capabilities
- a way to communicate which tests are “must pass” on a version-by-version basis
- a way to certify that a result includes all the “must pass” tests
- to indicate which organizations are in compliance
- a way to settle disputes (Ops 3)
- to highlight gaps in capability coverage
- to ensure that posted results are accurate
- a link between a RefStack result and the documentation for the appropriate version
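The certification requirement above reduces to a set check: a result is certifiable only when no “must pass” test for that release is missing. RefStack defines no such helper; this is an illustrative sketch, and the per-release “must pass” list is assumed to come from the Foundation’s capability definitions.

```python
def missing_must_pass(result_names, must_pass):
    """Return the 'must pass' tests absent from a submitted result set."""
    return sorted(set(must_pass) - set(result_names))


def is_certifiable(result_names, must_pass):
    """A result can be certified only if no 'must pass' test is missing."""
    return not missing_must_pass(result_names, must_pass)
```

For example, a result set covering every “must pass” test of a release would be certifiable even if it omits optional capability tests, while a single missing “must pass” test blocks certification and `missing_must_pass` names the gap.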
Component Vendor: relies on OpenStack APIs
I need to:
- be able to validate which clouds my product will work with (User 1)
- be able to upload results from tests that are outside the upstream suite
Requirements Map
| Use Case | OpenStack Foundation | Cloud Vendor | Component Vendor | Cloud Operator | User | Prospective User |
|---|---|---|---|---|---|---|
| Manage Test Runs | | | | | | |
| Identify/select test sets by capability | | | | | | |
| Highlight/select critical operational capability test sets | | | | | | |
| Run tests against cloud | | | | | | |
| Run selected subset of tests | | | | | | |
| View set of previously run tests when selecting for new run | | | | | | |
| Manage Test Results | | | | | | |
| Preview test results (before publishing) | | | | | | |
| Compare test results over time | | | | | | |
| Upload test results to RefStack for analysis | | | | | | |
| Publish test results to RefStack | | | | | | |
| Remove test results from RefStack | | | | | | |
| Compare my results against clouds with similar characteristics | | | | | | |