USE CASE
BBS Broadband Service Use Case (Dublin)
...
FUNCTIONAL & REGRESSION TESTING
Functional Testing - verifying, when new features & use cases are introduced, that the existing use cases are enhanced to be compatible with the new functionality. For example, if enhancements were needed in the VCPE or VOLTE U/C to support the PNF functionality introduced in R4 Dublin. ("Dictionary" Definition) Functional testing is a software testing process used within software development in which software is tested to ensure that it conforms with all requirements. Functional testing is a way of checking software to ensure that it has all the required functionality that's specified within its functional requirements.
Regression Testing - verifying that newly introduced features & use cases do not break existing use cases. For example, verifying that the PNF onboarding (PNF package) U/C introduced in R4 Dublin did not break the functionality of VCPE or VOLTE. ("Dictionary" Definition) Regression testing is re-running functional and non-functional tests to ensure that previously developed and tested software still performs after a change. Changes that may require regression testing include bug fixes, software enhancements, configuration changes, and even substitution of electronic components. As regression test suites tend to grow with each found defect, test automation is frequently involved. Sometimes a change impact analysis is performed to determine an appropriate subset of tests (non-regression analysis).
Integration Testing - testing which brings together the platform components and new development to realize a use case. For example, testing centered on making sure that the SDC, A&AI, SO, VID, etc. components work together to deliver the BBS Use Case. ONAP has an Integration project with a dedicated team coordinating the integration efforts. Additionally, each of the Use Cases discusses its integration work in its "roadmap". (An illustrative automation sketch follows these definitions.) ("Dictionary" Definition) Integration testing is a software testing methodology used to test individual software components or units of code to verify interaction between various software components and detect interface defects. Components are tested as a single group or organized in an iterative manner. After the integration testing has been performed on the components, they are readily available for system testing.
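Purely as an illustration of how these three categories can map onto an automated suite, the sketch below uses pytest markers to tag tests as functional, regression, or integration so that the relevant subset can be selected and re-run for each release. The component URLs, marker names, and test bodies are assumptions for this sketch, not real ONAP APIs or Integration project tests.

```python
# Illustrative sketch only: pytest markers tag tests as functional, regression,
# or integration so subsets can be re-run per release.
# The URLs and endpoints below are placeholders, not actual ONAP APIs.
import pytest
import requests

COMPONENTS = {
    "sdc": "https://sdc.example.onap:8443/healthcheck",   # placeholder
    "aai": "https://aai.example.onap:8443/healthcheck",   # placeholder
    "so":  "https://so.example.onap:8080/healthcheck",    # placeholder
}

@pytest.mark.functional
def test_new_pnf_package_onboarding():
    """Functional: exercise the newly introduced R4 feature (placeholder)."""
    assert True  # replace with calls that exercise the new PNF package flow

@pytest.mark.regression
def test_existing_vcpe_flow_still_works():
    """Regression: re-run an existing U/C check after new features land."""
    assert True  # replace with the previously passing vCPE checks

@pytest.mark.integration
@pytest.mark.parametrize("name,url", COMPONENTS.items())
def test_components_reachable(name, url):
    """Integration: verify the platform components respond together."""
    resp = requests.get(url, verify=False, timeout=10)
    assert resp.status_code == 200, f"{name} health check failed"
```

With this layout, `pytest -m regression` re-runs only the regression subset after a change, while `pytest -m integration` exercises the component-interaction checks.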
TESTING CONCERNS:
LIMITED RESOURCES - the Integration project (test team) in ONAP is a small team, which will depend on automation as much as possible to cover the Regression and Functional testing aspects.
RELEASE RISK - there is obviously a risk in adding new software without regression and functional testing: a lot of integration testing was done in R3 to make sure that the U/C introduced in R3 worked properly, and introducing new U/C, functionality, and requirements now could potentially break that already tested & working software.
AUTOMATION - it is desirable to have as much automation as possible in testing, so that tests can be re-run and existing functionality re-checked in the current release when new functionality is introduced.
LAB ENVIRONMENT - U/C were developed in particular lab environments, the xNFs are installed in particular environments, and these environments change over time. Limitations need to be identified when additional or new equipment is introduced, etc. (See the configuration sketch after this list.)
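One way to ease the automation and lab-environment concerns above is to keep lab-specific details out of the tests themselves, so the same automated suite can be re-run unchanged when the lab or xNF setup changes. The sketch below is a minimal illustration assuming a JSON config file and environment variables; the file name, variable names, and default addresses are assumptions, not actual ONAP settings.

```python
# Illustrative sketch only: load lab-specific endpoints from a config file or
# environment variables so automated tests stay re-runnable across labs.
# File name, variable names, and defaults are placeholders for this sketch.
import json
import os

def load_lab_config(path=None):
    """Return component endpoints for the current lab.

    Order of precedence: explicit JSON file, then environment variables,
    then local defaults, so the suite remains runnable in any environment.
    """
    path = path or os.environ.get("ONAP_LAB_CONFIG")
    if path and os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {
        "sdc": os.environ.get("SDC_URL", "https://127.0.0.1:8443"),
        "aai": os.environ.get("AAI_URL", "https://127.0.0.1:30233"),
        "so":  os.environ.get("SO_URL", "http://127.0.0.1:30277"),
    }

if __name__ == "__main__":
    # Quick check of what the suite would use in this lab.
    print(load_lab_config())
```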
Typical Software Industry Testing Process. ONAP Development & Integration testing can consider and incorporate some of the key concepts from a "typical" software industry testing process to ensure quality software is delivered for the Use Case development:
SUPPORTING DOCUMENTS
Document | File
---|---
M3 API Impacts Template (from Alla G.) |
SERVICE PROVIDER PRIORITIES
...