

We would like to automate as many use cases as possible.
This means moving from a declarative wiki status page, corresponding to a use case run on one lab, to a gate where the different use cases can be executed on any lab.


Today the gates already include some test suites.
These test suites are systematically replayed every day on the "Daily" chains and on the gating chain (on each patchset submission).


E.g. El Alto daily Gate

The test suites usually include several tests (e.g. onap_vnf includes basic_ubuntu, freeradius_nbi and clearwater_vims; security includes 6 tests).

To be integrated in CI, you must integrate your tests in one of the defined test categories (see Integration categories & testsuites discussions):

  • infrastructure-healthcheck
  • healthcheck
  • smoke-usecases
  • candidate-usecases
  • security
  • benchmark

Your tests must be fully automated...

Your tests must be fully automated. This means that they shall take into account the environment on which they are executed, set up the resources they need, run the tests, and delete the resources created for the test.
If you are using non open source VNFs or proprietary third-party components/equipment, you will not be able to fully automate.

Usually people start by automating the use cases partially: they develop scripts but do not always consider the setup/teardown phases (which should not be linked to the installation) and/or require some manual steps, preventing full automation.
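The lifecycle described above (set up the resources, run the tests, tear down) can be sketched as follows. All the function and resource names here are hypothetical placeholders for whatever your use case actually needs; the point is that teardown always runs, even when the test fails:

```python
# Minimal sketch of a fully automated test: setup, run, teardown.
# create_network / run_checks / delete_network are hypothetical placeholders.

def create_network():
    """Set up the resources the test needs (placeholder)."""
    return {"name": "test-net"}

def run_checks(net):
    """Run the actual test logic against the created resources (placeholder)."""
    return net["name"] == "test-net"

def delete_network(net):
    """Delete every resource created for the test (placeholder)."""
    net.clear()

def run_testcase():
    net = create_network()          # setup
    try:
        return run_checks(net)      # run
    finally:
        delete_network(net)         # teardown always runs, even on failure

print(run_testcase())
```

Scripts written this way need no manual step and no assumption about the installation, which is exactly what the gates require.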

You can develop your tests using any language (python, bash, robotframework, go, ...) and any framework.

...and integrated in a docker based on Xtesting

However, your test suites must be "xtestingized": they must be embedded in a Docker image leveraging Xtesting.

Xtesting is a light framework aiming to harmonize test inputs and outputs, which is very helpful for integration. It has been developed in OPNFV and already provides several infrastructure test dockers for OpenStack and Kubernetes, leveraging upstream tests (https://wiki.opnfv.org/pages/viewpage.action?pageId=13211751).

These tests have been selected for the CNTT verification gates.

It means that developers are free to do what they want, but at the end we add a light abstraction layer in order to:

  • launch the tests the same way
  • include a good management of the dependencies, self-contained in a docker
  • report the results the same way
  • optionally push the results to a test database
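The idea behind this abstraction can be pictured with a small stand-in base class. Note this is a simplified illustration, not the actual Xtesting API (in practice you subclass xtesting.core.testcase.TestCase); it only shows why every test, whatever its framework, ends up launchable and reportable the same way:

```python
import time

class TestCase:
    """Simplified stand-in for the Xtesting abstraction: every test is
    launched the same way (run()) and exposes the same fields, so the CI
    chain and the test database can consume any test identically."""
    EX_OK, EX_RUN_ERROR = 0, 1

    def __init__(self, case_name):
        self.case_name = case_name
        self.result = 0                       # success rate in percent
        self.start_time = self.stop_time = None

    def run(self, **kwargs):
        raise NotImplementedError             # each test suite implements this

    def push_to_db(self):
        # Optionally push the harmonized record to a test database.
        return {"case_name": self.case_name, "result": self.result,
                "duration": self.stop_time - self.start_time}

class MySuite(TestCase):
    """A hypothetical test suite wrapped behind the common interface."""
    def run(self, **kwargs):
        self.start_time = time.time()
        passed = 1 + 1 == 2                   # your real test logic goes here
        self.result = 100 if passed else 0
        self.stop_time = time.time()
        return self.EX_OK if passed else self.EX_RUN_ERROR

case = MySuite("my_suite")
print(case.run(), case.result)
```

Whatever the underlying framework (robot, bash, python, ...), the CI chain only ever sees run(), a return code and a harmonized result record.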

As a testsuite provider, you shall provide the entry point, list the needed dependencies and declare your tests.

You need to amend 3 files:

  • a Dockerfile (to include your needed framework/test code) - usually consists of installing/cloning repositories
  • a testcases.yaml file to declare your use case(s): in this file you declare the test driver (robot, python, bash, behave, ...*)
  • a requirements.txt file to indicate the dependencies
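As an illustration, a testcases.yaml entry may look like the following. The field names follow the usual Xtesting layout, but the case name, suite path and threshold are hypothetical, so check the real dockers for the exact schema:

```yaml
tiers:
  - name: smoke-usecases
    order: 1
    testcases:
      - case_name: my_usecase              # hypothetical test name
        project_name: integration
        criteria: 100                      # pass threshold (%)
        blocking: true
        run:
          name: robotframework             # the test driver (robot, python, bash, ...)
          args:
            suites:
              - /var/opt/tests/my_usecase.robot   # hypothetical path to your suite
```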

As an illustration, see https://gitlab.com/Orange-OpenSource/lfn/onap/integration/xtesting; you will see how the dockers are defined, built and consumed by the CI chains.

And if you are able to run your test in one click, do not hesitate to contact the integration team, which will guide you through the CI integration.
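"One click" typically means a single docker run invocation, along the lines of the sketch below. The image name, env file and test name are placeholders to adapt to your own suite:

```shell
# Placeholder image and test names: substitute your own.
docker run --rm \
  -v $(pwd)/env:/var/lib/xtesting/conf/env_file \
  my-registry/my-xtesting-image:latest \
  run_tests -t my_usecase
```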

*: note that Xtesting is open source, so if your favorite framework is missing you can add it.

Xtesting references:

Contacts

Morgan Richomme
