...

The testsuites usually include several tests (e.g. onap_vnf includes basic_ubuntu, freeradius_nbi and clearwater_vims; security includes 6 tests). Tests can be run sequentially or in parallel.

To be integrated in CI, you must integrate your tests in one of the defined test categories (see Integration categories & testsuites discussions):

  • infrastructure-healthcheck
  • healthcheck
  • smoke-usecases
  • candidate-usecases
  • security
  • benchmark

The testing part

1) Your tests must be fully automated...

Your tests must be fully automated. This means they must take into account the environment on which they are executed, set up the required resources, run the tests and delete the resources created for the test.
If you are using a non open source VNF or proprietary third-party components/equipment, you will not be able to fully automate.

...

You can develop your test using any language (Python, bash, Robot Framework, Go, ...) and any framework.

Ideally the use case owner shall be able to provide the automated procedure at this stage.

Please note that this part is the most important and represents about 95% of the work... writing good tests requires a good knowledge of the system under test.

2) ...and integrated in a docker based on Xtesting

As any framework is allowed, we may end up with a huge diversity of tests, test frameworks and test artifacts. In order to simplify the integration of the tests in any CI chain, we decided to leverage the OPNFV Xtesting framework.

Whatever your tests, they will always be launched the same way and generate consistent artifacts. As they will be dockerized, they will be portable to any CI/CD system.

Your test suites must then be "xtestingized": they must be embedded in a Docker image leveraging Xtesting.
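As an illustration, a minimal "xtestingized" Dockerfile could look like the sketch below. The test paths and file names are assumptions for the example, not the official ONAP files; only the `opnfv/xtesting` base image, the `/etc/xtesting/testcases.yaml` location and the `run_tests` entrypoint come from Xtesting itself.

```dockerfile
# Hypothetical example: package a Robot Framework test suite with Xtesting
FROM opnfv/xtesting

# Install the framework used by the suite (assumption: Robot Framework)
RUN pip3 install --no-cache-dir robotframework

# Copy the test suite and the Xtesting test case description
COPY my_tests /src/my_tests
COPY testcases.yaml /etc/xtesting/testcases.yaml

# All Xtesting dockers share the same entrypoint
CMD ["run_tests", "-t", "all"]
```

Such an image is then launched the same way as any other Xtesting docker, e.g. `docker run my_registry/my_new_test run_tests -t my_new_test` (registry and test names are illustrative).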

This task can be done by the integration team - the most difficult part is the test itself... If writing the test takes about 95% of the time, building an Xtesting docker is about 4%.


Xtesting is a light framework aiming to harmonize test inputs and outputs, which is very helpful for integration. It has been developed in OPNFV and already provides several infrastructure test dockers for OpenStack and Kubernetes, leveraging upstream tests (https://wiki.opnfv.org/pages/viewpage.action?pageId=13211751).
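Concretely, Xtesting harmonizes inputs through a testcases.yaml description file shipped in the docker. A sketch for a hypothetical Robot-based case (all names are illustrative) could be:

```yaml
---
tiers:
  - name: smoke-usecases
    order: 1
    testcases:
      - case_name: my_new_test        # must match the name used in the CI and in the DB
        project_name: integration
        criteria: 100
        blocking: false
        run:
          name: 'robotframework'      # test driver provided by Xtesting*
          args:
            suites:
              - /src/my_tests/my_test.robot
```

Whatever the driver (`robotframework`, bash, Python, ...), the resulting docker is launched and reports results the same way.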

...

*: note that Xtesting is open source, so if your favorite framework is missing you can add it.

The CI part

This last part represents the missing 1%; it mainly deals with the declaration of the test case in the DB and in the CI system.

1) Your tests must be declared in the DB...

For that you need access to the common test DB used by OPNFV and ONAP: testresults.opnfv.org

An account onap-integration has been created. Public SSH keys of the contributors have been pushed to grant access to this machine.

To log in, use:

Code Block
languagebash
$ ssh -i <key path> onap-integration@testresults.opnfv.org


Once logged in, you can directly access the database.

...

Create a new use case

Code Block
languagebash
singleNodeRepl:PRIMARY> db.testcases.insert({'name':'basic_cnf', 'project_name': 'integration', 'tier':'smoke-usecases', 'version': '>guilin', 'description':'Test the creation of simple cnf (nginx)', 'domains': 'onap', 'tags':'cnf,multicloud,so,k8splugin', 'catalog_description':'basic cnf'})

It is obviously possible to update and delete entries using the usual Mongo commands.
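For example, updating or removing the entry created above could look like the following (the field values are illustrative; `update` and `remove` are standard mongo shell commands):

```
singleNodeRepl:PRIMARY> db.testcases.update({'name': 'basic_cnf'}, {$set: {'description': 'Test the creation of a simple cnf (nginx)'}})
singleNodeRepl:PRIMARY> db.testcases.remove({'name': 'basic_cnf'})
```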

2) ... and in the CI

Once everything is declared in the DB, you can add the test to the CI chain.

This is done in the xtesting-onap project hosted on gitlab.com (as the CI is based on gitlab-ci).

You can modify the gitlab-ci configuration and create a merge request.

As the tests are embedded in Xtesting dockers, the integration in CI is, in theory, simple because it has already been done for other tests (robot, bash, python, ...).

Edit https://gitlab.com/Orange-OpenSource/lfn/onap/xtesting-onap/-/blob/master/gitlab-ci/base.yml

Code Block
languageyml
...
.MY_NEW_TEST: &MY_NEW_TEST
  variables:
    run_tiers: xtesting-smoke-usecases-robot
    run_type: my_new_test  # the name declared in the xtesting entrypoint (shall also be declared in the DB if you want to push the results to the DB)
    run_timeout: 1200
  stage: smoke-usecases
  allow_failure: true
  <<: *get_artifact
  <<: *runner_tags
  <<: *run_smoke_usecase_robot
  <<: *manage_artifacts

...

MY_NEW_TEST:
  <<: *MY_NEW_TEST
  only:
    refs:
      - triggers



You can assign the test to one of the declared stages/categories (infrastructure-healthcheck, healthcheck, smoke-usecases, onap-security).

Xtesting references:

...