Pythonsdk-tests wrapper for simulators

Note: This page is for discussion and aims to define the possible requirements for a simulator wrapper within pythonsdk-tests.

@Andreas Geißler @Michał Jagiełło @Illia Halych @Krzysztof Kuzmicki



When we launch E2E tests, we sometimes need to launch a third-party simulator (e.g. pnf-simulator).

It would be a cool feature if we could have wrappers within pythonsdk-tests to start/stop/configure a simulator and call its REST API as prerequisite steps.

It obviously depends on the simulators and their ability to be controlled and to offer a REST API for the test.

We may, however, define an API that will allow pythonsdk-tests to consume such simulators.

Any simulator

  • shall be hosted in a dedicated repository (do one thing and do it well)

  • shall include a Docker build chain and be runnable through a docker run command

  • shall be available in the ONAP Nexus

Through the standard Docker commands, it shall be possible to (see the sketch below)

  • start the simulator

  • start it with a specific configuration

  • stop the simulator

If the simulator supports it, a REST API shall be available to reconfigure it or to trigger workflows.
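As an illustration, here is a minimal sketch of driving those standard Docker commands from Python with the docker SDK. The image name, port, and environment values are placeholders, not actual simulator settings.

```python
import docker  # pip install docker

# Placeholder image; the real one would be pulled from the ONAP Nexus.
SIMULATOR_IMAGE = "nexus3.onap.org:10001/onap/pnf-simulator:latest"

client = docker.from_env()

# Start the simulator with a specific configuration, here passed as an
# environment variable (a mounted config file would work the same way).
container = client.containers.run(
    SIMULATOR_IMAGE,
    detach=True,
    name="pnf-simulator",
    ports={"5000/tcp": 5000},  # hypothetical REST API port
    environment={"SIM_CONFIG": "/config/sim.yaml"},
)

# ... run the test against the simulator ...

# Stop the simulator and clean up.
container.stop()
container.remove()
```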

The wrapper step within pythonsdk-tests shall (a sketch follows the list)

  • detect whether the simulator is available (if not, the test shall fail immediately with a simulatorNotAvailable exception)

  • launch the simulator

  • get the simulator status

  • stop the simulator

  • exchange with the simulator if an API is available
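A hypothetical sketch of what such a wrapper step could look like; the class, exception, and endpoint names (SimulatorStep, /status, /workflow) are illustrative, not the actual pythonsdk-tests API.

```python
import requests


class SimulatorNotAvailable(Exception):
    """Raised when the simulator cannot be reached, failing the test immediately."""


class SimulatorStep:
    """Illustrative wrapper step; all names and endpoints are placeholders."""

    def __init__(self, base_url: str = "http://localhost:5000"):
        self.base_url = base_url

    def is_available(self) -> bool:
        """Detect whether the simulator is reachable at all."""
        try:
            return requests.get(f"{self.base_url}/status", timeout=5).ok
        except requests.RequestException:
            return False

    def execute(self) -> None:
        """Prerequisite step: fail fast when the simulator is missing."""
        if not self.is_available():
            raise SimulatorNotAvailable(f"no simulator answering at {self.base_url}")

    def get_status(self) -> dict:
        """Get the simulator status through its REST API."""
        return requests.get(f"{self.base_url}/status", timeout=5).json()

    def trigger_workflow(self, payload: dict) -> None:
        """Exchange with the simulator through its REST API, when it offers one."""
        requests.post(f"{self.base_url}/workflow", json=payload, timeout=5).raise_for_status()
```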

Open questions

Use of a submodule from pythonsdk to discover the available simulators? How to customize the configuration?



The "scalable steps" solution

Concept: INT-1812: Implement the "scalable steps" solution for simulator wrapper (Closed)

Implementation: INT-1829: A wrapper for simulators (Closed)

Author: @Illia Halych

Problem: all simulators are different; there is no trivial solution that fits them all.

Challenges:

  1. Include general functionalities, simulator-specific functionalities, and abstractions.

  2. Make the wrapper flexible enough to be extended for specific needs.

  3. Patch custom configurations.

  4. Clean up when something fails.

Concept:

  1. Use the step-by-step execution that is already available in pythonsdk-tests and implement the simulator setup as a step-by-step process.

  2. The step-by-step execution allows configurations to be patched into each independent step separately before it starts.

  3. The step-by-step execution allows a "rollback" (cleanup) from the step where the problem occurred, as sketched after this list.

  4. The step-by-step execution makes it possible to change the order of steps and to drop certain steps.

  5. The first (1) basic step has the lowest level of abstraction: the functionalities most common to all simulators.

  6. The zero (0) basic step would be a substitute for the first step (1) for complex models.

  7. The third (3) basic step has the highest level of abstraction: the functionalities least common across simulators.
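To make the rollback idea concrete, here is a self-contained sketch of step-by-step execution with cleanup in reverse order on failure. It only loosely mirrors the step pattern of pythonsdk-tests, and all names are hypothetical.

```python
class Step:
    """One unit of work that knows how to undo itself."""

    def __init__(self, name, config=None):
        self.name = name
        self.config = config or {}  # can be patched per step before execution

    def execute(self):
        print(f"executing {self.name} with {self.config}")

    def cleanup(self):
        print(f"cleaning up {self.name}")


def run_steps(steps):
    """Execute steps in order; on failure, clean up the completed ones in reverse."""
    done = []
    try:
        for step in steps:
            step.execute()
            done.append(step)
    except Exception:
        for step in reversed(done):
            step.cleanup()
        raise


# The list itself defines the order, so steps can be reordered or dropped
# without touching the step implementations.
run_steps([Step("pull image"), Step("start container", {"port": 5000}), Step("wait for API")])
```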

Solution:

  1. Simulator images and the rest of the setup are described in Helm charts. Charts should be stored in a remote repository and can be installed together with their dependencies (see the sketch after this list).

  2. Avionix officially supports Helm 3 only.

  3. STEP LEVEL 2 supports HTTP/HTTPS API calls.
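A minimal sketch of installing a simulator chart from a remote repository with Avionix under Helm 3, following the dependency pattern from Avionix's documentation. The chart name, version, and repository URL are placeholders; verify against the current Avionix API before reusing this.

```python
from avionix import ChartBuilder, ChartDependency, ChartInfo

# Placeholder chart coordinates; a real simulator chart would live in an ONAP repository.
builder = ChartBuilder(
    ChartInfo(
        api_version="3.2.4",
        name="simulator-wrapper",
        version="0.1.0",
        dependencies=[
            ChartDependency(
                "pnf-simulator",                    # dependency chart name (placeholder)
                "1.0.0",                            # dependency chart version (placeholder)
                "https://example.org/helm-charts",  # remote chart repository (placeholder)
                "simulator-repo",                   # local alias for the repository
                values={"replicaCount": 1},         # per-install configuration patch
            )
        ],
    ),
    [],  # no extra Kubernetes objects; everything comes from the dependency
)

# Pull the dependency and install the release into the current cluster.
builder.install_chart(options={"dependency-update": None})
```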