Pythonsdk-tests wrapper for simulators
Note: This page is for discussion and aims to define the possible requirements for a simulator wrapper within pythonsdk-tests.
@Andreas Geißler @Michał Jagiełło @Illia Halych @Krzysztof Kuzmicki
When we launch E2E tests, we sometimes need to launch a third-party simulator (e.g. pnf-simulator).
It would be a cool feature if we could have a wrapper within pythonsdk-tests to start/stop/configure a simulator and call its REST API as prerequisite steps.
It obviously depends on the simulators and their ability to be controlled and to offer a REST API for the test.
We may, however, imagine defining an API that will allow pythonsdk-tests to consume such simulators.
Any simulator
shall be hosted in a dedicated repository (do one thing and do it well)
shall include a Docker build chain and be runnable through a docker run command
shall be available in the ONAP Nexus
Through the standard Docker commands, it shall be possible to (as sketched after this list):
start the simulator
start with specific configuration
stop the simulator
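A minimal sketch of these three operations driven from Python, using the docker SDK for Python (an assumption; the wrapper could equally shell out to docker run). The image name, container names and mounted configuration path are placeholders, not the actual layout of any ONAP simulator:

```python
# Illustrative only: the image name, container names and config path are
# placeholders, not the actual layout of any ONAP simulator image.
import docker

client = docker.from_env()

# start the simulator
container = client.containers.run(
    "onap/some-simulator:latest",
    name="some-simulator",
    detach=True,
)

# start with a specific configuration (mount a config file into the container)
configured = client.containers.run(
    "onap/some-simulator:latest",
    name="some-simulator-custom",
    detach=True,
    volumes={"/tmp/simulator-config.yaml": {"bind": "/app/config.yaml", "mode": "ro"}},
)

# stop the simulator
container.stop()
container.remove()
configured.stop()
configured.remove()
```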
If the simulator supports it, a REST API shall be available to reconfigure it or trigger workflows.
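If such an API exists, the test can reconfigure the simulator or trigger a workflow with a plain HTTP call; the endpoint and payload below are purely hypothetical:

```python
import requests

# Hypothetical endpoint and payload; the real path and schema depend on the simulator.
response = requests.post(
    "http://localhost:5000/simulator/config",
    json={"vesServerUrl": "https://ves-collector:8443/eventListener/v7"},
    timeout=10,
)
response.raise_for_status()
```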
The wrapper step within pythonsdk-tests shall
detect whether the simulator is available (if not, the test shall fail immediately with a simulatorNotAvailable exception)
launch the simulator
get the simulator status
stop the simulator
exchange with the simulator if an API is available (see the sketch after this list)
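A minimal sketch of what such a wrapper-step contract could look like; the class and exception names (SimulatorWrapperStep, SimulatorNotAvailable) are assumptions and do not exist in pythonsdk-tests today:

```python
# Sketch of a possible wrapper-step contract; these names do not exist in
# pythonsdk-tests today.
from abc import ABC, abstractmethod


class SimulatorNotAvailable(Exception):
    """Raised when the simulator cannot be found or reached."""


class SimulatorWrapperStep(ABC):
    """Prerequisite step that manages a third-party simulator."""

    @abstractmethod
    def is_available(self) -> bool:
        """Detect whether the simulator (image or service) is available."""

    @abstractmethod
    def launch(self) -> None:
        """Start the simulator, optionally with a custom configuration."""

    @abstractmethod
    def status(self) -> str:
        """Return the simulator status (e.g. 'running', 'exited')."""

    @abstractmethod
    def stop(self) -> None:
        """Stop and clean up the simulator."""

    def execute(self) -> None:
        """Fail immediately if the simulator is not available, then launch it."""
        if not self.is_available():
            raise SimulatorNotAvailable("simulator not available")
        self.launch()
```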
Open questions
use of a submodule from pythonsdk to know the available simulators? how to customize the configuration?
The "scalable steps" solution
Concept: INT-1812: Implement the "scalable steps" solution for simulator wrapper (Closed)
Implementation: INT-1829: A wrapper for simulators (Closed)
Author: @Illia Halych
Problem: all simulators are different; there is no trivial solution that fits them all.
Challenges:
Include general functionalities, specific functionalities, abstractions.
Make it flexible to extend the wrapper based on specific needs.
Patch custom configurations.
Clean up when something fails.
Concept:
Use the step-by-step execution that is already available in pythonsdk-tests and implement the simulator as a step-by-step process.
The step-by-step execution allows the configuration of each independent step to be patched separately before it starts.
The step-by-step execution allows a "rollback" (cleanup) from the step where the problem occurred.
The step-by-step execution can change the order of steps and drop certain steps (see the sketch after this list).
The first (1) basic step has the lowest level of abstraction - most common functionalities for all.
The zero (0) basic step would be a substitution for the first step (1) for complex models.
The third (3) basic step has the highest level of abstraction - least common functionalities for all.
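A minimal sketch of how such a step-by-step runner could patch per-step configuration and roll back in reverse order on failure; all names are illustrative and are not existing pythonsdk-tests code:

```python
# Illustrative runner for the "scalable steps" idea: ordered steps, per-step
# configuration patching, and rollback (cleanup) in reverse order on failure.
from typing import Callable, Dict, List


class Step:
    def __init__(self, name: str,
                 action: Callable[[Dict], None],
                 cleanup: Callable[[], None]) -> None:
        self.name = name
        self.action = action
        self.cleanup = cleanup


def run_steps(steps: List[Step], config_patches: Dict[str, Dict]) -> None:
    executed: List[Step] = []
    try:
        for step in steps:
            # Patch this step's configuration just before it starts.
            step.action(config_patches.get(step.name, {}))
            executed.append(step)
    except Exception:
        # "Rollback": clean up from the failing point backwards.
        for done in reversed(executed):
            done.cleanup()
        raise
```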
Solution:
Simulators' images and other setup are described in Helm charts. Charts should be stored in a remote repository and can be installed together with their dependencies (see the sketch at the end of this section).
Avionix officially supports Helm 3 only.
STEP LEVEL 2 supports HTTP/HTTPS API calls.
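Since the charts live in a remote repository, the wrapper could install and remove them through Helm 3 (Avionix wraps Helm 3 in Python; the sketch below shows the equivalent plain CLI calls via subprocess, with the repository URL, chart and release names as placeholders):

```python
import subprocess

# Placeholder repository URL, chart and release names.
subprocess.run(["helm", "repo", "add", "simulators",
                "https://example.org/simulator-charts"], check=True)
subprocess.run(["helm", "repo", "update"], check=True)

# Install the simulator chart together with its packaged dependencies.
subprocess.run(["helm", "install", "some-simulator", "simulators/some-simulator",
                "--namespace", "onap-tests", "--create-namespace"], check=True)

# Tear down after the test.
subprocess.run(["helm", "uninstall", "some-simulator",
                "--namespace", "onap-tests"], check=True)
```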