Description

Motivation

The most obvious solution for taking full advantage of CSIT is to add the test cases to the same repository as the functionality they are testing (instead of keeping them in a separate, centralized CSIT repository as we currently do). This would have the following benefits:

  • CSIT could be triggered by any commit to the project repo
  • CSIT tests the code (or, more specifically, the docker images that have been built) from the committed branch
  • CSIT could have a vote on the commit based on the result of the test run
  • If the implementation changes require changes in CSIT tests (to cover new functionality or to pass in the first place), that could be handled within the same commit
  • Ideally, local verification would become less complex (no need to work across the CSIT repo and the project repo)
  • No need for the integration team to merge any changes related to your project

...

  • CSIT suites that test components from multiple project repositories at the same time 
    • such CSIT tests may have to be separated using additional simulators, or
    • project repository structures themselves may have to be reconsidered, or
    • the possibility of combining branches from multiple repositories under the same commit needs to be provided (if gerrit allows?)
    • In any case, the division between the images under test and images that are just necessary dependencies should be clearly made and documented
      • the images under test should be coming from the commit branch
      • in the case of necessary dependencies it must be decided whether they should be provided as simulators or real components
        • if provided as real components, they should be referred to by a fixed, unchanging released version number and should be stable and mature enough to develop against
          • ONAP's (unintentional?) practice of allowing the same versioned image to be changed might prove problematic
      • project-specific simulators that are built on the fly are a trivial case from a dependency-handling point of view, but if common simulators are in use, the dependency considerations for them are the same as for real components
  • Jenkins templates may have to be redesigned to support a unified approach for triggering the review-branch-specific artifact and docker builds, CSIT execution and voting chain as part of review verification
    • What is the most elegant and effortless solution for ensuring that the docker images built from commit branch are taken in use in the CSIT (both in Jenkins and in local development environment)?
    • The redesign should also allow testing locally built docker images in local environment with CSIT as effortlessly as possible
  • CSITs will become blockers for merging code
    • local pre-commit verification should be supported better by common CSIT tools
    • are all projects and suites mature enough to deal with that?
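The split discussed above between images under test (built from the review branch) and pinned dependency images could be sketched in a CSIT setup script roughly as follows. This is purely illustrative; the image names, tags, and the `IMAGE_UNDER_TEST` variable are assumptions, not existing ONAP conventions:

```shell
# Illustrative sketch only: separate the image under test (tag injected by
# the triggering review-build job) from necessary dependency images
# (pinned to fixed released versions, never mutable tags like 'latest').

# Image under test: falls back to a hypothetical review tag if not injected.
IMAGE_UNDER_TEST="${IMAGE_UNDER_TEST:-onap/example-component:review-12345-7}"

# Necessary dependency: an unchanging released version.
DEPENDENCY_IMAGE="onap/example-dependency:1.2.3"

# The actual suite setup would then start both, e.g.:
#   docker run -d --name component  "${IMAGE_UNDER_TEST}"
#   docker run -d --name dependency "${DEPENDENCY_IMAGE}"
echo "under test: ${IMAGE_UNDER_TEST}"
echo "dependency: ${DEPENDENCY_IMAGE}"
```

Keeping the dependency tag hard-coded in the setup script (rather than injected) documents exactly which released version the suite was developed against.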

  • Docker image production practices should be unified (see Docker Image Build Guidelines, Independent Versioning and Release Process and Release Versioning Strategy)
  • To which extent are the current guidelines actually followed?
  • Are the guidelines good enough for CSIT development as they are now if they are followed?

Technical details to be decided

...

  • Reusing existing jobs and chaining them would require some docker image tag tuning to make sure CSIT tests pick up the exact image that was produced by the preceding docker build job
  • Either way, JJB templates will have to be touched
  • How should the following issue be solved?
    • give unique tag to the docker image to be tested in review and pass it to CSIT job
      • what should the tag be?
        • timestamp (from maven.build.timestamp) already seems to be used, but how to extract it?
        • jenkins build identifier? That can be determined from the triggered CSIT job, and it would also give a human reader a direct way to find out afterwards where the image came from (I'm intending to use this in my initial PoC with CCSDK)
        • gerrit commit id that triggered the job?
        • sha-256 of the docker image? 
      • how to pass the parameters? 
        • triggered jenkins build identifier can be found from ${BUILD_URL}/api/json (if triggered at all)
        • reverse trigger mechanism used as the basis of current trigger_jobs doesn't seem to support parameter passing?
        • file-based or some other custom mechanism?
        • replace reverse trigger mechanism with normal trigger with parameters (i.e. define the trigger in the triggering job instead in the triggered job)?
    • Do we need new docker image job templates for in-review docker builds or can the existing ones be reused somehow?
    • Is it possible to chain triggered jobs and let them all vote on the original review or does it need an umbrella job?
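As a concrete (and purely hypothetical) sketch of the options above, a review build could derive a unique, human-traceable tag from the Gerrit change number and Jenkins build number, then hand it to the CSIT job via a properties file that a normal parameterized trigger can inject. All variable and image names here are assumptions; `GERRIT_CHANGE_NUMBER` and `BUILD_NUMBER` are fixed only so the sketch is self-contained:

```shell
# Illustrative sketch: derive a unique docker tag for the review build and
# emit it as a parameter for the downstream CSIT job.
GERRIT_CHANGE_NUMBER=12345          # normally set by the Gerrit trigger
BUILD_NUMBER=7                      # standard Jenkins build number
IMAGE_NAME=onap/example-component   # hypothetical image under test

REVIEW_TAG="review-${GERRIT_CHANGE_NUMBER}-${BUILD_NUMBER}"

# The docker build step would then tag its result:
#   docker tag "${IMAGE_NAME}:latest" "${IMAGE_NAME}:${REVIEW_TAG}"

# One file-based way to pass the value onward: a properties file that a
# parameterized trigger (as opposed to a reverse trigger) could inject.
PROPS="${TMPDIR:-/tmp}/csit.properties"
echo "CSIT_IMAGE=${IMAGE_NAME}:${REVIEW_TAG}" > "${PROPS}"
cat "${PROPS}"
```

A tag of this shape answers the traceability question directly: anyone reading the tag can find the originating review and build without consulting the Jenkins JSON API.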

...

Technical decisions

  • New templates have been introduced for all verification steps (see Project-specific CSIT structure) while all the existing ones have been left untouched
    • review-verification template
      • triggered by review commit
      • has verify vote on the review
      • triggers artifact builds
        • Could in the future also include Sonar analysis; Sonar jobs are currently still completely separate and executed only on merged code
      • triggers docker image build - different types of docker builds each require their own templates (identified and separated by "artifact-type"), and so far we have them for
        • maven docker build
        • golang docker build
        • images are not pushed to Nexus3
      • triggers CSIT execution that tests the produced docker images
    • merge-verification template
      • triggered by merge
      • triggers artifact build in the same way as review-verification
      • triggers docker image build in the same way as review-verification but builds the images from master
      • triggers CSIT execution that tests the produced docker images in the same way as review-verification
      • No images are pushed to Nexus3 as a result of merge-verification either; publication of new images still remains the responsibility of separate docker staging etc. jobs 
  • Execution of CSIT tests and incorporating locally built test images should be made as easy as possible following common guidelines
    • Local docker build instructions should be maintained and easily accessible (for example, as part of a CSIT README.md in the project repository)
    • Test environment setup should be as automated as possible (project-specific dependencies should be handled by the setup scripts)
    • Need for specific environment variables expected by the test suite (for example, GERRIT_BRANCH)
    • What about the Java/Python/etc. SNAPSHOT/STAGING/RELEASE artifacts in Nexus? Do they have any actual role, or are the docker builds always creating those artifacts for themselves on the fly (against the current Docker Image Build Guidelines)? Apparently such on-the-fly creation should be minimized and preferably avoided altogether
    • What about code coverage/Sonar?
  • Artifact builds should take care of code coverage/Sonar analysis
    • apparently there are currently no common templates dealing with Sonar (instead, all the projects have their own custom Sonar JJB definitions), and
    • all Sonar analyses run on a daily schedule instead of being triggered by commits
    • are projects mature enough to define the various Sonar violations as review blockers?
      • can we give each project the power to define their own quality gates? 
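The local-verification goal described above (running CSIT against locally built images with minimal friction) might look roughly like the following for a developer. The suite entry point, image tag, and `IMAGE_UNDER_TEST` variable are hypothetical; only GERRIT_BRANCH is taken from the expectations stated earlier:

```shell
# Hypothetical local verification flow: build the image locally, set the
# environment the suite expects, and point CSIT at the local image instead
# of a Nexus-published one.
export GERRIT_BRANCH=master                     # expected by the test suite
IMAGE_UNDER_TEST="onap/example-component:local" # locally built tag
export IMAGE_UNDER_TEST

# The locally built image:
#   docker build -t "${IMAGE_UNDER_TEST}" .
# Then the project's (hypothetical) suite entry point:
#   ./csit/run-project-csit.sh plans/default

echo "CSIT would run against ${IMAGE_UNDER_TEST} on branch ${GERRIT_BRANCH}"
```

If the setup scripts read the same variable that Jenkins injects, the local and CI flows stay identical, which is the effortlessness the redesign aims for.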

Project status and readiness at the end of Honolulu

Project status and readiness at the end of Guilin

...