Retrospective
What went well?
- A working prototype is ready for the Control Loop demo and can act as a base for further work.
- Communication between components is working, and the test bed is ready.
- The team's understanding of the control loop design has improved compared to the initial days.
- The team has gained a fair understanding of ONAP and Policy Framework design, the test environment, and the tools and procedures involved.
- The code base is now better defined and stronger: it is less subject to structural change and is easier for new members to follow and onboard to.
- As time went on, commits became smaller and more frequent, making them easier to review, merge, and pay attention to.
- Having a wiki page is a good idea.
- Demos are really good for showing progress.
Where can we improve?
- Integration tests between different components should be started early and run in parallel with development.
- Code reviews should be stricter and strengthened.
- A code review checklist should be maintained (covering Checkstyle, coverage, tests done, functionality covered, pending work, definition of done, etc.); we have pushed code that did not build and had no tests.
- Internal demos should be started, and made more frequent, whenever there is a demoable component.
- A dedicated page of instructions for new joiners on Checkstyle, SONAR, etc., and on the repositories we need to use (Nordix, ONAP, etc.).
- Build good knowledge of Policy, the Policy Framework codebase, and its working practices.
- Add a good description of the reason for a commit and an explanation of the change in the review comments.
- How does our architecture align with the existing CLAMP Spring Boot approach?
- Understand the architectural path into the future.
- Meetings!
Overall:
- The wiki page should be updated and kept aligned before design starts, so that component functionality can be agreed by everyone.
- Jira tickets should be updated with new sub-tasks, with acceptance criteria defined and agreed per sub-task.
- Jira tickets should be updated with progress and Gerrit links, if acceptable.
By Component:
Participant:
1. Integration testing should be started alongside code development and become a daily practice instead of relying only on JUnit tests.
Commissioning:
1. Interaction between Commissioning and Instantiation should be defined more clearly.
Improvements
Overall:
Training of new joiners and better information: explain the concepts below in a good way, with good diagrams and better documentation.
- Understanding of the TOSCA service template is critical (see the illustrative sketch after this list).
- Understanding the Commissioning → Instantiation step.
- Understanding the relationship between the runtime and participants, and how control loops and control loop elements relate to each other.
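As a starting point for this training material, here is a minimal sketch of what a control loop definition could look like as a TOSCA service template. The template names, type names, and properties below are illustrative assumptions for explanation only, not the project's actual type definitions.

    # Illustrative sketch only: template names, types and properties are assumptions,
    # not the real control loop TOSCA definitions.
    tosca_definitions_version: tosca_simple_yaml_1_3
    topology_template:
      node_templates:
        sample.control.loop:
          type: example.ControlLoop                        # hypothetical control loop type
          properties:
            elements:                                      # the elements that make up this loop
              - sample.policy.element
              - sample.dcae.element
        sample.policy.element:
          type: example.ControlLoopElement                 # hypothetical element type
          properties:
            participantType: example.policy.participant    # participant responsible for this element
        sample.dcae.element:
          type: example.ControlLoopElement
          properties:
            participantType: example.dcae.participant

In this sketch, commissioning would store such a template, instantiation would create control loop instances from it, and the runtime would hand each element to the participant named in its properties.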
By Component:
- Commissioning:
- Get the unit tests done
- Probably introduce some integration tests with Instantiation, and CSITs.
- Redefine the APIs needed: exactly which GETs are required and which checks to introduce. What do we actually need?
- Pushing of control loop component definitions to participants? Define the flow from a commissioned control loop definition to an instantiated control loop.
- Monitoring:
- Monitoring and supervision functionality have been interchanged; this needs to be updated in the wiki.
- Need to finalize the statistics data for participants and control loop elements. Also need to decide whether we require an association between the statistics classes and their parent.
- GUI for monitoring?
Considerations
- PMSH implementation
- Participant implementations for DCAE/Policy/CDS
- Using existing CLAMP functionality to bridge between commissioning and instantiation
- Checking and validation
- Refactoring supervision handling
- Refactoring the participant as a JAR
- Rough GUI/client
- Design time?
- Architecture/design wiki page, or official documentation?
- Others?
Decisions
- Stick with the wiki for documentation and convert to RST later; keep the wiki page up to date. The wiki page should be updated and kept aligned before design starts, so that component functionality can be agreed by everyone.
- Set up CSIT tests for the control loop work, and get some help from people experienced with CSIT.
- Explain the concepts of control loops, participants, and control loop elements in a good way, with good diagrams and better documentation; also cover the TOSCA service template and node templates.
- Put a how-to on the wiki describing how to run the demo.
- Training on the Policy Framework and its principles.
- Informal demos at the stand-ups from everyone on incoming commits, particularly where there are issues!
- Get the Jenkins verify jobs running in Nordix
- Code Reviews
- Follow the ONAP guidelines as a minimum for code reviews
- Check and perform reviews:
- First thing in the morning
- After Lunch
- Last thing in the evening
- Be gentle and kind
- Provide suggestions on changes rather than just saying something is incorrect: cover the how as well as the what and why
- Checklist for Code Reviews (Sirisha Manchikanti to champion)
- Code must pass the verify job; unverified changes are not code reviewed
- Checkstyle: covered by the build (a verified commit proves Checkstyle is OK)
- Coverage: 80%; put the coverage figure from Eclipse/IntelliJ in the commit message (investigate how to do this in IntelliJ)
- A statement that integration tests are done
- A statement of what functionality is covered
- A statement of what functionality is not covered (if needed)
- The definition of done for this feature and how near we are to it
- Wiki page for new joiners: there is already a lot of information out there, but it is scattered. We should have a landing page of our own. Park this and come back to it.
- Jira (Sirisha Manchikanti)
- Jira tickets should be updated with new sub-tasks, with acceptance criteria defined and agreed per sub-task
- Jira tickets should be updated with progress and Gerrit links, if acceptable
Action Points
- Follow up on contract testing, including the CPS project's test approach