...
For Performance Level 1, we have to define and measure our performance metrics:
> Level 1: baseline performance criteria identified and measured (such as response time, transaction/message rate, latency, footprint, etc., to be defined on a per-component basis)

We don't appear to have to state what target values the metrics should meet; it looks like we just define the metrics and measure them. For Level 2, an improvement plan must be created and implemented.
Although design time and deployment time performance should also be measured, here we focus on run time performance metrics for policy execution in the PDPs, as these are the most critical run time metrics for the Policy Framework. We propose to measure these metrics both in a simulated environment and in a full ONAP deployment.
No. | Metric | Description
---|---|---
1 | Single Threaded Response Time | Measure the execution time for onset and abatement in each use case with only a single policy executing |
2 | Multi-Threaded Response Time | Measure the execution time for onset and abatement in each use case when the maximum number of threads are executing |
3 | Single Threaded CPU Usage | CPU Usage for each use case when executing alone |
4 | Multi Threaded CPU Usage | CPU Usage for each use case when executing in multiple threads |
5 | Single Threaded Memory Usage | Memory Usage for each use case when executing alone |
6 | Multi Threaded Memory Usage | Memory Usage for each use case when executing in multiple threads |
7 | Maximum Simultaneous Executions | Measure the maximum number of simultaneous policy executions that can be achieved whilst maintaining system stability and acceptable resource utilization
8 | Installation Footprint | The requirements of the ONAP Policy Framework installation in terms of VM resources, including the disk space required for the database and for logs
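For metrics 1 and 2, a simulated-environment harness could look something like the sketch below. It is a minimal illustration, not part of the Policy Framework: `execute_policy` is a hypothetical stand-in for submitting an onset event to a PDP (a real harness would call the PDP's REST API), and the thread count and event volume are placeholder values.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def execute_policy(event):
    """Hypothetical stand-in for a PDP policy execution; a real harness
    would submit the onset event to the PDP endpoint instead."""
    time.sleep(0.001)  # simulate policy evaluation latency
    return {"event": event, "status": "COMPLETE"}

def timed_execution(event):
    """Wall-clock time for a single onset/abatement execution."""
    start = time.perf_counter()
    execute_policy(event)
    return time.perf_counter() - start

def measure(events, threads=1):
    """Metric 1 when threads == 1; metric 2 when threads is the
    maximum configured thread count."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        samples = list(pool.map(timed_execution, events))
    ordered = sorted(samples)
    return {
        "mean_s": statistics.mean(samples),
        "p95_s": ordered[int(0.95 * (len(ordered) - 1))],
        "max_s": max(samples),
    }

if __name__ == "__main__":
    events = [f"onset-{i}" for i in range(100)]
    print("single-threaded:", measure(events, threads=1))
    print("multi-threaded: ", measure(events, threads=8))
```

The same harness shape extends to metrics 3 through 6 by sampling CPU and memory usage while the pool is running, and to metric 7 by increasing the thread count until stability degrades.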