Machine learning capabilities in DevOps Test Hub
HCL DevOps Test Hub (Test Hub) uses Machine learning (ML) algorithms to analyze tests that match certain requirements and criteria, and then presents the findings as insights and recommendations after the tests are run. You can use the insights and recommendations to interpret the test results and identify the problem areas in the system under test.
Test Hub has incorporated ML capabilities to provide an automated means to identify problems in the system under test.
Tests analyzed through ML
- Performance tests
- Schedules
Parameters analyzed through ML
- The Response Time Lock-Step Pattern parameter.
- The Response Time Standard Deviation Pattern parameter.
- The Throughput Drop Pattern parameter.
Test Hub uses different analyzers for each of the parameters. For example, Test Hub uses the Response Time Lock-Step Pattern analyzer to analyze the tests for the Response Time Lock-Step Pattern parameter.
Criteria for analysis of the parameters
Parameter analyzed | Criteria for analysis
---|---
Response Time Lock-Step Pattern | The analyzer identifies the Response Time Lock-Step Pattern parameter in the overall page response time observed against the user count, based on specific criteria.
Response Time Standard Deviation Pattern | The analyzer attempts to detect pages whose response time is more than thrice the value of the standard deviation calculated for the page response times, and reports them as the Response Time Standard Deviation Pattern parameter. See the sketches after this table.
Throughput Drop Pattern | The analyzer attempts to detect sudden drops in network throughput, based on specific criteria, and reports them as the Throughput Drop Pattern parameter. See the sketches after this table. Note: Sudden drops in throughput might be related to the performance tool itself, issues with network connectivity, or issues with the scalability of the system under test.
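To make the Response Time Standard Deviation Pattern check concrete, the following minimal Python sketch flags pages whose response time exceeds three times the standard deviation of the observed response times. It illustrates only the rule described in the table; it is not Test Hub's implementation, and the function name and sample data are hypothetical.

```python
from statistics import stdev

def flag_response_time_outliers(page_response_times):
    """Flag pages whose response time is more than three times the
    standard deviation of the observed page response times.

    page_response_times: dict of page name -> response time in milliseconds
    (a hypothetical data shape chosen only for this illustration).
    """
    times = list(page_response_times.values())
    if len(times) < 2:
        return []                    # a standard deviation needs at least two samples
    threshold = 3 * stdev(times)     # the "more than thrice the standard deviation" rule
    return [page for page, t in page_response_times.items() if t > threshold]

# One page is far slower than the rest and is flagged as matching the pattern.
samples = {
    "login": 210, "search": 225, "browse": 218, "checkout": 240,
    "profile": 232, "report": 1900, "cart": 228, "logout": 215,
}
print(flag_response_time_outliers(samples))   # ['report']
```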
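Similarly, the Throughput Drop Pattern can be pictured as a check for sharp falls between consecutive throughput samples. The exact criteria that Test Hub applies are not listed in this topic, so the sketch below assumes a simple relative-drop check; the 40% ratio, function name, and sample values are hypothetical.

```python
def detect_throughput_drops(throughput_samples, drop_ratio=0.4):
    """Return the indices of intervals where throughput falls sharply
    compared with the previous interval.

    throughput_samples: list of throughput values (for example, hits per second).
    drop_ratio: fraction of the previous value that counts as a "sudden drop";
                0.4 (a 40% drop) is an arbitrary value for this illustration.
    """
    drops = []
    for i in range(1, len(throughput_samples)):
        previous, current = throughput_samples[i - 1], throughput_samples[i]
        if previous > 0 and (previous - current) / previous >= drop_ratio:
            drops.append(i)
    return drops

# Throughput collapses in the fifth interval and then recovers.
samples = [480, 495, 510, 505, 150, 490, 500]
print(detect_throughput_drops(samples))   # [4]
```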
Analysis at the project level
When you, as a Project Owner or Tester, configure a run of a Performance test or a Schedule from your project in Test Hub, you must ensure that the criteria required for ML analysis are met. During the test run, Test Hub uses its ML capabilities to analyze the supported parameters and presents its findings as insights on the Insights page.
Any member of your project can view the insights by clicking Insights from the navigation in the project or from the Insights section on the Overview page in the project. After viewing the insights, project members can react by either agreeing or disagreeing with the insights suggested by Test Hub.
Analyzers in a team space
You can view the list of analyzers available in a team space, along with a description of each analyzer, from the navigation in the team space. You can also enable or disable an analyzer, view the count of reviewed insights, and view the review history of the insights generated by the analyzers.
After viewing the recommendation for a specific parameter, members of the project can alter the default threshold value that specifies the level of confidence required in the analysis of the parameter, and then run the tests again to improve the accuracy of the recommendations and insights (a conceptual sketch of such thresholding follows below).
You can also improve the accuracy of the recommendations by repeatedly running tests with differing loads.
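As a conceptual illustration of how a confidence threshold gates which analyzer findings are surfaced as insights, the following Python sketch keeps only findings whose confidence meets the configured threshold. This is not a Test Hub API; the class, function, confidence scores, and the 0.7 default are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """A hypothetical analyzer finding with a confidence score between 0 and 1."""
    parameter: str
    confidence: float

def surface_insights(findings, confidence_threshold=0.7):
    """Keep only findings whose confidence meets the configured threshold.

    confidence_threshold: stand-in for the per-analyzer threshold setting;
    0.7 is an arbitrary illustration, not a Test Hub default.
    """
    return [f for f in findings if f.confidence >= confidence_threshold]

findings = [
    Finding("Response Time Lock-Step Pattern", 0.82),
    Finding("Throughput Drop Pattern", 0.55),
]
# Raising the threshold suppresses lower-confidence findings; lowering it surfaces more.
print(surface_insights(findings, confidence_threshold=0.7))   # lock-step finding only
print(surface_insights(findings, confidence_threshold=0.5))   # both findings
```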
Using Machine learning capabilities
- To understand the requirements of Performance tests or Schedules that are analyzed by using ML capabilities in Test Hub, see Test run considerations for using Machine learning capabilities.
- To configure a run of a Performance test, see Configuring a run of a Performance test.
- To configure a run of a Schedule, see Configuring a run of a Rate Schedule or VU Schedule.
- To view the ML insights for the Performance tests or Schedules in your project that were analyzed after a run, see Viewing Machine learning insights. You can also select your reaction to agree or disagree with an insight when you view it.
- To understand how to interpret the insights, see Interpretation of insights.
- To view the ML analysis of the supported parameters when Performance tests or Schedules in different projects in your team space are run, see Viewing Machine learning analyzers.
- To understand how to interpret the recommendations of the ML analysis of the supported parameters, see Interpretation of parameter analysis.
- To view the reactions of project members to the ML insights, see Viewing reactions marked by members to Machine learning insights.
- To modify the settings of the ML analyzer for a specific parameter, see Modifying Machine learning analyzer settings.
- To delete the ML insights provided about the Performance tests or Schedules that were analyzed, see Deleting Machine learning insights.
- To rerun the ML analysis without rerunning the test asset, see Rerunning Machine learning (ML) analysis.