You can find a Stress Test Summary on the Stress Test Report page under the Summary tab.

Outline

On the test status bar at the top of the report page, the test details are presented, including the number of errors that occurred during test execution. To view the details of the errors, go to the Session Report tab and filter the list by the Failures Only status.

To download the list of Load Injectors (LIs) with their IP addresses, click Agent list.

Charts overview

The vertical Y-axis on each chart represents a test execution parameter that depends on the chart type. The horizontal X-axis shows the test duration in minutes.

Execution plan

The line chart shows the change in the actual number of virtual users over the testing period vs. the expected number of virtual users based on the test scenario. It lets you see whether the test execution plan, in terms of user load, was carried out successfully.

The Y-axis represents the number of virtual users. 

The following lines are displayed on the chart:   

  • Max.Number of Users – the predefined maximum number of virtual users that can be simulated at one time on the target resource.  
  • Actual Number of Users – the number of users actually simulated on the target resource per time interval. Each node represents the total number of users simulated during a particular period.  The first node represents the starting number of users.  
  • Expected Number of Users – shows the predicted change in the number of users to reach the planned maximum load. 

If the Actual Number of Users line reaches the Max.Number of Users line, all virtual users allocated for the test were used and the site was tested under the planned maximum load. Conversely, a large deviation between these two lines indicates an issue.

The possible reasons why Max.Number of Users was not reached are:

  • The LIs may have hit their CPU usage limit. Check the Load Injector Server Load chart in the test report.
  • In the case of goal-based testing, the transaction goal may have been reached with fewer users than specified by Max.Number of Users. Check the Transactions per minute chart in the test report.
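To make the comparison concrete, here is a minimal sketch (not a LoadView API; the per-interval values are hypothetical readings taken from the Execution plan chart) that checks the peak actual user count against Max.Number of Users:

```python
# Illustrative sketch only: the values below are assumed sample readings,
# not data returned by LoadView.

MAX_USERS = 50       # Max.Number of Users defined in the test scenario
TOLERANCE = 0.10     # treat a shortfall of up to 10% as acceptable (assumption)

actual_users = [5, 12, 20, 31, 40, 47, 50, 50, 49]  # users per time interval

peak = max(actual_users)
if peak >= MAX_USERS * (1 - TOLERANCE):
    print(f"Planned maximum load reached: peak of {peak} out of {MAX_USERS} users")
else:
    print(f"Large deviation: peak of {peak} out of {MAX_USERS} users - "
          "check the LI CPU load and the Transactions per minute chart")
```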

Transactions per minute (Goal-Based Stress Test only)

The chart compares the goal number of transactions per minute with the actual number at each test iteration.

The Y-axis represents the number of transactions executed per minute on the target resource. 

On the chart you can find the following lines:

  • Transactions (Planned) – the goal number of transactions per minute set in the test scenario.
  • Transactions (Actual) – the actual number of transactions simulated on the target website at each iteration. The line is interrupted by calibration periods.
  • Calibration periods – the time needed to reach the next level of user load. On the chart, it is displayed as a dimmed gap between actual load periods.
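As an illustration of what the chart plots, the sketch below (hypothetical data, not a LoadView API) aggregates raw transaction completion times into per-minute counts and compares each count with the goal:

```python
# Hypothetical sketch: groups sample transaction completion timestamps by
# minute and compares each count with the planned transactions-per-minute goal.
from collections import Counter
from datetime import datetime

completions = [  # completion times of finished transactions (sample data)
    "2024-01-01 10:00:12", "2024-01-01 10:00:47",
    "2024-01-01 10:01:05", "2024-01-01 10:01:33", "2024-01-01 10:01:58",
]

per_minute = Counter(
    datetime.strptime(t, "%Y-%m-%d %H:%M:%S").strftime("%H:%M")
    for t in completions
)

GOAL_TPM = 3  # Transactions (Planned) value from the test scenario
for minute, actual in sorted(per_minute.items()):
    status = "goal met" if actual >= GOAL_TPM else "below goal"
    print(f"{minute}: {actual} transactions ({status})")
```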

 


If the goal was not reached but the Actual Number of Users line on the Execution plan chart overlaps the Max.Number of Users line, increase the Max. Simultaneous Virtual Users value in the test settings and repeat the test.

Average Response Time

The Average Response Time chart shows the change in the actual duration of transactions (browser initialization time and user behavior delays are not included). The Y-axis represents the time in seconds.

Each node on the chart represents the Average Response Time value calculated for the corresponding interval.
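For clarity, here is a minimal sketch under assumed data (not a LoadView API) of how one such point could be derived from raw per-transaction timings:

```python
# Illustrative only: each tuple holds hypothetical timings for one transaction
# in the interval. Browser initialization and user behavior delays are
# subtracted before averaging, matching what the chart plots.

transactions = [
    # (total_time_s, browser_init_s, think_time_s)
    (9.4, 1.2, 2.0),
    (8.7, 1.1, 2.0),
    (11.3, 1.3, 2.5),
]

durations = [total - init - think for total, init, think in transactions]
avg_response_time = sum(durations) / len(durations)
print(f"Average Response Time for the interval: {avg_response_time:.2f} s")
```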

If there are no significant line fluctuations on the chart, your website handled the test load successfully.

In the case of dramatic growth or decline in Average Response Time over the test, or a sharp peak, check the other charts for details. For example, on the charts below, the dramatic growth in Average Response Time is caused by a progressive increase in the number of sessions with errors.

The Number of Sessions Started chart allows you to compare the total number of sessions started by concurrent users with the number of failed and successful sessions. The Y-axis shows the number of sessions.

A session includes the browser launch and shutdown as well as the transaction itself.

On the chart you can find the following lines:

  • Number of Sessions Started – the total number of sessions started at a particular time interval. 
  • Number of Success Sessions – the number of sessions that ran without errors, i.e., all tasks of the device were executed successfully.
  • Number of Error Sessions – the number of sessions with failures (no keyword/image found, failed to access the target resource, etc.).

The Cumulative Sessions Count graph shows the total number of sessions started over the test. The chart allows assessment of the total number of virtual users simulated on the target resource over the test period. The Y-axis shows the number of sessions.

Each node represents the total count, calculated as the sum of all sessions started up to that point in time.
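To make the aggregation behind these two session charts concrete, here is a sketch with assumed per-session records (LoadView does not expose this data as a Python API):

```python
# Hypothetical sketch: one record per session with its start interval and
# outcome, aggregated into started/success/error counts per interval plus a
# cumulative total, mirroring the two session charts.
from collections import defaultdict
from itertools import accumulate

sessions = [  # (interval, succeeded) - sample data
    ("10:00", True), ("10:00", True), ("10:00", False),
    ("10:01", True), ("10:01", False), ("10:01", False),
]

started = defaultdict(int)
succeeded = defaultdict(int)
for interval, ok in sessions:
    started[interval] += 1
    succeeded[interval] += int(ok)

intervals = sorted(started)
cumulative = list(accumulate(started[i] for i in intervals))

for interval, total in zip(intervals, cumulative):
    errors = started[interval] - succeeded[interval]
    print(f"{interval}: started={started[interval]} "
          f"success={succeeded[interval]} errors={errors} cumulative={total}")
```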

The Number of Errors by Error Type dot chart illustrates the number of error sessions by error type. The Y-axis shows the number of errors.

Use the chart to determine which error types predominated at a specific moment. See the Session Report to review failures. For error descriptions, check Error Codes.
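As a simple illustration of that grouping (the error types below are hypothetical examples, not actual LoadView output):

```python
# Hypothetical sketch: tallies failed sessions by error type, which is the
# grouping the Number of Errors by Error Type chart visualizes.
from collections import Counter

failed_sessions = [  # one error type per failed session (sample values)
    "Keyword not found", "Timeout", "Keyword not found", "Connection refused",
]

by_type = Counter(failed_sessions)
for error_type, count in by_type.most_common():
    print(f"{error_type}: {count} sessions")
```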

 

Load Injector Load

The chart shows CPU load metrics received from Dotcom-Monitor LIs. Use it to assess how your test affects the performance of LIs running the test from different geographical zones. 

Make sure the LIs' load levels are not higher than 80%. Based on our experience, the optimal CPU load for a Load Injector Server machine is 80% or less. At this load level, server hardware delays that degrade server performance and slow down data processing are avoided.

If an LI's CPU load exceeds 80%, we recommend decreasing the Payload value and repeating the test. Otherwise, the test may show inaccurate results.

On the other hand, if the CPU usage level is low, you can increase the Payload and the maximum number of virtual users to use LoadView resources optimally and minimize cost.
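A minimal sketch of that check, assuming the peak CPU values have been read off the Load Injector Load chart (hypothetical numbers, not a LoadView API):

```python
# Illustrative check against the 80% guideline; the per-LI peak CPU values
# below are assumed sample readings, not data returned by LoadView.

CPU_LIMIT = 80  # recommended maximum CPU load, in percent

li_peak_cpu = {"US East": 72, "Europe West": 91, "Asia Pacific": 64}

for name, cpu in li_peak_cpu.items():
    if cpu > CPU_LIMIT:
        print(f"{name}: {cpu}% - decrease the Payload value and repeat the test")
    else:
        print(f"{name}: {cpu}% - within the recommended range")
```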

The first switcher under the chart always represents the Reference Server line.

Tips on Interpreting Results

All charts are synchronized with one another by the time axis, so vertical lines on the charts are aligned. Therefore, you can review the test execution history from chart to chart and see how the number of simulated users impacts the response time and the LIs' CPU load during a particular time period. In the example below, the LI was not able to generate requests and accept website responses fast enough because it simulated too many virtual users and its CPU usage maxed out.

Nodes on the charts are active, so clicking a node opens its infotip. To see detailed information for the node-related sessions, click Show Details at the bottom of the infotip. The report for sessions that ran during the selected period of time will be displayed on the Session Report page.

 

To facilitate graph analysis and filter the lines on a chart, use the switchers under the chart.