I know that this question has been asked here before, but I am still not able to figure out the significance of the average, min, max and throughput parameters in the JMeter summary report.
Here is my JMeter setup:
No. of threads: 5000
Ramp-up period: 1
Loop Count: 1
Results:
Average: 738
Min: 155
Max: 2228
Throughput: 60.5%
So does that mean that all of my 5k requests took 738 milliseconds (0.7 s) to complete, or that every single request took 0.7 s to complete? Similarly, how should the min and max parameters be interpreted?
Throughput is the number of requests per unit of time (seconds, minutes, hours) that are sent to your server during the test.
Response time is the elapsed time from the moment a given request is sent to the server until the moment the last bit of information has returned to the client.
The throughput is the real load processed by your server during a run but it does not tell you anything about the performance of your server during this same run. This is the reason why you need both measures in order to get a real idea about your server’s performance during a run. The response time tells you how fast your server is handling a given load.
Average: This is the arithmetic mean (μ = (1/n) · Σi=1…n xi) of the response times of all your samples.
Min and Max are the minimum and maximum response times observed across the samples.
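To make those three figures concrete, here is a small sketch with made-up sample times (the real summary report aggregates all of your samples in exactly the same way; the numbers below are hypothetical and only echo the question's min/max values):

```python
# Hypothetical response times in milliseconds for five samples.
samples = [155, 600, 738, 900, 2228]

average = sum(samples) / len(samples)  # arithmetic mean of all samples
minimum = min(samples)                 # fastest single sample
maximum = max(samples)                 # slowest single sample

print(average, minimum, maximum)       # 924.2 155 2228
```

Note that the average is computed over every sample, so a handful of very slow requests can pull it well away from what a "typical" request experienced.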
An important thing to understand is that the mean value can be very misleading, as it does not show you how close (or far) your values are from the average. For this purpose we need the deviation value, since the average can be the same for very different sets of response times.
Deviation: The standard deviation (σ) measures the mean distance of the values from their average (μ). It gives you a good idea of the dispersion, or variability, of the measures around their mean value.
The following equation shows how the standard deviation (σ) is calculated:
σ = √( (1/n) · Σi=1…n (xi − μ)² )
For details, see here.
So, if the deviation value is low compared to the mean value, it indicates that your measures are not dispersed (they are mostly close to the mean value) and that the mean value is significant.
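This point can be illustrated with two hypothetical runs that share the same average but behave very differently (all figures below are invented for the example; `statistics.pstdev` implements the population formula with the 1/n factor shown above):

```python
import statistics

# Two hypothetical runs with the SAME average response time (500 ms)
# but very different dispersion.
stable = [480, 490, 500, 510, 520]
erratic = [100, 200, 500, 800, 900]

print(statistics.mean(stable), statistics.mean(erratic))    # 500 500
print(round(statistics.pstdev(stable), 1))                  # 14.1
print(round(statistics.pstdev(erratic), 1))                 # 316.2
```

Both runs would show Average: 500 in the summary report, but the large deviation of the second one tells you that many of its users saw response times far from 500 ms.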
An example always makes this easier to understand; I think this article will help you.
About the average: 738 ms (0.7 s) is the average response time of all the requests you made. For example, if one request is executed by two threads, and the first thread completes it in 0.9 s while the second takes 0.5 s, the average is 0.7 s. The same applies to your 5000 users: if each executes one request, the average response time is the sum of all response times divided by the number of samples (5000).
Min and Max are the minimum and maximum response times logged across all of the requests. So if 5000 threads each execute one request, one sampler's response time was 155 milliseconds (the fastest) and another's was 2228 milliseconds (the slowest).
About the throughput: it is the number of transactions or requests that can be made in a given period of time. It is a useful measurement for checking the load capacity of the server. Throughput = (number of requests) / (total time).
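As a rough sketch of that formula (the 82.6-second total below is a hypothetical duration chosen only so the result matches the 60.5 figure from the question; JMeter computes the real value from the actual test timestamps):

```python
# Hypothetical test run: 5000 requests completed in 82.6 seconds total.
requests = 5000
total_time_s = 82.6

throughput = requests / total_time_s  # requests per second
print(round(throughput, 1))           # 60.5
```

If JMeter reports throughput per minute or per hour instead, the same division simply uses that unit for the total time.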