Latency vs Response time in JMeter (Example + Diagram)

Latency is the time it takes for a request to reach the server plus the time for the response to travel back to the user. (In JMeter specifically, latency is measured from just before the request is sent until the first byte of the response is received.)

Put simply, latency means delay caused by the network, i.e. network delay.

Response time is the latency plus the processing time taken by the server.

Response time is the total time it takes for the reply to get back to the user.

Latency vs. response time simple diagram
Response Time = Latency + Processing Time
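The formula above can be sketched with hypothetical numbers (the 3 seconds of latency and 0.5 seconds of processing below are illustrative, not measured values):

```python
# Hypothetical example values, just to illustrate the formula.
latency_s = 3.0      # network delay: request travel + response travel
processing_s = 0.5   # time the server spends handling the request

# Response Time = Latency + Processing Time
response_time_s = latency_s + processing_s
print(response_time_s)  # 3.5
```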

Latency (Network Delay)

Let’s say the user is in the UK and the server is in the US.

The user clicks the login button of a website. The request takes 2 seconds to reach the server (slow internet connection), and the response takes 1 second to travel back to the user.

Latency will be 2+1 = 3 seconds.

Simple acknowledgment replies can be used to calculate latency.
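The acknowledgment idea can be sketched as a ping-style probe: send a tiny message, wait for the echo, and time the round trip. The UDP echo server below is a stand-in for the remote machine (in real life you would probe an actual host), so the measured value here is only loopback latency:

```python
import socket
import threading
import time

def start_echo_server():
    """A minimal UDP echo server standing in for the remote machine."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    srv.bind(("127.0.0.1", 0))  # pick any free port
    def loop():
        while True:
            data, addr = srv.recvfrom(64)
            srv.sendto(data, addr)  # acknowledge immediately
    threading.Thread(target=loop, daemon=True).start()
    return srv.getsockname()

def measure_latency_ms(addr, probes=5):
    """Time the round trip of tiny acknowledgment messages.

    The payload is so small that the result is dominated by network
    delay (latency), not by bandwidth."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    samples = []
    for _ in range(probes):
        t0 = time.perf_counter()
        sock.sendto(b"ping", addr)
        sock.recvfrom(64)
        samples.append((time.perf_counter() - t0) * 1000)
    return min(samples)  # the best sample approximates pure latency

addr = start_echo_server()
print(f"latency = {measure_latency_ms(addr):.3f} ms")
```

Taking the minimum of several probes filters out one-off scheduling hiccups, which is roughly what `ping` reports as its best-case time.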

Server Processing Time (server think time)

This is the time the server spends handling the request, e.g. running business logic and database queries, before it starts sending the response back.

Difference between bandwidth & latency

Bandwidth is the maximum amount of data a connection can transfer per unit of time.
In simple words, it is the maximum download and upload speed the network allows.
It is measured in megabits per second (Mbps).
If you want to stream videos, you need high bandwidth.

Latency is the minimum time it takes for a request to reach a server and return.
In simple words, it is how responsive your internet is.
It is measured in milliseconds (ms).
If you want to play online games, you need low latency for a faster response, because the packets are tiny, so bandwidth won’t help much here.
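Why latency matters for small packets and bandwidth for large ones can be shown with a back-of-the-envelope calculation (the 100 Mbps link and 50 ms latency below are made-up illustrative numbers):

```python
def transfer_time_ms(size_bytes, bandwidth_mbps, latency_ms):
    """Total time = fixed network delay + time to push the bytes through."""
    transmit_ms = size_bytes * 8 / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + transmit_ms

# A tiny 100-byte game packet: latency dominates (~50.008 ms total).
print(transfer_time_ms(100, 100, 50))

# A 100 MB video chunk: bandwidth dominates (~8050 ms total).
print(transfer_time_ms(100_000_000, 100, 50))
```

Doubling the bandwidth barely changes the game packet's total time, but halves the video chunk's; lowering latency does the opposite.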

Why the first request takes longer than subsequent requests

For the very first request, the latency is longer because it includes 3 extra steps:
DNS lookup: the client resolves the domain name to an IP address.
TCP handshake: the client and server establish a connection (SYN, SYN-ACK, ACK).
TLS negotiation: for HTTPS, the two sides exchange keys and verify the server’s certificate.
Subsequent requests can reuse the cached DNS result and the already-open (keep-alive) connection, so they skip these steps.
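The three steps can be timed individually with a sketch like the one below. The function name and the `use_tls` flag are my own (not from any JMeter API); it is a rough illustration, since real clients overlap and cache these steps:

```python
import socket
import ssl
import time

def first_request_setup_ms(host, port=443, use_tls=True):
    """Time the three setup steps that only the first request pays.

    Later requests reuse the cached DNS result and the open
    connection, so they skip all of this."""
    timings = {}

    # 1. DNS lookup: resolve the hostname to an IP address.
    t0 = time.perf_counter()
    ip = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)[0][4][0]
    timings["dns_ms"] = (time.perf_counter() - t0) * 1000

    # 2. TCP handshake: the SYN / SYN-ACK / ACK round trip.
    t0 = time.perf_counter()
    sock = socket.create_connection((ip, port), timeout=5)
    timings["tcp_ms"] = (time.perf_counter() - t0) * 1000

    # 3. TLS negotiation (HTTPS only): key exchange + certificate check.
    if use_tls:
        t0 = time.perf_counter()
        ctx = ssl.create_default_context()
        tls = ctx.wrap_socket(sock, server_hostname=host)
        timings["tls_ms"] = (time.perf_counter() - t0) * 1000
        tls.close()
    else:
        sock.close()
    return timings
```

For example, `first_request_setup_ms("example.com")` would return a dict of the three timings; against a distant server, the TCP and TLS steps each cost at least one full network round trip.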