What Does Latency Mean?
Latency is a perceived or actual delay in response time.
In networking, latency describes the time it takes a data packet to travel from one network node to another. The term is also used to describe delays that occur as data moves between a computing device's RAM and its processor.
High latency creates bottlenecks and is associated with low quality of service (QoS), jitter and a poor user experience (UX). The impact of latency can be temporary or persistent, depending on the source of the delay.
Latency on the internet is often measured with the ping network utility or the traceroute diagnostic command. To minimize latency in application performance, developers can use cache engines and buffers.
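As a quick illustration, latency can also be sampled from a script by shelling out to ping. Here is a minimal Python sketch, assuming a Unix-style ping (the `-c` count flag) and using `example.com` purely as a placeholder host:

```python
import subprocess

# Minimal sketch: shell out to the system ping tool and print its
# output. Assumes a Unix-style ping where "-c 4" sends four probes;
# "example.com" is a placeholder host, not a recommendation.
result = subprocess.run(
    ["ping", "-c", "4", "example.com"],
    capture_output=True,
    text=True,
)
print(result.stdout)  # per-packet times plus a min/avg/max summary
```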
Techopedia Explains Latency
In data communication, digital networking and packet-switched networks, latency is measured in two ways: one-way and round trip. One-way latency is the total time it takes a packet to travel from its source to its destination. Round-trip latency adds the time it takes a response to travel back to the source; by convention, it excludes any processing time at the destination point.
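Round-trip latency can be approximated without the raw-socket privileges that ICMP ping requires by timing a TCP handshake, which completes after roughly one round trip. The sketch below takes that approach; the host and port are placeholders, and the result includes connection-setup overhead, so treat it as an estimate:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443) -> float:
    """Estimate round-trip latency by timing a TCP handshake.

    connect() returns once the SYN/SYN-ACK exchange finishes,
    i.e. after about one round trip to the destination.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # close immediately; only the handshake is timed
    return (time.perf_counter() - start) * 1000

# Placeholder host; any reachable TCP endpoint works.
print(f"RTT: {tcp_rtt_ms('example.com'):.1f} ms")
```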
Causes of Latency
In network transmission, the following four factors contribute to latency:
- Delay in Storage: Delays can be introduced by reading or writing to different blocks of memory.
- Device Processing: Each gateway along the path adds latency as it examines and, in some cases, changes a packet's headers.
- Transmission: There are many kinds of transmission media, and all have limits. Transmission delay depends on packet size: smaller packets take less time to transmit than larger ones.
- Propagation: It takes time for a packet to travel from one node to another, even though signals propagate at nearly the speed of light. Both of these delays are computed in the sketch after this list.
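To make the last two items concrete, the sketch below computes transmission and propagation delay from first principles. All of the numbers (packet size, link speed, link length, propagation speed) are illustrative assumptions, not measurements:

```python
# Illustrative, assumed parameters for a single link.
PACKET_SIZE_BITS = 1500 * 8      # one full-size Ethernet frame
LINK_BANDWIDTH_BPS = 100e6       # 100 Mbit/s link
LINK_LENGTH_M = 1_000_000        # 1,000 km of fiber
PROPAGATION_SPEED_MPS = 2e8      # roughly 2/3 the speed of light, in glass

# Transmission delay: time to push every bit of the packet onto the
# wire. It scales with packet size, which is why smaller packets
# finish sooner.
transmission_delay_s = PACKET_SIZE_BITS / LINK_BANDWIDTH_BPS

# Propagation delay: time for a signal to cross the link's length,
# regardless of packet size.
propagation_delay_s = LINK_LENGTH_M / PROPAGATION_SPEED_MPS

print(f"transmission: {transmission_delay_s * 1e3:.3f} ms")  # 0.120 ms
print(f"propagation:  {propagation_delay_s * 1e3:.3f} ms")   # 5.000 ms
```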
Latency, Bandwidth and Throughput
Latency, bandwidth and throughput are sometimes used as synonyms, but the three terms have different meanings in networking. To understand the differences, imagine network packets traveling through a physical pipeline.
- Bandwidth describes how many packets can travel through the same pipeline at one time.
- Latency describes how long it takes a packet to travel through the pipeline from one end to the other.
- Throughput describes the number of packets that travel successfully through the pipeline in a given time period; the sketch after this list puts numbers on how latency can cap throughput.
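The following sketch shows why the distinction matters, modeling a stop-and-wait sender that keeps only one packet in the pipeline at a time. The bandwidth, latency and packet size are assumed numbers; the point is that with 50 ms of round-trip latency, even a 1 Gbit/s pipe delivers well under 1 Mbit/s of throughput in this model:

```python
# Toy stop-and-wait model with assumed numbers: one packet in flight,
# and the sender waits a full round trip before sending the next one.
BANDWIDTH_BPS = 1e9        # 1 Gbit/s pipe (how much fits at once)
RTT_S = 0.050              # 50 ms round-trip latency
PACKET_BITS = 1500 * 8

# Time per packet = time to push it onto the wire + one round trip
# waiting for the acknowledgment.
time_per_packet_s = PACKET_BITS / BANDWIDTH_BPS + RTT_S
throughput_bps = PACKET_BITS / time_per_packet_s

print(f"throughput: {throughput_bps / 1e6:.2f} Mbit/s")  # ~0.24 Mbit/s
```

Real transport protocols such as TCP keep many packets in flight to hide latency, but a similar ceiling reappears whenever the amount of in-flight data is limited.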
RAM Latency
Random access memory latency (RAM latency) refers to the delay that occurs as data moves between RAM and a device's processor.
RAM latency can be manually adjusted by setting memory timings to use fewer memory bus clock cycles. Speeding up memory isn't necessary for most users, but may be helpful for gamers who prefer to overclock their systems.
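As an illustration of how timings relate to absolute time, the sketch below converts a CAS latency, quoted in memory bus clock cycles, into nanoseconds. The DDR4-3200 CL16 figures are an assumed example kit, not a recommendation:

```python
# Assumed example kit: DDR4-3200 with a CAS latency (CL) of 16.
DATA_RATE_MTPS = 3200      # mega-transfers per second
CAS_LATENCY_CYCLES = 16    # cycles between a read command and data

# DDR memory transfers data twice per clock cycle, so the real clock
# runs at half the quoted transfer rate.
clock_hz = (DATA_RATE_MTPS / 2) * 1e6
cycle_time_ns = 1e9 / clock_hz

true_latency_ns = CAS_LATENCY_CYCLES * cycle_time_ns
print(f"true latency: {true_latency_ns:.1f} ns")  # 10.0 ns
```

Fewer cycles or a faster clock both lower the absolute latency, which is what manual timing adjustments aim for.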