Latency
In computing, "latency" describes some type of delay. It typically refers to delays in transmitting or processing data, which can have a wide variety of causes. Two common examples, network latency and disk latency, are explained below.
1. Network latency
Network latency describes a delay that takes place during communication over a network (including the Internet). For example, a slow router may cause a delay of a few milliseconds when one system on a LAN tries to connect to another through the router. A more noticeable delay may occur when two computers on different continents communicate over the Internet: simply establishing the connection takes time because of the distance and the number of "hops" involved. The "ping" response time is a good indicator of the latency in this situation.
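For readers who want to measure this themselves, here is a minimal sketch in Python that times how long a TCP connection takes to establish. This is roughly one network round trip, so it serves as a rough stand-in for ping; the hostname is just a placeholder.

    import socket
    import time

    def tcp_connect_latency_ms(host, port=443, timeout=5.0):
        # Time how long establishing a TCP connection takes.
        # This is roughly one round trip, similar to what ping measures.
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=timeout):
            pass  # connection established; close it immediately
        return (time.perf_counter() - start) * 1000

    # A host on the same LAN usually connects in a millisecond or two,
    # while a host on another continent may take 100 ms or more.
    print(f"{tcp_connect_latency_ms('example.com'):.1f} ms")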
2. Disk latency
Disk latency is the delay between the time data is requested from a storage device and the time the data starts being returned. Factors that affect disk latency include the rotational latency (of a hard drive) and the seek time. A hard drive spinning at 5,400 RPM, for example, has almost twice the average rotational latency of a drive that spins at 10,000 RPM (about 5.6 ms versus 3 ms). The seek time, which involves the physical movement of the drive head to read or write data, also adds to latency. Disk latency is why reading or writing a large number of small files is typically much slower than reading or writing a single contiguous file of the same size. Since SSDs have no spinning platters or moving heads, they avoid both of these delays and have much lower latency than traditional HDDs.
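The rotational numbers above come from simple arithmetic: on average, the drive must wait half a revolution for the target sector to pass under the head. A short Python sketch of the calculation:

    def avg_rotational_latency_ms(rpm):
        # One revolution takes 60/rpm seconds; the average wait is half of that.
        return (60.0 / rpm) / 2 * 1000

    for rpm in (5400, 7200, 10000, 15000):
        print(f"{rpm:>6} RPM: {avg_rotational_latency_ms(rpm):.2f} ms")

    # 5400 RPM waits about 5.56 ms on average, 10000 RPM about 3.00 ms,
    # which is why the slower drive has nearly twice the rotational latency.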
Other types of latency
Many other types of latency exist, such as RAM latency (a.k.a. "CAS latency"), CPU latency, audio latency, and video latency. The common thread among all of these is a bottleneck of some kind that results in a delay. In the computing world, these delays are usually only a few milliseconds, but they can add up to create noticeable slowdowns in performance.
NOTE: It is important not to confuse latency with other measurements like data transfer rate or bandwidth. Latency refers to the delay before a data transfer starts, not the speed of the transfer itself.
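To see the difference in practice, the following sketch (using only Python's standard library; the URL is a placeholder) separates the latency, measured as the time to first byte, from the transfer rate of the remaining data:

    import time
    import urllib.request

    url = "https://example.com/"  # placeholder URL

    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        first_byte = response.read(1)        # latency: wait for the first byte
        ttfb = time.perf_counter() - start
        body = first_byte + response.read()  # bandwidth: read the rest
        total = time.perf_counter() - start

    print(f"Latency (time to first byte): {ttfb * 1000:.1f} ms")
    if total > ttfb:
        rate_mb_s = len(body) / (total - ttfb) / 1e6
        print(f"Transfer rate after first byte: {rate_mb_s:.2f} MB/s")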