What Is Latency? | How to Fix Latency

Internet latency is the delay that occurs when users request web resources. Low latency is an important part of building a good user experience, while high latency can drive users away.

Learning Objectives

After reading this article you will be able to:

  • Understand what latency is and what causes it
  • Explain the differences between network latency, bandwidth, and throughput
  • Describe ways to reduce latency

What is latency?

Latency is the time that passes between a user action and the resulting response. Network latency refers specifically to delays that take place within a network, or on the Internet. In practical terms, this is the delay between when a user clicks a link to a webpage and when the browser displays that webpage.
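
As a rough illustration, latency can be measured from the browser by timing a request. The sketch below assumes a placeholder URL; it sends a HEAD request so that only the round trip, not the response body, is timed.

```typescript
// A minimal sketch of timing a request from the browser.
const url = "https://example.com/"; // placeholder endpoint

async function measureLatency(): Promise<number> {
  const start = performance.now();
  await fetch(url, { method: "HEAD", cache: "no-store" }); // bypass the browser cache
  return performance.now() - start; // elapsed milliseconds
}

measureLatency().then((ms) => console.log(`Request latency: ${ms.toFixed(1)} ms`));
```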

Although data on the Internet travels at the speed of light, the effects of distance and delays caused by Internet infrastructure equipment mean that latency can never be eliminated completely. It can and should, however, be minimized. High latency results in poor website performance, negatively affects SEO, and can drive users to abandon the site or application altogether.

What causes internet latency?

One of the principal causes of network latency is distance, specifically the distance between client devices making requests and the servers responding to those requests. If a website is hosted in a data center in Columbus, Ohio, it will respond fairly quickly to requests from users in Cincinnati (about 100 miles away), likely within 10-15 milliseconds. Users in Los Angeles (about 2,200 miles away), on the other hand, will face longer delays, closer to 50 milliseconds.
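
For intuition, the floor that distance alone puts on latency can be estimated from the speed of light in fiber, roughly 200,000 km per second (about two-thirds of its speed in a vacuum). A back-of-the-envelope sketch:

```typescript
// Back-of-the-envelope estimate of the minimum round trip time imposed by
// distance alone. Real routes add routing and queuing delay on top of this.
const FIBER_SPEED_KM_PER_MS = 200; // light in fiber covers ~200 km per millisecond

function minRttMs(distanceKm: number): number {
  return (2 * distanceKm) / FIBER_SPEED_KM_PER_MS; // out and back
}

console.log(minRttMs(160).toFixed(1));  // Columbus -> Cincinnati (~100 mi): ~1.6 ms
console.log(minRttMs(3540).toFixed(1)); // Columbus -> Los Angeles (~2,200 mi): ~35 ms
```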

An increase of a few milliseconds may not seem like a lot, but this is compounded by all the back-and-forth communication necessary for the client and server to establish a connection, the total size and load time of the page, and any problems with the network equipment the data passes through along the way. The amount of time it takes for a response to reach a client device after a client request is known as round trip time (RTT).
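
To see how those round trips compound, consider a simplified model in which a cold connection pays roughly one round trip each for the DNS lookup, the TCP handshake, the TLS handshake, and the HTTP request itself (the exact count depends on protocol versions):

```typescript
// Simplified model of round trips before the first byte of a response
// arrives on a cold connection. Assumes ~1 round trip each for DNS, the
// TCP handshake, the TLS handshake, and the HTTP request; the real count
// varies (e.g. TLS 1.2 needs two round trips, TLS 1.3 only one).
function timeToFirstByteMs(rttMs: number, roundTrips = 4): number {
  return rttMs * roundTrips;
}

console.log(timeToFirstByteMs(50)); // 50 ms RTT -> ~200 ms before content arrives
```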

Data traversing the Internet usually has to cross not just one, but multiple networks. The more networks that an HTTP response needs to pass through, the more opportunities there are for delays. For example, as data packets cross between networks, they go through Internet Exchange Points (IXPs). There, routers have to process and route the data packets, and at times routers may need to break them up into smaller packets, all of which adds a few milliseconds to RTT.

In addition, the way webpages are constructed can cause slow performance. Webpages that feature a lot of heavy content or load content from multiple third parties may perform sluggishly, because browsers have to download large files in order to display them. A user could be right next to the data center hosting the website they're accessing, but if the website features multiple high-definition images (for example), there may still be some latency as the images load.
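
One way to spot heavyweight assets is the browser's Resource Timing API, which records how large each downloaded resource was. A minimal sketch that could be run from a page's developer console:

```typescript
// Sketch: list a page's five largest downloads using the browser's
// Resource Timing API, to spot heavyweight assets worth optimizing.
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

resources
  .filter((entry) => entry.transferSize > 0)       // ignore cached entries
  .sort((a, b) => b.transferSize - a.transferSize) // largest first
  .slice(0, 5)
  .forEach((entry) => {
    console.log(`${(entry.transferSize / 1024).toFixed(0)} KB  ${entry.name}`);
  });
```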

Network latency, throughput, and bandwidth

Latency, bandwidth, and throughput are interrelated, but they measure different things. Bandwidth is the maximum amount of data that can pass through the network at any given time. Throughput is the average amount of data that actually passes through over a given period of time. Throughput is not necessarily equivalent to bandwidth, because it is affected by latency. Latency is a measurement of time, not of how much data is downloaded over time.
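
To make the relationship concrete: a TCP sender can have at most roughly one receive window of data in flight per round trip, so throughput is capped at the window size divided by the RTT no matter how large the bandwidth is. A simplified upper-bound model:

```typescript
// Upper-bound model of TCP throughput: at most one receive window can be
// "in flight" per round trip, so throughput <= window size / RTT.
// Real stacks scale the window, so treat this as a simplified bound.
function maxThroughputMbps(windowBytes: number, rttMs: number): number {
  const bitsPerRoundTrip = windowBytes * 8;
  return bitsPerRoundTrip / (rttMs / 1000) / 1_000_000;
}

// A 64 KB window over a 50 ms RTT tops out around 10.5 Mbps,
// no matter how much bandwidth the link has.
console.log(maxThroughputMbps(64 * 1024, 50).toFixed(1)); // "10.5"
```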

How can latency be reduced?

Use of a CDN (content delivery network) is a major step towards reducing latency. A CDN caches static content to vastly reduce the RTT. (The Cloudflare CDN makes it possible to cache dynamic content as well with Cloudflare Workers.) CDN servers are distributed in multiple locations so that content is stored closer to end users and does not need to travel as far to reach them. This means that loading a webpage will take less time, improving website speed and performance.
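
As a sketch of what edge caching can look like, the snippet below uses the Cloudflare Workers Cache API to serve responses from the nearest edge location and contact the origin only on a miss; the five-minute TTL and omitted error handling are simplifying assumptions, not production settings:

```typescript
// A minimal sketch of edge caching in a Cloudflare Worker using the
// Workers Cache API. TTL and error handling are simplified assumptions.
export default {
  async fetch(request: Request): Promise<Response> {
    const cache = (caches as any).default; // Workers' default edge cache
    let response = await cache.match(request);
    if (!response) {
      // Cache miss: fetch from the origin, then store a copy at the edge.
      const originResponse = await fetch(request);
      response = new Response(originResponse.body, originResponse); // make headers mutable
      response.headers.set("Cache-Control", "max-age=300");
      await cache.put(request, response.clone());
    }
    return response;
  },
};
```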

Web developers can also minimize the number of render-blocking resources (loading JavaScript last, for example), optimize images for faster loading, and reduce file sizes wherever possible. Code minification is one way of reducing the size of JavaScript and CSS files.
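
One common pattern for keeping non-critical JavaScript from blocking rendering is to inject it only after the page's load event fires. The script path below is a hypothetical example:

```typescript
// One pattern for avoiding render-blocking JavaScript: inject a non-critical
// script only after the page's load event, so it never delays first render.
window.addEventListener("load", () => {
  const script = document.createElement("script");
  script.src = "/js/analytics.min.js"; // hypothetical minified bundle
  document.body.appendChild(script);   // downloads and runs after render
});
```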

It is possible to reduce perceived latency by strategically loading certain assets first. A webpage can be configured to load its above-the-fold area first so that users can begin interacting with the page before it finishes loading ("above the fold" refers to what appears in the browser window before the user scrolls down). Webpages can also load assets only as they are needed, using a technique known as lazy loading. These approaches do not actually improve network latency, but they do improve the user's perception of page speed.
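
Lazy loading is often implemented with the browser's IntersectionObserver API. A minimal sketch, assuming images store their real source in a data-src attribute:

```typescript
// A minimal lazy-loading sketch: images declare their real source in a
// data-src attribute and are fetched only as they approach the viewport.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src ?? ""; // start the real download
      obs.unobserve(img);              // each image only needs this once
    }
  },
  { rootMargin: "200px" } // begin loading slightly before images are visible
);

document
  .querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => observer.observe(img));
```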

How can users fix latency on their end?

Sometimes, network latency is caused by issues on the user's side, not the server side. Consumers always have the option of purchasing more bandwidth if slow performance is a consistent issue, although more bandwidth alone does not guarantee lower latency. Switching from WiFi to a wired Ethernet connection results in a more consistent connection and typically improves speed. Users should also make sure their Internet equipment is up to date by applying firmware updates regularly and replacing equipment altogether as necessary.