Understanding Network Bandwidth vs Latency

By Cody Arsenault
Updated on June 10, 2021

The terms "network bandwidth" and "network latency" are sometimes used interchangeably, but they describe two separate concepts. Understanding the difference between them is key to understanding frontend web performance. This guide walks you through how to maximize your network's bandwidth while minimizing latency so you can deliver a fast, flawless user experience (UX).

Bandwidth vs latency

Bandwidth and latency can be quickly defined as follows:

  • Latency is the amount of time it takes for data to travel from one point to another. It depends on the physical distance the data must cover through cables, routers, and other network infrastructure to reach its destination.
  • Bandwidth is the maximum rate at which data can be transferred, typically measured in megabits per second (Mbps) or gigabits per second (Gbps). As the term implies, it got its name because transfer speed used to depend largely on the (literal) width of a communication band. The sketch below shows how the two interact.
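To see how the two combine, here is a minimal back-of-the-envelope sketch; the page size, link speeds, and latency figures below are illustrative assumptions, not measurements:

```python
# Rough model: time to fetch a single resource over an otherwise idle connection.
# fetch_time ≈ round-trip latency + transfer time (size / bandwidth)

def fetch_time_ms(size_kb: float, bandwidth_mbps: float, latency_ms: float) -> float:
    transfer_ms = (size_kb * 8) / (bandwidth_mbps * 1000) * 1000  # KB -> kilobits, then ms
    return latency_ms + transfer_ms

# A 500 KB page over a 100 Mbps link with 100 ms of latency:
print(fetch_time_ms(500, 100, 100))   # ~140 ms: latency is most of the wait
# The same page over a 1 Gbps link with the same latency:
print(fetch_time_ms(500, 1000, 100))  # ~104 ms: 10x the bandwidth barely helps
```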

When most people reference "internet speed," they are usually talking about network bandwidth in the units described above, Mbps and Gbps. For example, a standard home internet connection might be 100 Mbps, while a data center may have several 10 Gbps lines. It's important to keep these definitions and units straight when researching services like content delivery networks (CDNs) or web hosts, which often charge clients based on formulae that combine inbound bandwidth (traffic that other computers send to your server) and outbound bandwidth (traffic that your server sends out to clients and other servers).

Bandwidth may be a huge factor in web transfer speeds, but it is far from the only thing that can make (or break) a connection. No matter how much data you can send and receive at once, it can only travel as fast as latency allows. This means a website will feel slower for some users depending on their physical location, even if both the user and the server have excellent internet connections. Figuring out how to reach users faster from every point on the globe is what reducing latency is all about.
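One way to see why latency caps real-world speed regardless of the size of the pipe: a single TCP connection can only keep roughly one window's worth of data in flight per round trip. A rough sketch, where the 64 KB window is an assumption for illustration:

```python
# Upper bound on single-connection TCP throughput: window size / round-trip time.
def max_throughput_mbps(window_kb: float, rtt_ms: float) -> float:
    return (window_kb * 8) / rtt_ms  # kilobits per millisecond == megabits per second

# Assuming a 64 KB receive window:
print(max_throughput_mbps(64, 10))   # ~51 Mbps at 10 ms RTT
print(max_throughput_mbps(64, 100))  # ~5 Mbps at 100 ms RTT, even on a gigabit link
```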

What is broadband?

Long ago, internet signals traveled through the same infrastructure as landline telephones. The Dark Age of Dial-up paved the way for the inception of broadband. Broadband is a high-capacity transmission technique that takes advantage of multiple frequencies to transmit large chunks of data simultaneously.

Today, most of the web's data travels through optical fibers: strands of glass thinner than a human hair that transmit pulses of light. Metal wires are still sometimes used, although they require more maintenance and are vulnerable to signal loss and electromagnetic interference. Optical fiber is currently one of the best transmission options for bandwidth because each fiber can carry several different wavelengths of light at once thanks to a process called wavelength-division multiplexing (WDM). Most fiber optic cables contain at least four fibers, which greatly increases the possible data flow. It would take thousands of copper wires to achieve the same bandwidth capacity, which is why all subsea and transcontinental links are fiber-optic.

The millions of fiber links that make up the backbone, or the core data paths, of the internet can transmit hundreds of terabits every second. Bandwidth at the edges of the internet, however, is limited by deployed technologies and the performance of local routers.

The available bandwidth to the user is a function of the lowest capacity link between the client and the destination server.

- High Performance Browser Networking
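In other words, end-to-end capacity is set by the narrowest hop on the path. A one-line illustration with made-up link speeds:

```python
# End-to-end bandwidth is bounded by the slowest link on the path (speeds in Mbps).
path = {"fiber backbone": 100_000, "ISP uplink": 1_000, "home Wi-Fi": 300, "client NIC": 1_000}
print(min(path.values()))  # 300 -> the Wi-Fi hop caps the whole connection
```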

The limitations of latency

Latency wasn't something that users or developers worried much about in previous decades. Personal internet connections were far slower during the 1990s and early 2000s; for much of the world, going online over dial-up meant tying up the landline phone. The delay between sending a request and receiving a response was negligible compared to the time it took for downloads to complete.

Today, higher bandwidth connections have made downloads much faster, so latency often accounts for a greater proportion of wait time. For example, an image may take just 5 milliseconds to actually download, but it's typical for users to wait 100-150 milliseconds before receiving the first byte of data. In that case, latency accounts for roughly 95 percent or more of the time it takes to request and download the image.

Cloud computing and mobile technologies have made it easier for web developers to reach a global audience, but they have also unmasked the limitations of latency. No matter the size of your network bandwidth, high latency can drag down an application's performance. Every 20 milliseconds of network latency can add between 7 and 15 percent to the overall page load time, so excess latency can quickly bring streaming video to a halt.
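Running the numbers behind the two claims above (illustrative figures, not benchmarks):

```python
# Share of total wait attributable to latency in the image example above.
latency_ms, download_ms = 100, 5
print(latency_ms / (latency_ms + download_ms))  # ~0.95 -> latency is ~95% of the wait

# If every 20 ms of latency adds roughly 7-15% to page load time,
# an extra 100 ms of latency adds roughly:
extra_latency_ms = 100
low, high = (extra_latency_ms / 20) * 7, (extra_latency_ms / 20) * 15
print(f"{low:.0f}-{high:.0f}% longer page loads")  # 35-75%
```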

5 Tips for optimizing network bandwidth

Many network managers have the seemingly impossible task of providing an optimal end-user experience while limiting operational costs. No matter how much network bandwidth you're paying for, here are some tips to help make sure you're getting your money's worth:

1. Understand the difference between bandwidth and throughput

These two terms are also used interchangeably, yet they have different meanings. Bandwidth refers to the theoretical capacity of your communication channel, while throughput is the amount of data your systems actually manage to send and receive over it. It's therefore possible that your hardware isn't capable of utilizing your maximum bandwidth. Network World has a helpful article with additional tips for increasing throughput.
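If you want a quick reality check on throughput (as opposed to the bandwidth you pay for), timing a known download gives a rough figure. A minimal sketch using only Python's standard library; the URL is a placeholder you would point at a large test file on your own infrastructure:

```python
import time
import urllib.request

def measure_throughput_mbps(url: str) -> float:
    """Download a file and report the achieved throughput in megabits per second."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        data = response.read()
    elapsed = time.monotonic() - start
    return (len(data) * 8) / (elapsed * 1_000_000)

# Example usage (placeholder URL - point this at a large test file you control):
# print(measure_throughput_mbps("https://speedtest.example.com/100MB.bin"))
```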

2. Weigh your performance tradeoffs

Poor performance isn't always caused by insufficient network bandwidth. Keeping track of how busy your links are can help you better understand the relationship between bandwidth and performance. For example, an under-utilized link may be holding bandwidth that an over-utilized link could put to better use, and it may be worth sacrificing capacity in one place to give a performance boost to another.

3. Choose the right monitoring tools

Speaking of over-utilized network bandwidth, there are plenty of website monitoring tools that can help you figure out exactly how your resources are being allocated. Analyzing long-term trends isn't always helpful because they can mask utilization peaks. Likewise, looking at just the peaks only tells you which links are the busiest. Good monitoring software can give you a full picture to guide your optimization efforts.
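The point about averages masking peaks is easy to demonstrate: a link that looks half-idle on average can still saturate several times a day. A small sketch with hypothetical five-minute utilization samples:

```python
# Averages hide peaks; percentiles and maxima reveal them.
# Hypothetical five-minute utilization samples (percent of link capacity).
samples = [20, 25, 30, 22, 95, 97, 28, 24, 92, 26, 23, 21]

average = sum(samples) / len(samples)
p95 = sorted(samples)[int(len(samples) * 0.95) - 1]

print(f"average: {average:.0f}%")   # ~42% - looks comfortable
print(f"95th percentile: {p95}%")   # 95% - the link is actually saturating
print(f"peak: {max(samples)}%")     # 97%
```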

4. Make sure bandwidth is used for business purposes

If you have a big team all working on the same network, employees running Netflix on their desktops could be slowing things down for everyone, including end users. Surfing the web while you work is a common workplace habit these days, but try to limit recreational internet use on company networks. If a business application is causing a lot of congestion, have your IT team determine whether it can be optimized or whether it should be removed from the network altogether.

This is where virtual LANs (VLANs) come into play. A VLAN logically segments the network so that traffic from certain sources is kept separate, and a set amount of bandwidth can be dedicated to a particular group of machines. This is often the cheapest and fastest answer to this growing issue.

5. Use proactive capacity planning

Even with the right monitoring software, keeping track of your link activity gets more time-consuming as networks grow in complexity. Nonetheless, you should always make time for capacity planning. Don't worry about links that see little traffic; prioritize the busiest ones. Set up customized alerting that lets you know when bandwidth exceeds 80 percent for three consecutive minutes, and have a reaction plan ready. Otherwise, you could receive a 509 Bandwidth Limit Exceeded error from your hosting provider.
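The "80 percent for three consecutive minutes" rule is straightforward to encode. A minimal sketch of such an alert loop; read_utilization() and send_alert() are hypothetical placeholders for whatever your monitoring stack actually provides:

```python
import time

THRESHOLD_PCT = 80       # alert when utilization stays above this...
CONSECUTIVE_MINUTES = 3  # ...for this many one-minute samples in a row

def read_utilization() -> float:
    """Hypothetical placeholder: return current link utilization as a percentage."""
    raise NotImplementedError("wire this to your SNMP/NetFlow/cloud metrics source")

def send_alert(message: str) -> None:
    """Hypothetical placeholder: notify whichever channel your team actually uses."""
    print(message)

def watch_link() -> None:
    streak = 0
    while True:
        if read_utilization() > THRESHOLD_PCT:
            streak += 1
            if streak >= CONSECUTIVE_MINUTES:
                send_alert(f"Bandwidth above {THRESHOLD_PCT}% for {streak} minutes in a row")
        else:
            streak = 0
        time.sleep(60)  # sample once per minute
```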

If you're already using any of the major cloud hosts, you may want to look into load balancing. This feature often comes bundled with hosting and lets you automatically scale your servers to match demand. Since you only pay for the bandwidth you actually use, this model rarely works against you.

5 Tips for reducing network latency

Remember, bandwidth is only half of the behind-the-scenes work that goes into a successful page load or web stream. These days, reducing latency is arguably even more important for systems admins to keep in mind, especially when it comes to the responsiveness of a site's frontend. Here are some pointers to help you reach users faster.

1. Know how visitors connect to you

Set up analytics software such as Google Analytics to learn what types of devices your users primarily use to access your site. If the majority of your traffic comes from mobile devices, that information can help guide how you allocate web resources. In Google Analytics, this information is available under Audience > Mobile > Overview.

2. Consider content delivery networks

Using a content delivery network (CDN) can reduce latency for static assets and some other types of content. CDNs are networks of distributed servers that deliver content from the location closest to the user, shortening the distance data has to travel. They can also improve other aspects of a website, such as security and reliability. Read more about 10 reasons why to use a CDN.

3. Monitor and analyze network bottlenecks

Make note of your network traffic flows at different times of the day to pinpoint bottlenecks. You can reduce the latency of congested network nodes by adding more processing power or network adapters to a particular server. Reducing the number of nodes and centralizing network connection points can also cut down on latency.

Keep in mind that this does not mean it's your responsibility to stay awake 24/7 watching your servers; let modern software work for you. Most cloud services include free, built-in tooling that charts your peak usage times, and you can usually configure it to provision more infrastructure automatically ahead of predictable peaks, saving both money and time.

4. Know your cloud infrastructure

Admins should know exactly how data moves from in-house equipment, across various servers, and finally to users' devices. Auditing tools like CloudSleuth and Stackdriver can help you chart the course of your data's travels. These services simulate real-world app conditions by running simultaneous transactions against servers in different locations worldwide, then measure the response time of the application at each cloud provider to map the journey, which can steer your optimization efforts.
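You can get a rough version of the same picture yourself by timing the first byte from each region or provider you deploy to. A minimal sketch; the endpoint URLs are placeholders for your own deployments:

```python
import time
import urllib.request

# Hypothetical regional endpoints - substitute the deployments you actually run.
ENDPOINTS = {
    "us-east": "https://us-east.example.com/health",
    "eu-west": "https://eu-west.example.com/health",
    "ap-south": "https://ap-south.example.com/health",
}

def time_to_first_byte_ms(url: str) -> float:
    """Rough time-to-first-byte: request the URL and stop after one body byte."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        response.read(1)
    return (time.monotonic() - start) * 1000

for region, url in ENDPOINTS.items():
    print(f"{region}: {time_to_first_byte_ms(url):.0f} ms")
```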

5. Other latency optimization tips

A few other quick wins can further reduce latency: enable HTTP/2 so multiple requests can share a single connection, cut down on external HTTP requests, make use of browser caching, and use prefetching methods such as DNS prefetch or preconnect so lookups are resolved before they are needed.

Summary

Now that we are at the dawn of a new generation of networks, we can expect old challenges to fade into memory while new ones inevitably arise. Broadband has made the web far faster than it was only a decade ago, but we still have a long way to go before we can easily get around the latency limitations that stand between us and truly lightning-fast internet service.
