Many of us are used to thinking about the quality of our internet connection in terms of speed: a gigabit connection (carrying 1,000 megabits of data per second) is the latest must-have. But these speeds — proudly advertised by internet service providers — do not tell the full story about how good our connections are. The other half of the story is latency, or the responsiveness of our connection. Where speed, or bandwidth, is the amount of data you can receive in a second, latency is how long it takes each piece of data to reach you. This responsiveness determines how powerful our internet connections feel: a high-latency, unresponsive connection will make even the fastest advertised speeds feel slow. In other words, it doesn’t matter if you can download 1,000 megabits of data per second if each piece of data takes a long time to reach you.
Wait, what are we talking about?
What the industry has classified as internet speed is in fact the volume of data that can be transferred every second. The more data that can be transferred per second, the easier it is to stream videos (download) and send large files to friends (upload). This is also known more accurately as bandwidth. It’s important to note, however, that the majority of internet connections in the U.S. run on copper wire or cable, and do not hit the maximum possible data transfer. In other words, the advertised speeds are often a “best-effort” promise.
If bandwidth is the number of lanes on a highway, latency is how long it takes each car, on average, to get from A to B. If you load a website on a high-speed, high-latency connection, there will be a long delay before the page starts downloading, and then it will show up almost all at once. With a high-speed, low-latency connection (the ideal), the same website appears immediately and downloads all at once.
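To make that comparison concrete, a back-of-the-envelope model is: time to fetch a page ≈ latency + (page size ÷ bandwidth). The sketch below uses illustrative numbers, not measurements of any real network.

```python
# A minimal sketch of why latency matters: fetch time is roughly the
# round-trip latency plus the raw transfer time. Protocol overhead,
# multiple round trips, and congestion are all ignored here.

def fetch_time_ms(size_megabits, bandwidth_mbps, latency_ms):
    """Approximate time (in milliseconds) to download a file."""
    transfer_ms = size_megabits / bandwidth_mbps * 1000
    return latency_ms + transfer_ms

page = 8  # an 8-megabit (1-megabyte) web page, an assumed example size

# High speed, high latency: a long wait, then the page arrives at once.
print(fetch_time_ms(page, bandwidth_mbps=1000, latency_ms=600))  # 608.0

# High speed, low latency (the ideal): the page appears almost instantly.
print(fetch_time_ms(page, bandwidth_mbps=1000, latency_ms=10))   # 18.0
```

On a gigabit line, the transfer itself takes only 8 milliseconds; almost all of the difference between the two connections is the waiting, which is exactly the point.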
Why is latency important?
You might have a six-lane highway (good bandwidth!), but if you’re driving a slow car it will still take you a long time to reach your destination.
Some tasks, like playing online games or video calling, require the rapid exchange of data between two points. In these cases of quick-fire interaction, good bandwidth doesn’t get you all the way, and latency becomes vitally important. If you’re gaming on a connection with significant latency (low responsiveness), your actions will be delayed, and events happening in the game will also lag noticeably rather than feeling near-instantaneous. Similarly, if you’re video chatting with someone on a high-latency connection, you will be out of sync and will struggle to avoid talking over each other as a result of the delay.
Looking ahead, only powerful internet connections with low latency will deliver a future with the capacity for fully distributed medicine. These connections — with the power to transmit vast amounts of data both ways almost instantaneously — are essential to move beyond basic video calls with care providers, to real-time monitoring akin to being inside a hospital, and to aging in place with medical-grade devices connected to doctors right inside the home. Telemedicine will improve access to care and quality of life, and it has also been estimated that it could cut healthcare costs by tens of billions of dollars.
How can we build high bandwidth, low latency connections?
Bandwidth is governed primarily by the composition of the physical network — whether it’s made from copper wire or fiber-optic cables. As Harvard Professor Susan Crawford illustrates in her must-read new book, Fiber, the difference between copper and fiber is the difference between how much water (data) can flow through a “trickling garden hose” and a “15-mile-wide river.” A network of fiber-optic cables also provides symmetrical bandwidth — the same capacity for download and upload.
Latency, or connection responsiveness, is also determined in large part by the material of the network. Fiber-optic networks can transfer vast amounts of data (bandwidth) at near the speed of light (low latency). Copper networks require electronics to be placed more frequently along the line to counter signal loss, and each of these devices adds processing time that creates more latency. Fiber networks have fewer of these stops along the way, and therefore fewer delays, because light can travel greater distances with less signal loss over pure fiber-optic cables. Of course, the quality of the electronics also plays a part, as does the construction of the network when the distance between the source and the end internet user can be reduced.
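The reasoning above can be sketched as a simple sum: one-way latency ≈ propagation delay (distance ÷ signal speed) plus a processing delay at every repeater along the path. The hop counts and per-device delays below are illustrative assumptions, not vendor specifications; the point is that more in-line electronics means more added delay.

```python
# A rough sketch, under assumed numbers, of how repeater electronics
# add to latency. Signals in both fiber and copper travel at roughly
# two-thirds the speed of light (~200,000 km/s); the main difference
# modeled here is how many devices the signal must pass through.

def one_way_latency_ms(distance_km, signal_speed_km_s, hops, per_hop_ms):
    """Propagation delay plus processing delay at each in-line device."""
    propagation_ms = distance_km / signal_speed_km_s * 1000
    return propagation_ms + hops * per_hop_ms

# Fiber: signal loss is low, so few regeneration stops are needed.
print(one_way_latency_ms(100, 200_000, hops=1, per_hop_ms=0.5))   # 1.0 ms

# Copper: electronics are placed more frequently to fight signal loss,
# and each device contributes its own processing delay.
print(one_way_latency_ms(100, 200_000, hops=20, per_hop_ms=0.5))  # 10.5 ms
```

Over the same 100 km, the propagation delay is identical; the copper path in this toy model is slower only because of the extra stops along the way.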
We have been made to believe that good bandwidth — how much data can be transferred every second — translates to a fast internet connection. In reality, latency — how quickly that data reaches you — is an equally important factor in a powerful internet connection. Only with a high-bandwidth, low-latency connection can we receive vast amounts of data almost instantaneously and build the critical foundation for all next-era services, such as 5G, telehealth, and entertainment.
To close the digital divide, and to ensure the continued economic output and wellbeing of our nation’s communities, we need truly powerful internet connections. And that means fiber-optic infrastructure as an essential replacement for copper wire and coaxial cable. We must take our future into our own hands and make open-access fiber-optic infrastructure the defining infrastructure investment of our time. This critical change to the status quo will save us money, supercharge our economies, and improve quality of life for everyone.