Time to Celebrate: The Web Grows Up
Raise Your Glasses To The Future King: HTTP/2
A well-catered meal can feel as choreographed as a ballet: every dish arriving on cue, adding to the melded and wondrous experience of fine food and rich conversation. Empty appetizer plates swim swiftly off into the air as they are replaced by soups, then light proteins or freshly sliced cheeses, leading in finely orchestrated fashion to the main course, before the crescendo of the evening finally climbs back to earth as conversations grow more intimate and tousled napkins give way to pies, pastries, and healthy pours of deep, exotic coffees and teas.
Our experience of the web should similarly match the nuance, precision, and delight of both a finely tuned ballet and a well-hosted meal. Instead, it can often feel like being screamed at by a stranger wearing paper bags for shoes while stuck in an endless DMV waiting line, and, like working with a sloth running that DMV line, the speed of our interactions on the web depends far more on latency than bandwidth.
We have a tendency to enjoy the bliss of ignorance until something goes wrong, when perhaps we ought to try instead to notice precisely what is going even subtly right.
Soon, your daily experience of the web, along with the experiences of your clients, coworkers, and even the stranger from the DMV, will likely begin to feel serendipitously lighter.
It is my prediction that this will mean greater relational integrity across your online interactions, a far more graceful experience procuring the information you seek, and perhaps even a near providential ability to capitalize upon opportunities and achieve goals.
The very makeup of the web itself is undergoing its most significant and intentional change ever, and the result, if capitalized upon, promises to change the way we utilize it as well as, for the first time, the way it responds to us.
The Web of 2017
When we study an ornate gown, we are not often aware of the considerable time spent stitching, snipping, gluing, folding, creasing, pressing, and pinning multiple disparate pieces of cloth into one seemingly continuous textile. We do not watch the Oscars and remark at how satin fades into velvet, we simply comment on the piece as a newly joined whole.
Similarly, although we at times afford ourselves permission to refer to the web as a place, it is in actuality a material from which many things and places are created — from social networks to shopping sites to online encyclopedias.
The nature of this material is that it is not one continuous fabric, but rather a creatively layered and joined assemblage of evolving and interchangeable technologies each pulling on one another or lending themselves to some specific task.
As the web has developed, however, its core technologies have ballooned into hundreds of assets for branding and design, tracking data, accommodating older devices, providing animation, reducing browser load time, modifying image sizes, delivering appropriate content, delivering social content, importing calendars, allowing real-time interaction…
Most modern websites could justifiably be labeled enormous, if not for their data footprint then merely for their complexity. It takes hundreds of requests to assemble a single modern web page and yet most modern browsers can only request four to eight of the resources they need at a time and spend nearly half their load times just waiting for resources after requesting them.
The HTTP protocol is the framework to which all these technologies are sewn, and yet, having entered service in 1997, it is by now so outdated that it cannot support the weight of the modern web. Thus, a change is inevitable, for the sake of the current web as well as for the continued innovation and improvement of the web to come.
Time for the Right Change
The problem with HTTP is that it was not designed to handle the number of requests needed to create a modern web page. Using it is like calling your mother and after each sentence hanging up only to dial again, exchange salutations all over, and state your next sentence. This does not mean, however, that we should give up and resort to carrier pigeon, or smoke signals, or bringing back Morse code.
HTTP leverages the underlying TCP protocol (rules for communication) to open a connection, send a message, and close the connection.
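That open-send-close cycle is easy to see in the protocol's plain-text form. Here is a minimal Python sketch (the host and path are illustrative placeholders, not a real endpoint) that builds one HTTP/1.1 request with `Connection: close` and parses the status line of a response:

```python
# Minimal sketch of one open-send-close HTTP exchange in its plain-text
# form. The host and path are illustrative placeholders.

def build_request(host, path):
    # Each request is just CRLF-delimited text; with "Connection: close"
    # the whole connect/request/response/teardown cycle repeats per asset.
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode("ascii")

def parse_status(raw_response):
    # The status line is the first CRLF-terminated line, e.g. "HTTP/1.1 200 OK".
    status_line = raw_response.split(b"\r\n", 1)[0].decode("ascii")
    version, code, reason = status_line.split(" ", 2)
    return int(code)

print(build_request("www.example.com", "/index.html").decode("ascii"))
print(parse_status(b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n"))  # → 200
```

Every asset on a page repeats some version of this dance, which is exactly why the connection-per-request model becomes the bottleneck.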
With iteration 1.1, this specification grew tolerant of multiple requests over the same connection, but the limits of these requests are paltry compared to the demands of today’s applications and websites. Even tacked-on and hurried ideas like connection persistence and HTTP pipelining fell short of their goals and gave way to an even more fragmented landscape of counter-solutions like SOAP.
Though we’ve distanced ourselves to a degree from the days of heavy-handed workarounds like SOAP, we nonetheless have been forced as developers to resort ever since to tricks to ensure performance of websites despite working with an odd and insufficient underlying infrastructure.
Things like removing all the spaces from our code, daisy-chaining assets, fusing images together into malformed graphical golems, or one of many clever ways to “creatively leverage” browser cache or selectively refresh content to avoid the wrath of making additional HTTP requests — all to sidestep the underlying problem: the protocol’s limitations themselves.
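The asset daisy-chaining trick is simple enough to sketch. Assuming a hypothetical build step (the file names below are invented for illustration), many small stylesheets get merged into one file so the page costs a single request instead of dozens:

```python
from pathlib import Path

# Sketch of the classic concatenation workaround: merge many small
# assets into one file so the page makes one request instead of dozens.
# The file paths are hypothetical stand-ins for a site's real assets.

def bundle(asset_paths, out_path):
    # Join each asset's text with newlines and write the combined file.
    merged = "\n".join(Path(p).read_text() for p in asset_paths)
    Path(out_path).write_text(merged)
    return len(merged)  # size of the combined bundle in characters

# e.g. bundle(["reset.css", "grid.css", "theme.css"], "site.min.css")
```

Under a multiplexed HTTP/2 connection this kind of bundling loses much of its value, since many small requests no longer carry a heavy per-request penalty.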
The quest for performance has taken the development community so far that some have pushed for a resurgence of static site generators to get away from relying on server architecture entirely. Opponents have argued this could hurt the progress of the Open Source Software community and almost anyone would agree that widespread adoption of this practice would render the web feeble. Performant, but feeble.
The irony is that our workarounds have only been distractions from the underlying problem in the very architecture that undergirds web development as a whole.
Time to Get Out of the Weeds
Returning to the restaurant analogy, the way HTTP 1.1 handles requests in chunks would be tantamount to your waiter coming to your table to take your order, then returning twenty minutes later to take the order of your significant other, then twenty minutes later to ask if you wanted anything to drink…
It’s no surprise that this absent-minded waiter has difficulty in a number of other areas as well, since he never writes anything down and instead relies on a mnemonic device involving state capitals.
“You have a mole on your earlobe that looks like Lansing, Michigan; Michigan is blistering, so you…ordered…CHICKEN PARM!”
Though a valiant effort by a “rushed” (and now redeemed) team of developers working part-time on an uncertain set of standards, HTTP 1.1 has not aged well for many reasons:
Request and Response Relationships are Opaque
HTTP 1.1 cannot distinguish which responses belong to which requests unless they return in the exact order the requests were sent. Even with the advent of pipelining, which never quite caught on due to poor and inconsistent support, large requests would hold up the entire chain.
When your request is a number and you are identified only by the number of your request, you feel closer to a DMV vending machine than fine dining. If a waiter referred to you solely as Table 3, you would not be pleased. Similarly, if anyone referred to me as “Chicken Parm Guy” — I would consider them to be uttering fighting words. There is a better way to return data than strictly sequentially.
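The cost of that strict ordering can be shown with a toy simulation: the server must answer in request order, so a small asset queued behind a large one waits no matter how quickly it was actually prepared. The asset names and millisecond costs below are made up for illustration:

```python
# Toy simulation of pipelined HTTP 1.1's ordering rule: responses are
# delivered strictly in request order, so one slow resource delays
# everything queued behind it. Names and costs are invented.

requests = [("style.css", 30), ("huge-image.jpg", 400), ("app.js", 25)]

def delivery_times_pipelined(reqs):
    finished, clock = {}, 0
    for name, cost in reqs:  # the server must answer strictly in order
        clock += cost
        finished[name] = clock  # time (ms) at which this asset arrives
    return finished

print(delivery_times_pipelined(requests))
# app.js needs only 25 ms of work but is not delivered until 455 ms,
# because it sits in line behind the 400 ms image.
```

HTTP/2's out-of-order frames are precisely the escape hatch from this queue.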
Transmission of Redundant Information
HTTP 1.1 by default retransmits identical header information back and forth to the server, at a performance cost of 40–60% of the total request. There are methods to remove these duplicate headers, but they are complex and unnecessary.
This is like your waiter asking your name each time you interact, indicating either he estimates you to be of very low importance or that he has precisely as robust a short-term memory as a pair of sandals.
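To get a feel for the overhead, here is a small sketch that counts the wire size of a typical header block (the values are hypothetical examples) and what resending it verbatim on a hundred-request page costs:

```python
# Hypothetical header block resent verbatim with every HTTP 1.1 request.
HEADERS = {
    "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/50.0",
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9",
    "Accept-Language": "en-US,en;q=0.5",
    "Cookie": "session=abc123; theme=dark; consent=yes",
}

def header_bytes(headers):
    # Wire size of the headers as "Name: value\r\n" lines.
    return sum(len(f"{name}: {value}\r\n".encode("ascii"))
               for name, value in headers.items())

per_request = header_bytes(HEADERS)
# A page assembled from 100 requests resends the identical block 100 times.
print(per_request, "bytes per request,", per_request * 100, "bytes total")
```

None of those repeated bytes carry new information after the first request, which is the redundancy HTTP/2's header compression targets.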
Responses Inevitably Bottleneck
When the web server begins to return items, ready responses cannot be delivered because requests further up the chain have not yet been fulfilled. Sometimes this is because those requests are very large, but in any case, it is a major inconvenience and not at all like the ornate and enjoyable feast we first described.
This should be called web-server constipation but is actually called head-of-line blocking, and it can be a major inconvenience when your salad request was unfortunately queued behind your pasta dish, resulting in both arriving simultaneously and in disgusting disarray. If you’ve ever loaded a page devoid of styling, or with a broken layout, or with missing components, perhaps you have fallen victim to this dilapidated mish-mash formerly known as salad.
Of course, these nightmare scenarios are conditional on your requests resolving at all — without a hack in place the request may not be fulfilled, the kitchen closes down, and the browser is forced to deliver a cute 404 page with cartoon monsters or a T-Rex high-jumping over cacti while you crumple over in hunger pangs — no soup for you. No web page. Just this obnoxious 8-bit dinosaur taunting you by metaphorically living out your existential struggle with this horribly broken protocol. Forever.
HTTP/2 is a protocol built on top of TCP that originated inside Google under the name SPDY and was endorsed by Mozilla before formal vetting, revision, trimming, and approval by the IETF in 2015. After 17 drafts, 30 implementations, and 200 design issues across two years of collaboration from the foremost minds in web development, HTTP/2’s official spec is set to replace (with backwards compatibility) HTTP 1.1.
HTTP/2 has been built with the modern web in mind and offers many optimizations and improvements over its predecessor:
One Connection in Both Directions
HTTP/2 opens one TCP connection to the server and transmits all its requests via streams inside this same connection, improving security and performance on both the browser and network side and, most of all, eliminating the need for hacks that avoid round trips.
Requests can be Submitted Together
Requests can be chained together and submitted all at once via multiplexing, which allows numerous assets to begin preparation simultaneously. Better still, they can be returned out of order: unlike HTTP 1.1, which transmits plain text that must be laboriously parsed, HTTP/2 transmits data in compact binary frames, each tagged with the identifier of its originating stream. These frames can be interleaved and prioritized, allowing quicker requests to complete their round trip while the server works on larger ones.
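The binary framing is concrete enough to sketch. Per the HTTP/2 spec, every frame begins with a 9-byte header: a 24-bit payload length, an 8-bit type, 8 bits of flags, and a 31-bit stream identifier that ties the frame back to its request. The example values below are illustrative:

```python
import struct

# Sketch of the 9-byte HTTP/2 frame header: 24-bit payload length,
# 8-bit type, 8-bit flags, and a 31-bit stream identifier (the high
# bit is reserved). Type 0x0 is DATA, 0x1 is HEADERS.

def pack_frame_header(length, frame_type, flags, stream_id):
    # Big-endian: 3 length bytes, then type, flags, and the stream id.
    return (struct.pack(">I", length)[1:]
            + struct.pack(">BBI", frame_type, flags, stream_id & 0x7FFFFFFF))

def unpack_frame_header(header):
    length = int.from_bytes(header[0:3], "big")
    frame_type, flags = header[3], header[4]
    stream_id = int.from_bytes(header[5:9], "big") & 0x7FFFFFFF
    return length, frame_type, flags, stream_id

# A 16-byte DATA frame with the END_STREAM flag on stream 3.
hdr = pack_frame_header(16, 0x0, 0x1, 3)
print(unpack_frame_header(hdr))  # → (16, 0, 1, 3)
```

Because every frame carries its stream identifier, the receiver can reassemble interleaved streams in any arrival order — the property HTTP 1.1's sequential text responses lacked.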
Requests Provide Meaningful Feedback
Also, unlike its predecessor, HTTP/2 provides built-in mechanisms to ensure requests are fulfilled successfully, and it improves request reliability by outlining the conditions under which a status message can be returned in response to a request. Additionally, HTTP/2 makes it possible to ping a connection without submitting any data, to ensure connections are still strong.
Headers are Compressed and Optimized
One set of headers surrounds most typical connections, with special cases for things like server push and the continuation of a connection. Most of the redundancy is removed.
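The compression scheme, HPACK, rests on a simple idea: both ends keep an indexed table of header pairs they have already exchanged, so a repeated header travels as a small integer rather than full text. The toy codec below illustrates only that core idea; real HPACK adds a predefined static table and Huffman coding, both omitted here:

```python
# Deliberately simplified sketch of HPACK's core idea: an indexed table
# of previously seen header pairs lets repeats travel as tiny integers.
# Real HPACK adds a static table and Huffman coding; this toy does not.

class ToyHeaderCodec:
    def __init__(self):
        self.table = []  # append-only index of (name, value) pairs

    def encode(self, headers):
        out = []
        for pair in headers:
            if pair in self.table:
                out.append(self.table.index(pair))  # send just the index
            else:
                self.table.append(pair)
                out.append(pair)  # send the literal, index it for next time
        return out

encoder = ToyHeaderCodec()
first = encoder.encode([("cookie", "session=abc123"), (":method", "GET")])
second = encoder.encode([("cookie", "session=abc123"), (":method", "GET")])
print(first)   # full literals on the first request
print(second)  # → [0, 1] — indices only on the repeat
```

This is the waiter finally remembering your name: the bulky cookie header crosses the wire once, then shrinks to an index for every subsequent request.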
Lets the Server take an Active Role
HTTP/2 allows interplay between the server and the browser by letting the server push down additional assets that may be relevant to a request. This reduces round trips, improves performance, and could potentially be leveraged to improve content personalization.
Essentially, HTTP/2 hops right off a Himalayan mountain from doing charity work with Nepalese orphan puppies into a pair of perfectly cut dungarees, intent on remembering your entire extended family’s 47 dinner orders, and when he forgets the extra butter for Nana he provides both a meaningful apology and a way to make it right. Best of all, the food is prepared with love and brought out as it’s ready, because he communicates with the kitchen in a language they clearly understand and appreciate instead of in hand-written emoji poems.
What does this mean for me?
What real benefit can this provide? Are you just promising the benefits of some obscure future tech? How supported is this in the browser? Or around the world? And why did you just describe a grown man apologizing to my Nana while wearing overalls?
Everyone stands to benefit, but there are also a number of unknowns.
How will mobile phones handle persistent TCP connections over mobile networks? Will it become standard, despite being edited out of draft proposals, to use encryption with this technology? How considerable an SEO boost will HTTP/2-wielding websites receive?
More than 70% of the browsers in the world support HTTP/2 as of this writing and more are sure to follow as the spec continues to change the paradigm for web performance.
Chrome and Firefox have already declared they will only implement HTTP/2 over secure TLS connections, and other clients are likely to follow suit, effectively making encryption mandatory. This will help protect against man-in-the-middle attacks, to which 95% of current servers are vulnerable, but it does not eliminate the need for sound security measures like web-application firewalls, due to other potential vulnerabilities in the protocol.
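In practice, whether a server speaks HTTP/2 is settled during the TLS handshake via ALPN: the client advertises "h2" and the server's choice decides. A small sketch using Python's standard library (the example hostname is illustrative, and network access is assumed when the function is actually called):

```python
import socket
import ssl

# Sketch: offer "h2" during the TLS handshake via ALPN and report which
# protocol the server selected. Calling this requires network access.

def negotiated_protocol(host, port=443, timeout=5):
    context = ssl.create_default_context()
    context.set_alpn_protocols(["h2", "http/1.1"])  # advertise HTTP/2 first
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol()  # "h2" if HTTP/2 was chosen

# e.g. negotiated_protocol("www.example.com") — returns "h2" on an
# HTTP/2-enabled host, "http/1.1" otherwise.
```

This is also a handy way to audit your own sites as the rollout progresses.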
Speed, like security, has been a considerable factor in page-rank for some time so in attempting to estimate the overall SEO impact it is perhaps pertinent to observe that IBM has tested the protocol and noted reductions in load time ranging between 50% and 70%.
Over the next few years, as this protocol becomes standard, every area of marketing will see an impact:
- SEO ranking will be impacted by increased speed and potentially the increased personalization afforded by a combination of multiplexing, bidirectional TCP connections, and server push
- Developers can and should begin walking back some of their workaround optimizations in favor of the inherent improvements in HTTP/2’s multiplexed binary request protocol
- Designers may be afforded opportunities to envision and implement seamless, fluidly animated interfaces due in part to the efficiency of asset prefetch
- UX may begin weaving user stories directly into sites and applications as reduced latency and the server's increased ability to push data through open connection streams open new avenues for personalization and content tailoring
- System architects may wish to reconsider network level optimizations intended to offload the duty of performance onto the double-edged sword of caching
- Project timelines may shift forward as many of the heavy-handed methods intended to work around current browser limits are stripped from workflows and codebases along with their associated validation and testing requirements
And Most Importantly
We don’t often consider the thread that hems the gown any more than the legs that hold our banquet table aloft — but in the end, with a mission to move in unison, from disparate parts and places, the web development community strives as artisan engineers to create truly enriching experiences on behalf of our friends and clients.
We therefore get excited about each new convergence of science and art resulting in an improvement to our methods and materials as we work to ensure our clients are able to build lasting and mutually enriching relationships with the customers and communities they serve.
The quality of the infrastructure of the web itself is not something we often consider, but as HTTP undergoes its first major change in twenty years we are prompted not only to consider how a more stable, performant, secure, and flexible web can help us empower our clients — we are incited to raise a glass in celebration.