Define Graceful Degradation & Progressive Enhancement:
It’s not rocket science, it’s barely computer science. This article aims to break down the definitions of common web development methodologies with as little sugar-coating as possible.
First let’s make sure we’re on the same page:
Progressive enhancement & graceful degradation are methodologies, not frameworks; they are simple, unopinionated concepts without instructions. Their underlying definitions are roughly consistent across a number of industries and paradigms outside of web development.
Great, so we want to do two things: decipher common methodologies purely for what they are, and raise awareness of some common misinterpretations. (See a mistake? Have an opinion? Leave a comment.)
In the context of the web, the progressive enhancement methodology is a guiding principle with the aim of making web pages accessible to a variety of browsers, by developing from a baseline of compatible features, and then layering enhanced features throughout the progression to more capable browsers.
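As a minimal sketch of that layering (the capability map and layer names here are my own, illustrative choices, not a real API):

```javascript
// Progressive enhancement: start from a baseline every browser gets,
// then layer extras on only when the browser proves it supports them.
// (Hypothetical layer names; a real app would test real browser APIs.)
function planEnhancements(capabilities) {
  const layers = ["baseline-html"]; // compatible everywhere, no JS required
  if (capabilities.fetch) layers.push("async-submit"); // nicer form posts
  if (capabilities.serviceWorker) layers.push("offline-cache"); // nicer still
  return layers;
}

// A legacy browser only gets the baseline; a capable one gets every layer.
console.log(planEnhancements({}));
console.log(planEnhancements({ fetch: true, serviceWorker: true }));
```

The point is the direction of travel: the baseline is never conditional, and each enhancement is additive.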
Progressive enhancement is self-explanatory, which is exactly what makes it ambiguous. The terminology was used in the science and medical fields prior to web development, though it may have derived from progressive refinement, a methodology associated with 3D rendering.
Many explanations & opinions circulate regarding progressive enhancement in the context of the web. It is often made out to be convoluted, though it is not a framework but a hypothesis. Seeking structure within such ambiguity is like seeking strict guidelines for, say, “healthy eating”.
In the context of the web, the graceful degradation methodology is a guiding principle with the aim of making web pages accessible to a variety of browsers, by developing from a baseline of full features, and then removing layers of features throughout the regression to less capable browsers.
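The same idea run in reverse can be sketched like this (again with hypothetical feature names of my own):

```javascript
// Graceful degradation: start from the full feature set, then strip
// layers away as the browser fails capability checks.
function degrade(capabilities) {
  let features = ["webgl-banner", "css-grid-layout", "static-fallback"];
  if (!capabilities.webgl) {
    features = features.filter((f) => f !== "webgl-banner");
  }
  if (!capabilities.cssGrid) {
    features = features.filter((f) => f !== "css-grid-layout");
  }
  // "static-fallback" survives even in the least capable browser.
  return features;
}
```

Note the mirrored shape: here the full build is the starting point and the checks subtract, whereas under progressive enhancement the baseline is the starting point and the checks add.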
Progressive enhancement and graceful degradation tackle the exact same problems with the exact same goals in mind. Graceful degradation is the reverse approach to progressive enhancement; although both contain ambiguity, they are self-defining principles in the context of the web.
There may be some minor confusion, as graceful degradation has several meanings within common computing paradigms. The most notable concerns fault tolerance, where a system or network is able to operate at reduced performance in the event of failing dependencies. On the web, graceful degradation is associated with the compatibility of features, but on a mainframe or network it is linked to the health of a single system (and similarly within electrical engineering).
Disclaimer: I'm calling this “departure”; to the best of my knowledge there is no widespread terminology for browser exclusion. (Rename it, call it silly, I don’t care, but please don’t ever mix this up with graceful degradation.)
“Departure” refers to withdrawing a web development project’s support from a browser brand or browser version. It is absolute exclusion.
Many designers, developers & architects incorrectly refer to “graceful degradation” as exclusion. It is NOT!
You cannot degrade support for something that has been excluded. Sometimes the intent of departure is to reduce technical debt, to become more agile, and/or to become a responsible player in the future of the web (e.g., Google & MS are quite wise with this). It is usually undesirable, though, if a significant margin of users will be negatively affected.
Departure should require thoughtful decision-making, but in theory it is simply shortlisting browsers and setting a threshold for web support.
In the context of the web, the regressive enhancement methodology is a guiding principle with the aim of making web pages accessible to a variety of browsers, by developing from a baseline of full features, and then replicating those full features throughout the regression to less capable browsers.
Regressive enhancement is common, more so than progressive enhancement. The two are usually used in conjunction, even when developers are unaware of it. jQuery is a fine example of RE in production.
It is crucial to understand how regressive enhancement is used culturally in modern web development. RE is commonly (even religiously) used as an abstraction layer on top of progressive enhancement. Libraries such as AngularJS, jQuery and lodash regressively enhance legacy browsers with features (polyfills) they are natively incapable of providing.
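The classic instance of this pattern is a polyfill, sketched here in miniature (simplified on purpose; real `Array.prototype.includes` polyfills also handle the `fromIndex` argument and `NaN` comparisons):

```javascript
// Regressive enhancement: give an older environment a feature it lacks
// natively, so code written against the modern API runs unchanged.
if (!Array.prototype.includes) {
  // Simplified stand-in; real polyfills cover fromIndex and NaN handling.
  Array.prototype.includes = function (value) {
    return this.indexOf(value) !== -1;
  };
}

// Modern-looking code now works whether or not the browser shipped includes().
const supported = ["IE11", "Chrome", "Firefox"].includes("Chrome");
```

The legacy browser is enhanced up to the modern baseline, which is exactly the regression-then-replication the definition above describes.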
Every modern web developer is using a combination of progressive enhancement, graceful degradation, regressive enhancement and departure, whether they know it or not.
There is no “one size fits all” solution; in modular web development, usually all of them will be utilized, often without anyone being aware of it.
Progressive enhancement was at the heart of web development for several years before it became a “viral” buzzword, even though many may not have been aware of it.
…don’t you mean, Regressive Enhancement?
The majority of mainstream and business-oriented web projects at some point start with the question, “Who is my audience?”. From a business POV, the focus is usually on inclusion. As a fundamental guiding principle, this is usually PE or GD, but rarely RE. Developers rarely use RE as a “core principle” (Is it necessary to use a young framework and write custom polyfills for legacy browsers? Could you “effectively” make a WebGL banner run in IE8? Is it feasible to get SVG interfaces working for Android 2.3 users?).
As you can see, regressive enhancement (as a core principle) can become expensive. The obvious solution from a technical standpoint is to use technologies that mostly, if not totally, work for the anticipated user base. In a nutshell (usually 95% of the time, in my opinion), that is executed via progressive enhancement layered with regressive enhancement, mostly out of convention and convenience.
Using the Earth core cross-section figure above conceptually, we are using a combination of methods as abstraction layers. jQuery may be your RE, Polymer may be a benefit of departing from older browsers, and CSS3 falling back to CSS 2.1 may be your GD.
So GD and PE result in the same thing right?
Although in theory PE and GD are different workflows that should share approximately the same outcomes, that just isn’t possible within web development. We are not managing failure; even if you prefer to call it “failure of support”, it is not failure of one system, one building or a series of lab tests. We are trying thousands of APIs across hundreds of browser versions on hundreds of commonly used devices. The points below should fill in the gaps…
- Much of the software and tooling we depend on for development revolves around regressive enhancement. This does not sit well with GD, but goes hand in hand with PE.
- RE makes up for the lack of regressive prioritization when PE is the core principle, meaning RE favours the PE workflow regardless of whether a development team is “doing progressive enhancement right” or not. You could argue the same for GD, but since both GD and RE are regressive approaches, in good practice they will at times create a conflict of interest. (I hope this makes sense; it should be simple.)
But regressive enhancement isn’t a problem; it’s a great solution that has provided consistency throughout the web (praise to John Resig).
In theory, the most desirable approach should be GD, because it is not hard-wired into our RE-API culture: RE can remain an optional abstraction layer for GD, whereas it is mandatory for PE.
Progressive enhancement has been necessary and effective for several years, but as we move toward evergreen browsers and legacy browsers slowly die out, the opportunity to depart from PE “as a core principle” should be jumped at immediately, IMHO, because we want to shape web development to evade legacy debt (it is an expense).
I think it’s wise to impartially observe the “change in climate” across the web development & browser landscape. That is, if you’re not one of those nuts who denies “climate” change ;-)