The Case for Custom Elements: Part 2
In this post I will present some of the features that make Custom Elements compelling if you’re considering building your own component library. I’ll also touch on ideas that are often overlooked when discussing Custom Elements, such as their ability to work with Virtual DOM and be rendered on the server. Check out Part 1 if you haven’t had a chance to read it yet.
Features
So, what are the cool features that make Custom Elements better than hand rolling your own JavaScript widgets or using good old jQuery plugins? I’m going to list out my favorite features, starting from the practical and moving towards the more experimental/exotic/interesting. If the first few are old news for you, keep reading, because there’s a bunch of newer things you might not have come across before.
Lifecycle callbacks
Every Custom Element has a set of built-in lifecycle callbacks, or “reactions”, that are automatically triggered as it is parsed, inserted into, or removed from the document.
class AppDrawer extends HTMLElement {
  constructor() {
    // Called when an instance of the element is created or upgraded
    super(); // always call super() first in the constructor
  }

  connectedCallback() {
    // Called every time the element is inserted into the DOM
  }

  disconnectedCallback() {
    // Called every time the element is removed from the DOM
  }

  attributeChangedCallback(attrName, oldVal, newVal) {
    // Called when an observed attribute is added, removed, or updated
  }

  adoptedCallback() {
    // Called if the element has been moved into a new document
  }
}
Reactions give you a known system for constructing and destructing your elements. This means you don’t have to invent your own ad hoc system for booting up or destroying an element. While many frameworks/libraries also support lifecycle callbacks as a feature, the primary difference here is that these are native hooks triggered by the browser so you don’t have to ship any framework code to make the lifecycle system work. It’s a freebie.
Of particular importance is the attributeChangedCallback which will execute anytime an observed attribute is updated on your element. For example, if you have the element
<x-foo doge="frenchie">
and someone changes the value of doge to “corgi”, the callback will run and tell you the new value. This means you don’t need to write your own Mutation Observers just to watch for these changes. Really handy!
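That pattern can be sketched like so. Note this is an illustrative class, not code from the spec, and the `typeof` guard only exists so the snippet also runs outside a browser:

```javascript
// Illustrative sketch: observe the `doge` attribute and record changes.
// The fallback base class only exists so this runs outside a browser.
const BaseElement = typeof HTMLElement !== 'undefined' ? HTMLElement : class {};

class XFoo extends BaseElement {
  // Only attributes listed here trigger attributeChangedCallback
  static get observedAttributes() {
    return ['doge'];
  }

  attributeChangedCallback(name, oldVal, newVal) {
    // The browser calls this for us; no Mutation Observer required
    this.lastChange = { name, oldVal, newVal };
  }
}
```

In a browser you would register it with `customElements.define('x-foo', XFoo)`, and any later `setAttribute('doge', …)` call would fire the callback automatically.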
Automatic upgrade
You may be familiar with jQuery code that looks like this:
$('.carousel').carousel({ /* carousel options */ })
Or the vanilla JavaScript equivalent:
Array.from(document.querySelectorAll('.carousel'))
.forEach((element) => {
new Carousel(element);
});
This is the typical boilerplate one needs to write in order to select and bootstrap a component. On more than one occasion I’ve opened a client’s main.js file only to find that it’s one giant onDocumentReady callback full of code like this. Gross!
Custom Elements solve this issue through a process known as “upgrade”. As the HTML parser works its way through the page, any Custom Element it encounters is automatically upgraded and its lifecycle callbacks are triggered. This works whether the element is already in the DOM or you insert it via JavaScript.
const xfoo = document.createElement('x-foo');
xfoo.setAttribute('doge', 'puggle');

// x-foo will be automatically upgraded
document.body.appendChild(xfoo);
Again, you’re cutting down on boilerplate and any confusion around when to bootstrap your elements. If you have one of those main.js files that’s really one big callback to set up your components, imagine deleting the entire thing. Sounds pretty nice, right?
Easy to style
Many developers who are familiar with Web Components or Custom Elements often think that in order to style an element you must be using Shadow DOM. Shadow DOM creates a protected style scope around your element, and while it is very cool, I want to stress that it is not a requirement for using Custom Elements. You can just as easily provide a stylesheet with your element(s), like many UI libraries do today. If you prefer to style everything with a system like BEM, or SUIT CSS, or some variation in between, that’s totally fine!
I bring this up because on many occasions I’ve spoken to developers who are leery of the amount of polyfills required to support all the Web Components standards, in particular Shadow DOM, so they avoid the related specs like Custom Elements. Don’t let the absence of Shadow DOM prevent you from using Custom Elements. You can absolutely use them today and continue to write CSS in a way that makes you feel comfortable.
Work well with ES Modules
The Polymer library has popularized the usage of HTML Imports to load Custom Element definitions. But just as we saw with Shadow DOM, this is not a requirement for working with Custom Elements. If you prefer to work with ES Modules, go for it!
Here’s an example element which can be imported into an app using ES modules:
export default class FooWidget extends HTMLElement {
  constructor() {
    super();
    this.message = 'Hello World!';
  }

  connectedCallback() {
    this.innerHTML = `<div>${this.message}</div>`;
  }
}

// Check that the element hasn't already been registered
if (!window.customElements.get('foo-widget')) {
  window.customElements.define('foo-widget', FooWidget);
}
Importing the FooWidget class will register the element definition with the document. Now you can use <foo-widget> anywhere in your app!
Real instances
jQuery plugins and framework components wrap native DOM nodes to give them additional functionality. As a consumer of these components, you’re instructed not to work with the DOM directly but to instead interact with these wrappers using the library/framework specific APIs. This is why so many libraries and frameworks are not interoperable! If each library creates its own wrappers, with their own APIs, then there’s no way other libraries will be able to consume those components.
The only thing that is universal is the DOM, so it makes sense to build your elements using the DOM if a primary goal is to be able to support as many teams on as many stacks as possible.
When you define a Custom Element you’re really creating a vanilla JavaScript class that inherits from HTMLElement. Every time you use your new tag in markup, it creates an instance of this class. This means you can define your own properties, methods, getters/setters, and even reflect state changes back to your element’s attributes.
class MyElement extends HTMLElement {
  // Tells the element which attributes to observe for changes
  // This is a feature added by Custom Elements
  static get observedAttributes() {
    return ['foo'];
  }

  // Get the initial value of foo or use a fallback
  connectedCallback() {
    this.foo = this.getAttribute('foo') || 'Oh, hai!';
  }

  // When an attribute is updated, check if the value is different
  // If so, call our setter
  attributeChangedCallback(name, oldVal, newVal) {
    if (this[name] !== newVal) {
      this[name] = newVal;
    }
  }

  get foo() {
    return this._foo;
  }

  // Set the value for foo, reflect to attribute, and re-render
  set foo(value) {
    this._foo = value;
    this.setAttribute('foo', value);
    this.render();
  }

  render() {
    this.innerHTML = `<div>${this.foo}</div>`;
  }
}

if (!window.customElements.get('my-element')) {
  window.customElements.define('my-element', MyElement);
}
In the above example the element will upgrade and immediately check if it has its foo="" attribute set. If not, it will use a fallback of “Oh, hai!”. Adding the foo attribute to the observedAttributes array means it will trigger the attributeChangedCallback if it is ever modified in our document.
This means you can do something like this:
var el = document.querySelector('my-element');
el.foo = 'Custom Elements are awesome!';
el.getAttribute('foo'); // value is 'Custom Elements are awesome!'
Not only is the attribute synchronized with the property, but we can also instruct the element to render anytime the setter changes its state. Looking at that render() function, you might get the feeling that this kind of reminds you of React a bit, so let’s talk about that!
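One wrinkle worth noting with getter/setter pairs: if a framework assigns a property to your element before its definition has loaded, that value lands as a plain instance field and shadows your accessor. A common defensive pattern (sometimes called “lazy properties”) is to replay any pre-upgrade value through the setter once the element connects. This is a sketch of the idea, not code from the spec, and the fallback base class only exists so it runs outside a browser:

```javascript
const BaseElement = typeof HTMLElement !== 'undefined' ? HTMLElement : class {};

class LazyElement extends BaseElement {
  connectedCallback() {
    // If `foo` was set before the element upgraded, replay it
    // through the accessor defined on the prototype below
    this._upgradeProperty('foo');
  }

  _upgradeProperty(prop) {
    if (Object.prototype.hasOwnProperty.call(this, prop)) {
      const value = this[prop];
      delete this[prop];  // remove the instance field shadowing the accessor
      this[prop] = value; // this assignment now hits the setter
    }
  }

  get foo() {
    return this._foo;
  }

  set foo(value) {
    this._foo = value;
  }
}
```

Without this step, a value assigned before upgrade would silently bypass your setter (and any rendering or attribute reflection it does).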
React/Virtual DOM
A lot of developers see React and Custom Elements/Web Components as being mutually exclusive, but they don’t have to be. The React documentation explains how you can use Custom Elements inside of React, or how you can put React inside of Custom Elements. It’s like inception!
One of the key ideas React has popularized is the notion that UI should be a function of state. In other words, pass in new state from the outside and have your elements re-render. A key component of this model is React’s use of Virtual DOM to compare previous and new state and only update the bits of the DOM that have actually changed.
But there’s no reason why you can’t use this same approach in your own Custom Elements using off the shelf Virtual DOM libraries. Here’s an example which uses Tim Branyen’s excellent diffhtml library. Note that diffhtml uses a tagged template string to let you write code that looks very similar to JSX.
class CurrentTime extends HTMLElement {
  constructor() {
    super();
    // Bind render to this instance so we
    // can call it from within our template
    this.render = this.render.bind(this);
    this.render();
  }

  render() {
    // Note the diff.html tagged template string
    diff.innerHTML(this, diff.html`
      <button onclick="${this.render}">
        Show current unix time
      </button>
      <span>${Date.now()}</span>
    `);
  }
}
In the above example, even though I’m calling render() each time the button is pressed, the only DOM that is actually updated is the <span> containing the new time string. Nice!
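To make that idea concrete without pulling in a library, here is a toy (and deliberately simplified) version of the check a Virtual DOM patcher performs: only touch a node when the incoming value actually differs. This is an illustration of the principle, not diffhtml’s implementation:

```javascript
// Toy diffing step: write to the node only when the text actually changed
function patchText(node, newText) {
  if (node.textContent === newText) {
    return false; // nothing changed, leave the DOM alone
  }
  node.textContent = newText;
  return true; // this node was updated
}
```

A real library applies this comparison recursively across a whole tree of elements, attributes, and text nodes, which is how only the `<span>` above gets touched on each render.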
One of the most polished examples of this approach that I’ve seen is the Skate library, which uses Incremental DOM to implement a props-down, events-up approach to Custom Elements. As described by one of the authors:
Since Skate encapsulates the virtual DOM approach and uses DOM events for reactivity, it works everywhere (and with multiple versions of itself on the page). You get the nice FRP model, but it doesn’t leak outside of the component. To everyone consuming your component, it’s just a normal DOM element.
Recently the creator of Preact has also demonstrated support for Web Components in his library. And as I mentioned before, at roughly 40kb gzipped, React itself is small enough that you could put it inside your Custom Elements and treat it as a common dependency, which is what the Standalone library by Adam Timberlake does.
So if you’re trying to build a UI library to scale across many different teams you can absolutely use the great ideas and patterns of the React ecosystem but bundle them into Custom Elements to serve as an interoperability layer. These elements can then be used in React, Angular 2, [insert cool framework], or stand alone.
I’m only briefly touching on this topic, but because there has been so much discussion on Twitter about top-down data flow in Custom Elements, I’d like to do a follow-up post exploring it in greater detail. In the meantime, Andre Staltz has done a great write-up on the idea as it specifically relates to React. Definitely check out his post (after you finish reading mine :P)
Server-side rendering
If we’re talking about React we should also touch on the idea of Universal or Isomorphic apps. In a nutshell, a Universal app is one that is able to run its JavaScript framework on the server, bootstrap its components, and send down an initial render of the page which is then “hydrated” with additional client-side JS.
As we saw with Virtual DOM, it’s entirely possible to use this same approach with Custom Elements. The best example I’ve seen so far is a library by Tim Perry called Server Components which uses the domino library to simulate the DOM server-side and run lifecycle callbacks for any Custom Elements it finds.
Here’s an example from the Server Components project. Note that the project still relies on the old v0 version of the Custom Elements spec, so it uses createdCallback instead of connectedCallback and registerElement instead of customElements.define.
const components = require("server-components");

// Get the prototype for a new element
const NewElement = components.newElement();

// Stamp out the element's template
// Note this is the old v0 syntax for creating a custom element
NewElement.createdCallback = function () {
  this.innerHTML = "<p>Hi there</p>";
};

// Register the element
components.registerElement("my-new-element", {
  prototype: NewElement
});
And to render a page:
const components = require("server-components");

// Render the HTML, and receive a promise for the resulting
// HTML string.
components.renderPage(`
  <html>
  <head></head>
  <body>
    <my-new-element></my-new-element>
  </body>
  </html>
`).then(function (output) {
  // Output equals:
  // <html>
  // <head></head>
  // <body>
  //   <my-new-element><p>Hi there</p></my-new-element>
  // </body>
  // </html>
});
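For comparison, the v0 createdCallback element above maps roughly onto the v1 syntax used elsewhere in this post (v0’s createdCallback fires at creation rather than insertion, so the mapping is approximate; the fallback base class only exists so the sketch runs outside a browser):

```javascript
const BaseElement = typeof HTMLElement !== 'undefined' ? HTMLElement : class {};

class MyNewElement extends BaseElement {
  connectedCallback() {
    // v1 equivalent of the v0 createdCallback template stamping
    this.innerHTML = '<p>Hi there</p>';
  }
}

// In a browser you'd register it with:
// customElements.define('my-new-element', MyNewElement);
```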
One of the most interesting aspects of this approach is the possibility of sending down less client-side JavaScript if you can get any expensive bootstrapping work done on the server. I think this is a greenfield area ripe for developers to start exploring.
Progressive enhancement
Up to this point I’ve presented a fair bit of JavaScript, but the obvious question often arises: What if my JavaScript fails to load? Will my users be left with a blank page if I’m using Custom Elements?
Jeremy Keith and Adam Onishi have both recently written blog posts questioning if we’re forgetting about progressive enhancement in the rush to adopt this new shiny feature. These are valid concerns, and it’s important that, as we build new elements in the browser, we not only consider the need for progressive enhancement but accessibility as well. An experience that works for as many users as possible should always be our primary goal.
To this end, I think there are a few strategies we can employ.
The most straightforward (and the most controversial) is leveraging the ability of Custom Elements to inherit from existing native tags, sometimes referred to as “customized built-ins” in the v1 spec or “type extensions” in the old v0 spec.
class FancyButton extends HTMLButtonElement {
  // do setup work
}

customElements.define('fancy-button', FancyButton, {
  extends: 'button'
});

<!-- Elsewhere in the page… -->
<!-- If JS fails to load we'll still get a regular button here -->
<button is="fancy-button" disabled>Fancy button!</button>
This example would still give us our Custom Element lifecycle callbacks, while inheriting the semantics and keyboard support built-in to <button>. I mention this is controversial because, while it is part of the Custom Elements v1 spec, and Chrome and Opera plan to ship support for it, Safari has said they will not implement it.
An alternative approach to customized built-ins is to instead wrap native elements in Custom Elements.
<fancy-input-decorator>
<input type="text">
</fancy-input-decorator>
This is the approach used in the Polymer Shop app. If the JavaScript fails to load the user should still get input fields they can interact with.
But as Jeremy points out in his blog post, because the entire Shop app is contained in a single <shop-app> element, if that element is unable to bootstrap, then the user will never see those native inputs. Therefore, it’s important to be mindful of how your entire app is structured so you don’t accidentally undermine your own progressive enhancement work.
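One way to reduce that risk is to gate your component bundle on feature support, so browsers without Custom Elements never swap out the working native markup. This sketch takes the window and document as parameters purely so it is easy to exercise outside a browser, and the bundle path is illustrative:

```javascript
// Load the component bundle only when Custom Elements are supported.
// `win` and `doc` would be window and document in a real page.
function loadComponentsIfSupported(win, doc) {
  if (!('customElements' in win)) {
    return false; // unsupported: users keep the plain, working markup
  }
  const script = doc.createElement('script');
  script.src = '/components-bundle.js'; // illustrative path
  doc.head.appendChild(script);
  return true;
}
```

In a real page you would call `loadComponentsIfSupported(window, document)` early in your bootstrap code.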
I put this section after the one on Universal apps specifically because I think this is an area where server-side rendering could play an important role. Looking at frameworks like Angular 2, React, etc., it seems the direction web app development is headed is one where everything is a component, including the <main-app> itself. The progressive enhancement problem is therefore not unique to Custom Elements; it applies to any approach that bundles functionality into higher-order components (eventually reaching a top-level app component). Those frameworks can leverage server-side rendering to ensure that users at least have a workable page even if the client-side JavaScript fails, because their app component is turned into an HTML string on the server. Those of us building Custom Elements should explore whether we can take advantage of these same techniques, possibly using a library like Server Components or by building similar utilities.
TypeScript
Trying to document or export an interface for all of the API surface of each element in your library can prove to be a lot of work. While it’s possible to do this work using code comments and libraries like JSDoc, many large teams have turned to writing components in TypeScript because it has built-in support for interfaces, type checking, and provides IDEs with autocomplete support.
Since Custom Elements are just JavaScript under the hood, it’s entirely possible to write them using TypeScript. Here’s an example put together by fellow Googler Rob Wormald demonstrating just how easy it is.
Though I haven’t worked much with TypeScript, I’m really excited to see where developers take this. Not only would it be super cool to have autocompletion in my IDE when I’m working with a Custom Element, but there are potentially much larger ramifications as well. If, for instance, each element in your UI library exports a TypeScript interface, then they could be consumed by a drag and drop GUI (similar to SquareSpace or Wix) where each interface is paired with corresponding form controls (think Interface Builder, but for the web)!
Wrapping up
I’ve jumped around a lot in this post and, to an extent, that was intentional. Custom Elements are such a flexible primitive that once you start to see all of the possibilities, they become really compelling!
I’m excited to see what the next few years will hold as more teams begin rolling out their own element suites and we move to an era of high quality, interoperable, UI components.
Till next time!