The Meaning of Interoperability
Interoperability means working together… in more ways than one.
Interoperability is a feature that prevents vendor (and data) lock-in, and facilitates the connection of a diverse group of people, networks, applications, data, and systems. In our products and services, it’s an important and in-demand component, and is even a required feature for a lot of public and private sector software. When we talk about interoperability on the web, we’re generally referring to the technical compatibility of features across different browsers, also known as “browser interop.”
Nicos L. Tsilas identified four methods organizations can use to accomplish technical interoperability.¹ One, they can explicitly design their products or services to be interoperable “out of the box,” limiting or eliminating entirely the need for customization or integration (e.g. Adobe Creative Suite Products). Two, they can work with their partners, customers, competitors, and community members to develop interoperable products or services in the marketplace (e.g. Slack and Salesforce). Three, they can use licensing strategies for patents, copyrights, trademarks, and other IP rights to provide or gain access to technology (e.g. Unreal Engine). Finally, they can develop and implement industry standards and incorporate them into their products or services (e.g. Intel & USB).
An organization might use one or all of these approaches to accomplish technical interoperability depending on its offerings and business model. Accomplishing technical interoperability for the web platform, however, presents an entirely different challenge. Presenting at BlinkOn in 2016, Google Sr. Staff Software Engineer Rick Byers shared his thoughts on what makes this so difficult: “To me, a big part of the problem is the mismatch between how developers think of the web platform (a single product with multiple configurations) and how browser vendors tend to think of it (my browser is the product).”²
The statement highlights two key issues affecting the long-term success of web interoperability: 1) differing perspectives on the web platform “product,” and 2) human interoperability issues stemming from the fact that no individual or organization can do all the interop work themselves. Alex Russell has written extensively about how the “Browser-Developer” dynamic plays out from a technical product development standpoint, and those interested in studying how this ultimately impacts new feature and standards development efforts would do well to follow his blog.
We are lucky to have well-qualified people in our industry with the passion, skills, and experience to tackle technical interoperability issues, but we also need to think of the web platform as a resource everyone is responsible for owning and maintaining. If we’re going to get the rest of the way (and stay there), it’s time to start focusing on the more difficult people interoperability issues.
Achieving web platform interoperability is hard work that involves more than just agreement on the technical details.
People interoperability refers to less tangible, and often very complex issues of organization, structure, governance, semantics, and policy. We need organizational interoperability to ensure effective collaboration across all kinds of institutions that need to share information but have different structures and processes. We need semantic interoperability to ensure that the information we exchange is understandable, and that we can build from a shared language and vocabulary. We need policy interoperability to ensure that what we share is accurate, reliable, and meaningful. Each of these facets is also influenced by cultural, political, economic, and environmental factors, and should be examined through those lenses.
There are a few efforts around the web that I think are addressing interoperability from both sides. The Web Platform Tests project, started in March 2013 by James Graham of Mozilla and Philippe Le Hegaret of the W3C, is perhaps the oldest and most far-reaching of these. WPT has grown from a wide-eyed attempt to create a single comprehensive test suite for the entire web platform into a method-driven, shared, cross-vendor test suite that checks regressions and spec conformance across almost every HTML, CSS, DOM, and browser API. The project was incubated by the W3C but includes contributions from every major browser vendor, as well as organizations like WHATWG, the Node Foundation, and Bocoup.
These efforts have been largely focused on browser interoperability, but the landscape is only going to get more complicated, and we need to be prepared. As new implementers come on board, they will try to match the behavior of existing, popular implementations; wherever two (or more) don’t behave the same way, we have an opportunity to improve interoperability. Tools like WPT and Test262 will help, but we need to make sure that as new features and use cases are created, the corresponding tests are easily upstreamed and consumed by all implementations. Further, the modern web developer community is moving on to other concerns, like performance, security, or even just interoperability problems in their open source software tools and packages.
“Developing for the web is hard. The coordination problem makes this harder. You can roll a boulder uphill and avoid all of the death traps in your path, but when you’re at the top there’s no way to lock in your successes.” — Philip Jägenstedt, Software Engineer, Google²
Progress is hard won, and interoperability is easy ground to lose. The issue isn’t in defining a technical workflow; that’s something we have the know-how to do. The issue is in defining a workflow that tackles interoperability problems from a technical, organizational, semantic, and policy perspective on an ongoing basis. To lock in our wins, we need to invest time and resources in resolving the issues of people interoperability. It’s a difficult task, to be sure, but I have some ideas for where to start. First, we need to get our industry groups and standards bodies talking more regularly about ideas and strategy; a joint interoperability committee with participants from WHATWG, W3C, Ecma, the JS Foundation, the Node Foundation, and others would be a good start. Second, we need to increase our project management investment in open source and standards work. Often, the burden of meeting and governance facilitation falls on the shoulders of those who are also carrying a heavy specification workload, and additional helping hands can make sure non-technical issues aren’t accidentally dropped. Third, we need to work on the structural issues that prevent all parties from being equal participants; this means addressing organizational membership issues as well as issues that limit participation from underrepresented groups. To solve interoperability issues completely, we need to lean less on our terminals and more on each other.
The state of interoperability on the web platform is a reflection of us.
The real meaning of interoperability is that all concerns have been put on the table and we’re resolving them together. There are simply too many stakeholders — and too much at stake — for any one entity to “claim territory” over — or bear the burden of responsibility for — the entire platform. Technical interoperability has been a bridge to better collaboration, but to make the web platform truly interoperable we need to engage everyone in a conversation about how we can better collaborate to support this platform, and keep it open for all. Interoperability on the web requires organizations to talk with each other and work together on joint strategies and ideas. This is the only way we can reinforce areas where we agree, so we stay more aligned when we don’t.
¹ Nicos L. Tsilas, “Open Innovation and Interoperability,” in Opening Standards: The Global Politics of Interoperability (MIT Press, 2011).
² BlinkOn 2016, Day 1, Talk 5: “Web Platform Predictability.”