Gameplay for Data Flow

James Urquhart
Published in Digital Anatomy · Mar 24, 2017

Once we’ve done the work to understand our value chain, map it to evolution, and evaluate how context and doctrine bring clarity, it is time to decide what actions we should take (if any). The great thing about having a map is that, in addition to patterns of doctrine and context, there are known patterns of gameplay that we can evaluate.

In one of his longer chapters, Simon outlines where to go from a basic map, so let’s apply that approach to our map below.

I’ve added some arrows to the areas where movement is likely, in part to show where we can attack, and in part to demonstrate the relative value of each option, bringing clarity to why we would choose one over another.

Before we step through each option, however, let’s review a little about what types of gameplay can be found in business competition. Again, Simon gives us some guidance in his post, and provides the following invaluable table:

(The colors were added by Simon, and can be ignored here. I will replace this with a non-highlighted version if I can get one from Simon.)

All Simon is doing in this table is categorizing various forms of context-specific gameplay. Some forms of gameplay are around how one goes to market, for instance, or how one jettisons assets that will drag down a logical strategy. Others are about attacking the competition head on, occasionally in ways that might stretch one’s sense of ethics.

The point is not that all of these options should be pursued, but that these are all options that might be pursued in any given context. Which options to evaluate, and which specific tactical forms they may take, are completely up to those responsible for action.

So, let’s step through each of the four I highlighted above, and discuss some of the gameplay options available.

Streaming Protocols

Right now, the world of data protocols is in a very bubbly state. While the building blocks used to establish protocols are widely accepted at this point (e.g. REST, JSON, YAML), the actual context-specific protocols are still largely custom developed on a case-by-case basis.

There are exceptions, of course. EDI document formats, for example. Or MQTT (and, unfortunately, a host of other protocols) for IoT applications. Even HTTP for communications. However, for any specific need, developers still find themselves building custom protocol elements, even when using these “standards”.

This opens the door for individuals and organizations (non-profit or otherwise) to begin building, promoting, and curating protocols for both general and context-specific applications. Common data sets, such as browser performance data or credit card transaction data, will likely see standard protocols for capturing, formatting, communicating, and acknowledging their relevant data.
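To make this a bit more concrete, here is a minimal sketch (in Python) of what a shared protocol envelope for browser performance data might look like. The schema name, field names, and structure are my own illustration, not an existing standard — a real protocol would be hammered out by the community that adopts it.

```python
import json
import uuid
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class PerfEvent:
    """Hypothetical envelope for a shared browser-performance protocol.

    Every field name here is illustrative; the point is that producers
    and consumers agree on one shape, so neither owns the other.
    """
    source: str      # origin of the measurement, e.g. a page URL
    metric: str      # e.g. "time_to_first_byte"
    value_ms: float  # measurement in milliseconds
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    captured_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    schema: str = "example.perf.v1"  # version tag so the format can evolve

    def to_json(self) -> str:
        """Serialize to the agreed wire format."""
        return json.dumps(asdict(self))

# Any producer emitting this shape interoperates with any consumer that
# understands "example.perf.v1" -- the shared shape *is* the protocol.
print(PerfEvent(source="https://example.com/home",
                metric="time_to_first_byte", value_ms=182.5).to_json())
```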

Some relevant gameplay scenarios:

  1. Buyer/Supplier Power — A company with significant enough domination of its market may be able to pressure its dependents to adopt or even pay for a specific protocol. The question here, however, is the extent to which that play would be vulnerable to someone using open technology as a countermeasure.
  2. Open approaches — A smaller player, or a group of smaller players, can relatively easily overwhelm a single large player by banding together and identifying common open standards to use between them. The key to being open here is not so much the marketing value to those that adopt the standard, but the opportunity for an ecosystem to grow around the standard that greatly outpaces any created for the large player alone. In other words, by accelerating the evolution of the open standard to meet the needs of several consumers, that standard can more quickly find market alignment than a protocol wholly owned by a single player.

Of course, this balance of open to proprietary applies to other “standardization” opportunities on our map, such as real time data and functions.

Real Time Data

One of the most exciting opportunities presented by new data flow architectures is creating standard sources and formats for real time data that has little differentiated value in and of itself. By decoupling the data from the applications that consume that data, data capture and transfer mechanisms can be defined, scaled and evolved as needed without breaking those applications (provided interfaces and formats are well managed and consistent).

A great example of this is browser performance data. Imagine if all standard browser interaction data were sent to a common capture mechanism where anyone who wanted to could subscribe to all or some of the data as it was received. Since the data format doesn’t belong to the consumers of the data, and the capture mechanism will likely be a standard set of mechanisms (e.g. log-based queues and “IoT-like” gateways), there is no unique value in how this data is captured and distributed. The real value comes from what people do with it once it is captured.
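As a sketch of what consuming such a shared stream might look like, here is a hypothetical subscriber built on a log-based queue (Kafka, via the kafka-python client). The topic name, broker address, and event fields are assumptions for illustration; the point is that the consumer depends only on the topic and the agreed format, never on the producers.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# The topic, broker address, and field names below are illustrative.
consumer = KafkaConsumer(
    "browser-perf",
    bootstrap_servers="broker.example.com:9092",
    group_id="latency-dashboard",  # each group reads the full stream independently
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# This code never talks to the producers. Capture and transfer mechanisms
# can be scaled or replaced behind the topic without breaking the consumer.
for message in consumer:
    event = message.value
    if event.get("metric") == "time_to_first_byte":
        print(f"{event['source']}: {event['value_ms']} ms")
```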

So, what other data sets are there that could be made less expensive and more valuable at the same time through economies of scale? Traffic data? Health data (at least at the aggregate level)? Government actions (e.g. a “Federal Register as Data Flow”)?

Here are some plays one might use to accelerate this trend, and perhaps take advantage of it directly:

  1. Market enablement — An existing player in a given data space could easily jumpstart this by creating a foundation or other legal mechanism where they could donate their data capture mechanisms and open them to others (perhaps for a small fee to cover expenses, or by funding via “sponsor” memberships). This would give them the advantage of having the most mature use of the data flow (and perhaps valuable processing, analytics and visualizations) while creating an ecosystem that can point the way to other possible uses.
    Successful uses can then be harvested as needed to extend and expand the donor’s products into new markets and revenue streams.
  2. Coopting and intercession — Successful data streams in areas like EDI, financial trading, and even news reporting already have a number of companies providing capabilities that add value to those streams. Supporting an existing stream and its ecosystem might just make that ecosystem yours, as well. That, in turn, would jumpstart the utility of your own product or service.

The best example of cooption that I know of is actually a case where its *absence* possibly meant a huge missed opportunity. When OpenStack was first getting its legs (back in the “B”, “C”, and “D” releases), there was a huge debate about which API approach should be taken. One side argued that the original Rackspace APIs should be the default focus, in order to “differentiate” OpenStack.

The counter argument, however, was that OpenStack should mimic AWS in any way it can (via common APIs and behaviors), so that OS could coopt Amazon’s already vast ecosystem. This would create a ready-made market for people looking to reuse tools and apps from the Amazon ecosystem in their own data centers.

The “differentiated” API won the day. Why was this potentially a missed opportunity? Because with the AWS APIs, there would be an ecosystem of tools and applications that *consumed* OpenStack. That ecosystem would create demand. With the OS API, this ecosystem never developed on its own. (There is much more to this, but perhaps another time.)

Functions/Code

Just streaming data is almost worthless. For data to be turned into actionable information, it must be analyzed. To be analyzed, it often must be manipulated (e.g. aggregated, combined with other data, reformatted, etc.). To be actionable, the analysis often must be visualized for human consumption, or formatted and transmitted as some sort of event signal for automation.

The advent of function services — namely AWS Lambda, Microsoft Azure Functions and Google Cloud Functions, alongside streaming platforms like Kafka and NATS — means there is finally a mechanism at the right granularity to package and deliver these forms of “glue” code. While we are in the infancy of what function services can do today, and there are huge gaps in the tool chains that will be required to operate functions at scale, there are already subtle signs of a function market forming.
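As a rough illustration of that granularity, here is a sketch of an AWS Lambda handler in Python that reshapes records arriving from a stream trigger. The Kinesis record shape (base64-encoded data under event["Records"]) is the real trigger format; the payload fields and output structure are my own assumptions.

```python
import base64
import json

def handler(event, context):
    """Sketch of Lambda 'glue' code: reshape raw stream records for analysis.

    Assumes a Kinesis trigger, which delivers base64-encoded payloads
    under event["Records"]. The payload fields are illustrative.
    """
    reshaped = []
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # The "manipulation" step: pull out just what downstream analysis needs.
        reshaped.append({
            "source": payload.get("source"),
            "metric": payload.get("metric"),
            "value_ms": payload.get("value_ms"),
        })
    # From here the result could feed a dashboard, or be re-emitted as an
    # event signal for automation.
    return {"count": len(reshaped), "events": reshaped}
```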

If this is so, what are some of the game play elements that might be interesting for functions and code?

  1. Land grab/first mover — It is possible to attack this space by attempting to build the definitive market of function components before anyone else can do so. To be honest, I only really think this play is available to Amazon, Microsoft and Google for general data processing, but it is possible that an independent player could prove me wrong when it comes to a specific industry vertical or data science category. The key here would be to gain network effects around both consumption and creation of functions faster than anyone else in the market. A huge challenge, but potentially profitable if one pulls it off.
  2. Standards game — The alternative might be to work hard with others to define a declared or de facto standard set of functions for general or context-specific use. Think “Linux command-line tools for function services” (see the sketch after this list). In Linux, one can do a tremendous amount of valuable work using the standard tools in the operating system arsenal. Imagine a similar set of adopted tools for data transformation, aggregation, and the like in the major function services.
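To give a flavor of what such a standard toolset could feel like, here is a small Python sketch of composable, command-line-style primitives for event streams. The names and behaviors are invented to echo grep, cut, and uniq -c; an actual standard set would be defined by whoever bands together to declare it.

```python
from typing import Callable, Dict, Iterable, List

def grep(predicate: Callable[[dict], bool], events: Iterable[dict]) -> Iterable[dict]:
    """Keep only events matching a predicate (cf. grep)."""
    return (e for e in events if predicate(e))

def cut(fields: List[str], events: Iterable[dict]) -> Iterable[dict]:
    """Project each event down to the named fields (cf. cut)."""
    return ({k: e.get(k) for k in fields} for e in events)

def tally(key: str, events: Iterable[dict]) -> Dict:
    """Count events grouped by a field (cf. sort | uniq -c)."""
    counts: Dict = {}
    for e in events:
        counts[e.get(key)] = counts.get(e.get(key), 0) + 1
    return counts

# Compose the primitives like a shell pipeline. The sample events are invented.
events = [
    {"source": "/home", "metric": "ttfb", "value_ms": 120},
    {"source": "/cart", "metric": "ttfb", "value_ms": 340},
    {"source": "/home", "metric": "fcp", "value_ms": 900},
]
slow = grep(lambda e: e["value_ms"] > 100, events)
print(tally("source", cut(["source"], slow)))  # {'/home': 2, '/cart': 1}
```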

I think there will be some financial winners in this space, but it is hard for me to predict exactly who and how. That is to be expected, of course (mapping only shows the where and why, but not the who and how). Somewhere right now is a true pioneer-type engineer cooking something up that will stun and amaze the technology world.

New Value Opportunities

The really exciting possibility with respect to data flow, however, is what can be created that was impossible or prohibitively expensive to create in the past. New ways of communicating, finding value, balancing markets, optimizing activities, and so on, are now awaiting discovery. The scale available out of the box is insane, and the combination with other technologies and sciences will likely open new doors — and challenges — for human advancement.

I’m not going to list individual gameplay options here. I’ll leave that up to you, the reader, to consider for the problems and potential solutions you are most interested in. However, I will say that the players who can “see the board” here, for their respective user needs, will likely outplay and outcompete those that can’t.

Map your areas of interest. Present those maps to friends and colleagues and debate them. Iterate through purpose, landscape, climate, and doctrine, and learn from each pass. Run various gameplay scenarios against your map, and against your analysis of your markets and competition.

I will certainly be doing so.

As always, feel free to provide feedback or ask questions in the response section below, or on Twitter where I am @jamesurquhart.
