The Hidden Innovation Technique Behind Silicon Valley’s Creativity
Systematic Inventive Thinking is a design methodology that will help you hack the creative process and innovate like a tech titan.
To the untrained eye, technology entrepreneurs appear to possess what could only be defined as superpowers.
How else could Silicon Valley crank out so many innovative products, so quickly?
In reality, effective product development processes typically follow well-defined, easily replicated methodologies. One of the most popular methods for formulating creative and innovative ideas is known as Systematic Inventive Thinking (SIT).
Developed in the mid-1990s as a descendant of earlier design disciplines (most notably the engineering methodology TRIZ), SIT emerged as a strategy for reverse engineering a number of common ‘templates’ replicated in innovative products. The most popular of these design patterns are Subtraction, Multiplication, Division, Task Unification, and Attribute Dependency.
However, even companies with considerably more secretive product development cultures (including Silicon Valley’s most prolific tech titans) clearly exhibit features of the innovation techniques formalized in SIT.
By deconstructing how the most popular SIT templates have influenced the trajectory of the computing industry, Silicon Valley’s creative genius can be studied, generalized, and replicated.
The Subtraction template of the SIT methodology is perhaps the most easily understood. By systematically eliminating seemingly ‘essential’ elements of a product’s feature set, functionality, or interface, a more streamlined and simplified version of its predecessor may materialize— often revealing unforeseen benefits.
Apple, renowned for a minimalist design philosophy and intuitive user interfaces, could easily be considered the technology sector’s preeminent pioneer of this technique.
It’s easy to forget how dramatically the mobile landscape has changed in just a decade. When the iPhone was first released in 2007, BlackBerry was the dominant player in the smartphone ecosystem. Its devices, so addictive they were nicknamed “CrackBerry,” commanded nearly a third of smartphone sales, and the company was only months away from reaching a peak market capitalization of over $75 billion.
As a result, physical keyboards were considered to be the indisputable industry standard. The iPhone’s trademark full-screen display was a major divergence from the norm, attracting widespread criticism. In a now-infamous interview, Microsoft’s then-CEO, Steve Ballmer, openly mocked Steve Jobs’ creation as a “waste of money,” asserting that “it doesn’t appeal to business customers because it doesn’t have a keyboard.”
As it turns out, the subtraction of a standard keyboard was enough to unleash a mobile application revolution. The enlarged display enabled an entirely new paradigm for interacting with interfaces, allowing the touchscreen to become the new industry standard— even in the enterprise world.
Apple has since applied this subtractive formula to the sweeping majority of its product portfolio.
The removal of the iPhone’s headphone jack, while controversial, is largely credited for the staggering success of the company’s wireless AirPods headphones. The MacBook Pro’s elimination of several industry-standard interfaces— HDMI connectors, SD card slots, and even the near-universal USB port— allowed Apple to create a much thinner and sleeker laptop, while simultaneously encouraging the adoption of their AirPlay and AirDrop technologies.
Apple epitomizes minimalism in computing hardware, but the software industry is overflowing with further examples of feature subtraction.
Google’s uncluttered search page.
Twitter’s condensed character limit.
Instagram’s singular focus on images.
To some, these modifications initially seemed limiting and counterintuitive. As consumers became more familiar with the mechanics of each platform, however, these innovations became shining examples of superior user experience and polished product-market fit.
The lesson here? Ruthlessly trim the fat of your products, processes, and ideas. Throw out any feature that isn’t contributing to core functionality, and experiment freely with artificial constraints. Your audience will adapt— and the results may just surprise you.
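The advice about artificial constraints can be sketched in a few lines of code, loosely in the spirit of Twitter's original character limit. The limit value and the rejection behavior below are illustrative assumptions, not any platform's actual rules:

```python
# A minimal sketch of Subtraction as an artificial constraint. The
# 140-character limit is borrowed from early Twitter purely for flavor.

LIMIT = 140

def constrain(message, limit=LIMIT):
    """Reject messages over the limit rather than quietly truncating them."""
    if len(message) > limit:
        raise ValueError(
            f"{len(message)} characters exceeds the {limit}-character limit"
        )
    return message

print(constrain("Short, focused, and under the limit."))
```

Rejecting oversized input outright, instead of silently trimming it, mirrors the design choice described above: the constraint forces authors to edit, so the limitation shapes the content rather than mutilating it.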
Whereas the Subtraction template focuses on the experimental elimination of features, the Multiplication template explores the impact of increasing the quantity of certain key components.
Occasionally this technique is implemented by creating several near-identical copies of the same element— most notably, Gillette increased the performance of their razors by multiplying the number of blades, kicking off a men’s grooming arms race. In many cases, however, practitioners of this method suggest that altering the form factor of the multiplied copies (e.g., size, shape, orientation) will produce the most intriguing results.
In a bid to prevail in the battle for ‘smart home’ dominance, Amazon seems to have aggressively embraced the Multiplication technique.
The company’s flagship smart speaker, Amazon Echo, first appeared on the scene in 2014.
As the primary interface for interacting with their proprietary voice assistant, Alexa, the device served users by setting alarms, making to-do lists, and playing music, all without touching a button.
Despite the advanced range of functionality Alexa enabled, Amazon’s smart device faced intense competition from other AI-powered virtual agents.
After all, Apple and Google dominated the smartphone market, and each company boasted a perfectly capable voice assistant of their own— embedded directly into a device consumers could easily carry to any location.
Rather than attempting to compete with Apple and Google on the competitive mobile battleground, Amazon opted to multiply the presence of the smart speaker in their customers’ homes. With a few small tweaks to the device’s shape and size, Amazon infiltrated every square inch of consumers’ lives, releasing the miniaturized Echo Dot, the inexpensive Echo Flex, and the car-mounted Echo Auto.
In order to incentivize customer buy-in for their multiplicative approach, Amazon enticed shoppers with extreme discounts, lowering Echo Dot prices to as low as 99 cents.
Today, Amazon controls 70 percent of the smart speaker market.
The Multiplication template extends far beyond the smart speaker market. In artificial intelligence, combining multiple machine learning models into ‘ensemble methods’ can improve predictions and reduce bias. Blockchain technology multiplies the ledger itself, scattering identical copies of a recordset across a distributed network in order to increase transparency and reduce fraud.
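The ensemble idea can be sketched as a handful of near-identical predictors voting on each input. The three rule-based classifiers below are hypothetical stand-ins for trained models, not any production system:

```python
from collections import Counter

# Multiplication in miniature: several copies of a simple predictor,
# each with a slightly different "form factor," vote on every input.

def classifier_a(x):
    return 1 if x > 5 else 0       # flags large values as positive

def classifier_b(x):
    return 1 if x % 2 == 0 else 0  # flags even values as positive

def classifier_c(x):
    return 1 if x > 3 else 0       # flags moderately large values as positive

def ensemble_predict(x, models):
    """Majority vote across the multiplied copies of the predictor."""
    votes = [model(x) for model in models]
    return Counter(votes).most_common(1)[0][0]

models = [classifier_a, classifier_b, classifier_c]
print(ensemble_predict(8, models))  # all three agree → 1
print(ensemble_predict(1, models))  # majority votes 0 → 0
```

Because each copy errs in a slightly different way, the majority vote tends to be more reliable than any single predictor, which is the core intuition behind real ensemble methods such as bagging and random forests.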
Traditionally, the Multiplication approach to innovating required access to an abundance of resources. In an age where software and information can be distributed freely over the Internet, however, Multiplication has become an accessible technique for creators of all kinds. Utilize this technique to capture a large audience, establish network effects, and scale up quickly.
There are several approaches to applying the Division template. You could partition a product physically by its components, functionally as a process, or into smaller versions that each preserve the item’s core characteristics, as Les Paul did with the invention of multitrack recording.
This segmentation mechanism is easiest to visualize when it occurs on a physical object (for example, an external flash drive represents the division of a computer from its storage system).
In the software world, however, the division paradigm is becoming even more commonly applied.
By dividing applications into microservices, an increasingly popular form of modular web architecture, developers are now able to create richly featured, highly interconnected software services— and update those services with greater speed and flexibility. Virtualized computing environments (e.g., containers and virtual machines) represent an applied division between applications, operating systems, and the underlying computing infrastructure, yielding far greater IT efficiency and productivity.
The #hashtag, first popularized by Twitter, serves as an inventive way to divide a social media feed into a collection of distinct topical conversations.
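The mechanism behind this division is simple enough to sketch: one undifferentiated stream of messages is split into per-topic sub-feeds keyed by hashtag. The sample tweets below are invented for illustration:

```python
import re
from collections import defaultdict

# Division in miniature: a single feed is partitioned into topical
# conversations by the hashtags its messages contain.

HASHTAG = re.compile(r"#(\w+)")

def split_by_hashtag(feed):
    """Group messages into per-topic sub-feeds keyed by lowercase hashtag."""
    topics = defaultdict(list)
    for message in feed:
        for tag in HASHTAG.findall(message.lower()):
            topics[tag].append(message)
    return topics

feed = [
    "Evacuations under way #sandiegofire",
    "Launch day for our new app! #startup",
    "Air quality warnings issued #sandiegofire",
]
topics = split_by_hashtag(feed)
print(len(topics["sandiegofire"]))  # → 2
```

A message with several hashtags lands in several sub-feeds at once, which is exactly what lets one post participate in multiple conversations without any formal ‘groups’ feature.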
Near the beginning of the platform’s life, Twitter’s users repeatedly requested the addition of a ‘groups’ feature. Social media was in its infancy, and the most dedicated tweeters worried about flooding their followers’ feeds with an excess of potentially irrelevant topics and commentary.
Twitter’s creators, however, felt that existing paradigms for ‘groups’ on social media were clunky and confusing, and they refused to implement such an ineffective status quo.
Surprisingly, the solution to this problem didn’t come from within Twitter. Chris Messina, an Internet consultant and product designer, first proposed the use of the pound (#) symbol in 2007 in order to help filter his own feed’s conversations.
Messina took major inspiration from the syntax for #channels on Internet Relay Chat (an early chat platform, similar to Slack, used primarily by open-source projects). While the symbol was already popular on IRC, Twitter initially dismissed the idea, remarking that the concept was simply “too nerdy to catch on.”
After Messina convinced a few friends to use the tag in tweets about a San Diego wildfire, however, hashtags quickly went viral. In response, Twitter embraced the concept wholeheartedly, adding an automatic search feature for hashtags in 2009.
Now, hashtags have extended far beyond the Twittersphere, serving as an organizational standard on nearly every modern social media service.
Twitter’s success in popularizing an ingenious new content filtering tool — as well as the growth of subdivided services in software at large — illustrates just how powerful the Division technique can be as an innovation template. Those looking to leverage this approach to creativity should closely examine how their own ideas can be repackaged, restructured, and reorganized.
Task Unification is the art of linking together two or more features, unlocking dual functionality with a single component or task.
Essentially, it’s the product development equivalent of killing two birds with one stone.
This could involve joining together relatively unrelated objects (e.g., wrapping an automobile to create a moving billboard) or aggregating similar items (e.g., combining several common tools to form a Swiss army knife).
One of the most interesting aspects of Task Unification in the tech industry is that the resulting functionality tends to benefit multiple parties.
Even casual users of the internet have undoubtedly encountered the anti-spam tool known as reCAPTCHA. Acquired by Google in 2009, this plugin serves as a way to distinguish human users of the internet from web-crawling robots (CAPTCHA stands for “Completely Automated Public Turing test to tell Computers and Humans Apart”). The system is used on thousands of webpages as a filtering mechanism for preventing automated fraud, unwelcome text-scraping, and a variety of other undesirable browsing behaviors.
As a near-ubiquitous internet gatekeeper, reCAPTCHA generates an enormous volume of human responses in the course of its spam-filtering duties. Recognizing an opportunity to harness that brainpower, the plugin’s creators began embedding small problem-solving tasks into their CAPTCHA tests as a means of generating useful training data.
Initially, these tests came in the form of distorted phrases and short bodies of text, which users transcribed in a text box. On the front end, this simple task helped verify human users in less than 10 seconds — but on the backend, Google used this data in their mission to digitize every book in existence.
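The unification at work here can be sketched as a single function serving two purposes at once: one answer the system already knows verifies the human, while a second, unknown answer becomes a fresh label. The word pair and storage below are invented for illustration, not reCAPTCHA's actual implementation:

```python
# Task Unification in miniature: one user action both verifies humanity
# and labels data. All names and values here are hypothetical.

labeled_words = {}  # the side benefit: a growing corpus of transcriptions

def challenge(known_word, unknown_word, answer_known, answer_unknown):
    """One task, two purposes: gatekeeping plus data labeling."""
    is_human = answer_known == known_word             # purpose 1: verification
    if is_human:
        labeled_words[unknown_word] = answer_unknown  # purpose 2: labeling
    return is_human

print(challenge("orange", "blurry_scan_17", "orange", "victory"))  # → True
print(labeled_words)  # → {'blurry_scan_17': 'victory'}
```

Only answers from verified humans are kept, so the gatekeeping step doubles as a quality filter on the labels it collects.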
To date, millions of books and articles have been collectively transcribed by reCAPTCHA users.
As Natural Language Processing algorithms grew increasingly sophisticated, however, text transcription became less effective at filtering out bots. In response, Google replaced it with another task: identifying everyday objects in a pool of random images. Again, this is a simple undertaking for human users; for Google, it is widely believed to double as training data for the visual recognition algorithms behind self-driving cars.
The pattern of utilizing Task Unification to aggregate resources at scale appears to be an extremely popular and effective strategy.
The online puzzle game Foldit isn’t just fun for players; it also serves as a crowdsourcing tool for researchers studying protein structures, accelerating biological discovery. Cryptocurrency ‘mining’ systems process transactions on a distributed basis while requiring participants to solve energy-intensive mathematical puzzles, a proof-of-work that secures the network.
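The mining pattern can be sketched as a minimal proof-of-work loop, where the same computation that confirms a block of transactions also makes the ledger expensive to tamper with. The difficulty and block contents below are toy values, not any real network's parameters:

```python
import hashlib

# Task Unification in miniature: one brute-force search both orders a
# block of transactions and contributes security to the network.

def mine(block_data, difficulty=2):
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero digits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest  # proof that real work was performed
        nonce += 1

nonce, digest = mine("alice->bob:5")
print(digest.startswith("00"))  # → True
```

Because the winning nonce is hard to find but trivial to verify, anyone can check the proof instantly, while rewriting history would require redoing all of that work.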
The possibilities for applying Task Unification in your own thinking are endless. By carefully considering the intended purpose of each component in a product or step in a process, surprising synergies can be discovered. Be sure to apply systems thinking when looking at the big picture of any ecosystem— underutilized resources are often lurking just beneath the surface.
Simply put, Attribute Dependency is an innovation technique used to form a cause-and-effect relationship between two or more features or processes.
A popular example of this approach to innovation is the transition lens. By incorporating photochromic materials, transition lenses adjust their tint automatically, shifting darker as the surrounding environment grows brighter.
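This cause-and-effect link is easy to sketch: one attribute (tint) is expressed as a pure function of another (ambient brightness). The thresholds below are illustrative assumptions, not optical specifications:

```python
# Attribute Dependency in miniature: lens tint depends on ambient light.
# The lux boundaries are invented for illustration.

def lens_tint(ambient_lux):
    """Map ambient brightness to a tint level from 0.0 (clear) to 1.0 (dark)."""
    DARK, BRIGHT = 1_000, 50_000  # assumed indoor / full-sun boundaries
    if ambient_lux <= DARK:
        return 0.0
    if ambient_lux >= BRIGHT:
        return 1.0
    return (ambient_lux - DARK) / (BRIGHT - DARK)  # linear ramp in between

print(lens_tint(500))     # indoors → 0.0
print(lens_tint(50_000))  # full sun → 1.0
```

Once the dependency is defined, the user never adjusts anything; the feature appears to manage itself, which is exactly the "powered by magic" quality described later in this section.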
Traditionally, Attribute Dependency relationships have been restricted to just a few variables and components. Consumer-facing products, in particular, are bounded by limits on feature complexity: you can only link together so many variables before their relationships become too entangled and convoluted for users to understand.
In the software world, however, the advent of Machine Learning has supercharged the potential of Attribute Dependency, leveraging AI to expand the scope of interdependent feature relationships by several orders of magnitude.
AI-powered recommendation engines (including the content curation algorithms employed by Spotify, YouTube, and Netflix) constantly monitor the preferences of their users, profiling each person and documenting their content choices as a list of unique attributes. As data accumulates, trends and patterns in those preferences are revealed, allowing Netflix to more intelligently recommend new shows and films based on each user’s similarity to other viewers.
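A heavily simplified version of this similarity logic can be sketched with set overlap. The viewer histories, titles, and the Jaccard measure below are illustrative choices, not any service's actual algorithm:

```python
# Attribute Dependency in miniature: recommendations depend on how closely
# one user's watch history overlaps with other users'. All data is invented.

def jaccard(a, b):
    """Overlap between two watch histories, from 0.0 (none) to 1.0 (identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def recommend(user_history, other_users):
    """Suggest titles the most similar viewer watched that we haven't."""
    best = max(other_users.values(), key=lambda h: jaccard(user_history, h))
    return sorted(set(best) - set(user_history))

others = {
    "viewer_1": ["House of Cards (UK)", "Se7en", "The Usual Suspects"],
    "viewer_2": ["Bird Box", "Okja"],
}
print(recommend(["Se7en", "The Usual Suspects"], others))
# → ['House of Cards (UK)']
```

Production systems replace the set overlap with learned embeddings over millions of attributes, but the dependency is the same: what gets recommended is a direct function of what similar users chose.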
This approach has been extraordinarily effective, with 80 percent of Netflix’s content views originating with an algorithmic recommendation.
Not only do these attributes automatically influence which pieces of content Netflix’s algorithm will recommend— the company’s producers also use this map of relationships to determine what kind of content to produce next.
In other words, Netflix’s content creation plans depend directly on the choices their users make, applying Attribute Dependency at even a strategic level.
For example, the studio’s first breakout series, House of Cards, was developed based on an analysis of user attributes and preferences. After poring over millions of data points, Netflix discovered an overlap between three user groups: those who watched the original UK version of House of Cards, those who were fans of movies starring Kevin Spacey, and those who enjoyed the works of American filmmaker David Fincher.
Armed with the knowledge that these three attributes were shared by a large pool of users, Netflix promptly committed $100 million to the creation of an American adaptation of the series, starring Spacey and produced by Fincher.
In legacy Hollywood circles, budgeting such an extraordinary sum of money for a show with no pilot would have been far too large of a risk to stomach— but Netflix felt confident in their Attribute Dependency-powered strategy.
House of Cards went on to become a massive hit.
In one survey, 86 percent of Netflix subscribers said they were less likely to cancel their subscriptions after watching House of Cards.
Attribute Dependency is an incredibly powerful tool for any innovator. Through the development of intuitive, self-adjusting features and processes, users will feel as if their experiences are powered by magic. What’s more, AI and Machine Learning have greatly enhanced the potential for harnessing even deeper insights into user attributes, allowing creators and consumers to form far more intimate relationships than ever before.
The SIT methodology is, without a doubt, one of the most accessible approaches ever devised for sparking creativity and inspiring innovation.
Having been applied by a diverse range of creators, from rockstar musicians to Fortune 500 companies, this technique has consistently provided a flexible platform for generating game-changing ideas and inventions.
Whether you’re looking to brainstorm new product ideas, seeking inspiration for an article, or simply experimenting with creativity in general, SIT’s success in Silicon Valley illustrates just how powerful of a tool it can be for innovators everywhere.
What will you reinvent?
Can you think of any other examples of how SIT has been applied in the world of technology? Let me know in the comments below.