The Innovator’s Dilemma: Part I

It is easy to forget that thousands of successful companies exist without a rock star CEO. In this age of celebrity, we are accustomed to reading about how Steve Jobs made Apple great through his driven personality. Elon Musk makes news with his remarkable vision for the future. But re-reading The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen (Harvard Business School Press, 1997), I was reminded that most great companies succeed on the strength of well-honed processes. As the title of this 20-year-old book suggests, those processes do fail when confronted with “disruptive innovations”. The book details how this happens by reviewing what excellent companies do to dominate their industries, then demonstrating how that very strength becomes a weakness in the face of disruptive technology changes.

Mr. Christensen uses the computer hard drive industry to outline his theory. That market saw at least five disruptive transitions between 1975 and 1995. This rapid turnover makes the industry the business equivalent of fruit flies: it spawned multiple generations of companies in a short period of time. He shows how the physical size of hard drives represented a disruptive change. As drives moved from 14 inches in diameter to 8 inches, incumbent market players were unable to maintain their dominance. The pattern repeated itself as hard drives shrank to 5.25 inches, then 3.5 inches, and finally 2.5 inches in diameter.

Dominant companies develop processes that quickly identify the most desirable features or characteristics of their product offering based on customer feedback. The disk drive industry, for example, offered products that competed on features like:

  • Storage density: how much information can be written to the disk per square inch
  • Read/write speed: how quickly that information can be stored on and retrieved from the disk
  • Physical size: how much space the drive takes up

The hard drive industry was born in the mid-1950s. The market for computing products consisted of large banks and insurance companies that had to keep records and perform large but routine computations. IBM, Control Data, Burroughs, and other large firms met this demand with mainframe computers that occupied entire rooms of their clients’ offices. The physical size of the hard drive was the least important parameter. The first hard drives developed by IBM, for example, stored 5 MB of information on fifty 24-inch disks, and each unit was the size of a refrigerator. The successful companies in this space improved on these specifications by packing more data into each square inch of disk surface and delivering faster data access. Size reduction may have resulted, but it was not the primary factor driving enhancements. It is also important to note that customer demand drove continued, incremental improvements in these attributes year over year.
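
As a back-of-the-envelope check (my own arithmetic, not the book’s, and it assumes both sides of every platter were fully usable recording surface), those figures imply an areal density of only about a hundred bytes per square inch:

    import math

    # Rough areal density implied by IBM's first hard drive: 5 MB spread
    # across fifty 24-inch platters. Assumption (mine, not the book's):
    # both sides of each platter count as usable recording area.
    capacity_bytes = 5 * 10**6              # 5 MB total capacity
    platters = 50
    diameter_in = 24.0

    area_per_side = math.pi * (diameter_in / 2) ** 2   # sq in per platter side
    total_area = platters * 2 * area_per_side          # both sides of every platter

    print(f"usable area:     {total_area:,.0f} sq in")                    # ~45,239
    print(f"implied density: {capacity_bytes / total_area:.0f} B/sq in")  # ~111

Against numbers like that, it is easy to see why packing more data into the same surface, rather than shrinking the surface, was where improvement efforts concentrated.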

As the market grew, these incumbent players became “captured” by their customers. They developed processes for evaluating new technologies and architectural changes that focused on the features customers demanded. If an innovation came along that improved performance along that feature set, managers at successful companies were able to identify and act on it, and upper management provided full funding and resource support for the effort. Christensen defines these types of innovations as “sustaining” innovations. In other words, the new technology pushed product development in a direction the customer wanted to go.

Technological developments in data storage density allow the author to illustrate his claim that sustaining technologies, however radical, do not cause businesses to fail. In the process, he counters the common argument that technology companies fail in the face of a new technology because it either develops too quickly for them to cope or lies outside their engineers’ core competency, making the transition too difficult.

The amount of data that can be written to a disk is largely determined by the read/write head. Initially, heads were made of ferrite oxide. Between 1976 and 1989, engineers improved the performance of ferrite-oxide heads through various techniques, pushing density from 1 Mbpsi (megabits per square inch) up to 20 Mbpsi over that time. This progress satisfied the customers’ desire for increased data storage, but ferrite-oxide heads reached their performance limit by the end of the period.
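
To put that in perspective, a quick calculation (mine, assuming steady compound growth between the two endpoints the book reports) shows the sustained improvement rate this implies:

    # Implied compound annual growth rate of areal density for ferrite-oxide
    # heads. Assumption (mine): steady year-over-year growth between the two
    # reported endpoints, 1 Mbpsi in 1976 and 20 Mbpsi in 1989.
    start_density, end_density = 1.0, 20.0   # megabits per square inch
    years = 1989 - 1976                      # 13 years

    cagr = (end_density / start_density) ** (1 / years) - 1
    print(f"implied annual improvement: {cagr:.1%}")   # ~25.9% per year

That kind of relentless, roughly 26-percent-a-year improvement is exactly what customer-driven processes are built to deliver.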

Thin-film heads, developed around 1983 and introduced to market in 1985, represented a radical technological change for the industry. Yet thin-film technology pushed performance along the same data-density curve, providing the continued improvement that ferrite-oxide heads clearly could not. Established market leaders were able to invest in development, design new hard drive models, and maintain their dominance. Very few new entrants survived, even those that adopted thin-film heads first.

So the question remains: what is a “disruptive” technology, and why do large, successful firms seem to have a blind spot for it that causes them to fail? I will take that up in my next post.