
IBM’s Greatest Challenge?

The end game is Cloud

Prof Bill Buchanan OBE
Oct 12 · 11 min read

IBM has survived for over 100 years (in fact, over 107 years). While many companies have risen and fallen across the key eras of technology, IBM has sustained its lead. Who now remembers that Plessey, Fairchild, RCA, Rockwell, Burroughs, DEC and Sperry were all leaders in technology at one time? In the 1970s, IBM was in the Top 5 companies in the world, and it sustained a Top 10 position until the end of the millennium. By 2020, the top companies in the world are commonly tech companies (Amazon, Google, Facebook, AT&T and Intel), and IBM has slipped to 38th place in the Fortune 500:


IBM managed to innovate around mainframes and then the PC, but now they know the future … the Cloud. For them, most other things (services, software and systems running on-premise) are now seen as legacy. With the ever-increasing rise of the public cloud, IBM sees Amazon AWS and Microsoft Azure as some of the most significant threats it has ever faced to its core business … looking after its customers.

Personally, I see no other way but to move to the Cloud. Overall, Sun Microsystems were right when they said, ‘The network is the computer’. Within the public cloud, there is a whole new toolbox for building dynamic systems, in a way that was never thought possible.

IBM will now split into two companies: an IT infrastructure services unit (90,000 employees, with over 4,000 clients across the world) and its main Cloud/AI unit. Divestment runs through IBM’s whole history: most recently, they moved out of the PC business, then semiconductors, and then the low-end server market. The renewed focus on the Cloud builds on IBM’s $34 billion acquisition of Red Hat and shows the move towards cloud-based systems.

Will it work? Well, IBM are quite a way back from the leaders/visionaries of AWS, Microsoft and Google, and even behind Oracle, in building cloud infrastructure:


The History of IBM

We have seen five main waves in computer technology: mechanical computers; transistorised computers; microprocessor-based computers; networked computers; and the Cloud.

One of the first occurrences of computer technology was in the USA in the 1880s, driven by the American Constitution’s requirement that a census be undertaken every 10 years. As the population of the USA increased, it took an increasing amount of time to produce the statistics, and by the 1880s it looked likely that the 1880 census would not be complete until 1890. To overcome this, Herman Hollerith (who worked for the Government) devised a machine which accepted punch cards with information on them, allowing a current to pass wherever a hole was present.

Hollerith’s electromechanical machine was extremely successful and was used in the 1890 and 1900 Censuses. His company became part of the firm that would later become International Business Machines (IBM): CTR (the Computing-Tabulating-Recording Company). Unfortunately, the business fell into financial difficulties and was saved by a young salesman at CTR, named Tom Watson, who recognised the potential of selling punch-card-based calculating machines to American business. Watson eventually took over the company and, in the 1920s, renamed it International Business Machines Corporation (IBM). After this, electromechanical machines were sped up and improved. They would soon lead to electronic computers, using valves.


Figure: Punch cards

After the creation of ENIAC, progress in the computer industry was fast. By 1948, small electronic computers were being produced in quantity: within five years, 2,000 were in use; by 1961 it was 10,000; and by 1970, 100,000. IBM, at the time, had a considerable share of the computer market, so much so that a complaint was filed against them alleging monopolistic practices in their computer business, in violation of the Sherman Act. In January 1954, the US District Court made a final judgment on the complaint, and a ‘consent decree’ was signed by IBM, which placed limitations on how IBM conducted business with respect to ‘electronic data processing machines’.

Figure: ENIAC

In 1954, the IBM 650 was built and was considered the workhorse of the industry at the time (it used valves, and about 1,000 machines were sold). In November 1956, IBM showed how innovative they were by developing the first hard disk, as part of the RAMAC 305. It was towering by today’s standards, with 50 two-foot-diameter platters giving a total capacity of 5MB. Around the same time, the Massachusetts Institute of Technology produced the first transistorised computer: the TX-0 (Transistorized Experimental computer). Seeing the potential of the transistor, IBM quickly switched from valves to transistors and, in 1959, produced the first commercial transistorised computer: the IBM 7090/7094 series, which dominated the computer market for years.

Figure: RAMAC (the person is pointing at a 5MB hard disk)

In 1960, in New York, IBM went on to develop the first automatic mass-production facility for transistors. In 1963, the Digital Equipment Corporation (DEC) sold their first minicomputer, to Atomic Energy of Canada. DEC would become the main competitor to IBM, but would eventually fail as they dismissed the growth of the personal computer market.

Figure: DEC VAX 11

The second generation of computers started in 1961 when the great innovator, Fairchild Semiconductor, released the first commercial integrated circuit. In the next two years, significant advances were made in the interfaces to computer systems. The first was by Teletype, who produced the Model 33 keyboard and punched-tape terminal; it was a classic design and appeared on many of the available systems. The other advance was by Douglas Engelbart, who received a patent for the mouse pointing device for computers. The production of transistors increased, and each year brought a significant decrease in their size.

Figure: The first mouse

The third generation of computers started in 1965 with the use of integrated circuits rather than discrete transistors. IBM was again innovative, creating the System/360 mainframe, a true classic in the course of computing history. Then, in 1970, IBM introduced the System/370, which included semiconductor memories. These computers were the great computing workhorses of the time, but they were extremely expensive to purchase (approximately $1,000,000) and to maintain; most companies had to lease their computer systems, as they could not afford to buy them.

Figure: IBM System/360

As IBM happily clung to their mainframe market, several new companies were working away to erode their share. DEC would be the first, with their minicomputer, but it would be the PC companies of the future who would finally overtake them. The beginning of IBM’s loss of market share can be traced to the development of the microprocessor, and to one company: Intel. In 1967, though, IBM again showed their leadership in the computer industry by developing the first floppy disk. The growing electronics industry also started to entice new companies to specialise in key areas, such as International Research, who applied for a patent for a method of constructing double-sided magnetic tape using a Mumetal foil interlayer.

The beginning of the slide for IBM occurred in 1968, when Robert Noyce and Gordon Moore left Fairchild Semiconductor and met up with Andy Grove to found Intel Corporation. To raise the required finance, they went to a venture capitalist named Arthur Rock, who quickly found the start-up funding, as Robert Noyce was well known for being the first person to put more than one transistor on a piece of silicon. At the same time, IBM scientist John Cocke and others completed a prototype scientific computer called the ACS, which used some RISC (Reduced Instruction Set Computer) concepts. Unfortunately, the project was cancelled because it was not compatible with IBM’s System/360 computers.

Figure: First integrated circuit

In 1969, Hewlett-Packard branched into the world of digital electronics with the world’s first desktop scientific calculator: the HP 9100A. At the time, the electronics industry was producing cheap pocket calculators, and this led towards affordable computers when the Japanese company Busicom commissioned Intel to produce a set of between eight and 12 ICs for a calculator. Instead of designing a complete set of ICs, Ted Hoff, at Intel, designed an integrated circuit that could receive instructions and perform simple functions on data. The design became the 4004 microprocessor.

Intel thus produced a set of ICs which could be programmed to perform different tasks: the first-ever microprocessors. Soon Intel (short for Integrated Electronics) produced a general-purpose 4-bit microprocessor, named the 4004. In April 1970, Wayne Pickette proposed to Intel that they use the computer-on-a-chip for the Busicom project. Then, in December, Gilbert Hyatt filed a patent application entitled ‘Single Chip Integrated Circuit Computer Architecture’, the first basic patent on the microprocessor.

Figure: 4004 chip

The 4004 caused a revolution in the electronics industry, as previous electronic systems had a fixed functionality; with this processor, the functionality could be programmed in software. Amazingly, by today’s standards, it could only handle four bits of data at a time (a nibble), contained 2,300 transistors, had 46 instructions, and could address 4KB of program code and 1KB of data. From this humble start, the PC has since evolved using Intel microprocessors. Intel had already shown itself to be an innovative company, having produced the first commercial memory device (static RAM, which uses six transistors for each stored bit), the first DRAM (dynamic memory, which uses only one transistor per stored bit) and the first EPROM (which allows data to be downloaded to a device and then permanently stored).
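The four-bit limit is easy to picture. Here is a minimal Python sketch (a toy model of a 4-bit register, not actual 4004 code) showing how values above 15 wrap around modulo 16, with a carry flag signalling the overflow:

```python
def add_nibbles(a, b):
    """Add two 4-bit values; return the 4-bit result and a carry flag."""
    total = (a & 0xF) + (b & 0xF)   # mask inputs to 4 bits (0..15)
    return total & 0xF, total > 0xF  # keep low 4 bits; carry if > 15

# 9 + 8 = 17, which does not fit in a nibble: it wraps to 1 with carry set
result, carry = add_nibbles(9, 8)
print(result, carry)  # 1 True
```

Multi-nibble arithmetic on such a processor means chaining these carry flags by hand, which is one reason the jump to 8-bit processors (like the 8008) mattered so much.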

In the same year, Intel announced the 1KB RAM chip, a significant increase over previously produced memory chips. Around the same time, one of Intel’s major partners (and, as history has shown, competitors), Advanced Micro Devices (AMD) Incorporated, was founded. It started when Jerry Sanders and seven others left (yes, you’ve guessed it) Fairchild Semiconductor. The incubator of the electronics industry was producing many spin-off companies.

At the same time, the Xerox Corporation gathered a team at the Palo Alto Research Center (PARC) and gave them the objective of creating ‘the architecture of information.’ It would lead to many of the great developments of computing, including personal distributed computing, graphical user interfaces, the first commercial mouse, bit-mapped displays, Ethernet, client/server architecture, object-oriented programming, laser printing and many of the basic protocols of the Internet. Few research centres have ever been as creative, and forward-thinking as PARC was over those years.

In 1971, Gary Boone, of Texas Instruments, filed a patent application relating to a single-chip computer, and the 4004 microprocessor was released in November of that year, with Intel delivering it to Busicom. Then, in 1972, Intel was again truly innovative, becoming the first to develop an 8-bit microprocessor: the 8008. Excited by the new 8-bit microprocessors, two kids from a private high school, Bill Gates and Paul Allen, rushed out to buy the new 8008 devices. This, they believed, would be the beginning of the end for the large and expensive mainframes (such as the IBM range) and minicomputers (such as the DEC PDP range). They bought the processors for the high price of $360 (possibly a joke at the expense of the IBM System/360 mainframe), but even they could not make the 8008 support BASIC programming. Instead, they formed the Traf-O-Data company and used the 8008 to analyse ticker-tape read-outs of cars passing in a street. The company would close down in the following year (1973) after making $20,000, but from this enterprising start one of the leading computer companies in the world would grow: Microsoft (although it would initially be called Micro-soft).

Figure: Traf-o-Data

At the end of the 1970s, IBM’s virtual monopoly on computer systems started to erode: from the high-powered end, as DEC developed their range of minicomputers, and from the low-powered end, by companies developing computers based around the newly available 8-bit microprocessors, such as the 6502 and the Z80. IBM’s main contenders, other than DEC, were Apple and Commodore, who introduced a new type of computer: the personal computer (PC). The leading systems at the time were the Apple I and the Commodore PET.

Figure: Apple 1 computer

These captured the interest of the home user and, for the first time, individuals had access to cheap computing power. These flagship computers spawned many others, such as the Sinclair ZX80/ZX81, the BBC Microcomputer, the Sinclair Spectrum, the Commodore VIC-20 and the classic Apple II (all based on the 6502 or Z80). Most of these computers were aimed at the lower end of the market and were mainly used for playing games rather than business applications. IBM finally decided, on the advice of Bill Gates, to use the 8088 for its version of the PC, and not, as they had first thought, the 8080. Microsoft also persuaded IBM to introduce the IBM PC with a minimum of 64KB RAM, instead of the 16KB that IBM had planned.

Figure: IBM PC

In 1973, the model for future computer systems appeared at Xerox’s PARC, when the Alto workstation was demonstrated with a bit-mapped screen (showing the Cookie Monster, from Sesame Street). The following year, at Xerox, Bob Metcalfe demonstrated the Ethernet networking technology, which was destined to become the standard local area networking technique. It was far from perfect, as computers contended with each other for access to the network, but it was cheap and simple, and it worked relatively well.

IBM was also innovating at the time, creating a cheap floppy disk drive. They also produced the IBM 3340 hard disk unit (a Winchester disk), whose recording head sat on a cushion of air, 18 millionths of an inch above the platter. The disk had four 8-inch platters, giving a total capacity of 70MB.


IBM’s days of leading the field quickly became numbered once Compaq managed to reverse-engineer the BIOS, the software which allows the operating system to talk to the hardware. After this, IBM struggled to set standards in the industry, making several attempts to define new operating systems (such as OS/2) and new computer architectures (such as the MCA bus standard). The industry decided that common standards were more important than ones defined by a single company.


For over 100 years, IBM has been one of the top companies in the world. Is the cloud going to challenge them like never before? Are they too late to catch Amazon?

ASecuritySite: When Bob Met Alice

This publication brings together interesting articles related to cyber security.

Prof Bill Buchanan OBE

Written by

Professor of Cryptography. Serial innovator. Believer in fairness, justice & freedom. EU Citizen. Auld Reekie native. Old World Breaker. New World Creator.

