Debugging Computer Generation, Part 2

Abdul Satar
Published in To The Next Level
Mar 18, 2017 · 7 min read

Hi, and welcome back to our journey through computer evolution. In my last article I wrote about The Big Bang of the Computer (History), Part 1; feel free to read it, and I will be very pleased to hear from you.

Back to the journey: we now continue walking through Part 2, Debugging Computer Generation, covering the greatest innovations, the events that marked each generation, and how the computer got into our lives.

First Generation (1950)

In the early days, computers were destined only for scientists, the military, and engineers.

The inventors of ENIAC, J. Presper Eckert and John Mauchly, envisioned that the computer could be used in other fields, and went on to develop the first business computer, called UNIVAC (Universal Automatic Computer). At the time people did not understand the real potential of the computer, so it did not attract much attention. But all of that was about to change: in 1952, for the first time, a computer was used to predict the U.S. presidential election. From that day on, computers have been used to make predictions.

UNIVAC computer

As we know from the last post, ENIAC had to be reprogrammed every time a new task had to be done, and since it used vacuum tubes that could not run for long without burning out, they constantly had to be replaced. These problems led the UNIVAC developers to use the concept of the stored program, so that computers no longer had to be physically reprogrammed for every new task.

The big challenge of this generation was that operating a computer demanded a great deal from people. Programming a computer was not easy, because it had to be programmed in binary numbers, also known as machine language (0s and 1s). Binary is the only thing the machine can understand directly (e.g. 00101101001010), and as you can see, it is difficult for a human to read.
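To get a feel for why machine language is so hard for people to read, here is a minimal Python sketch. The instruction format and opcodes below are made up purely for illustration; they do not correspond to any real first-generation machine.

    # Purely illustrative: a made-up 8-bit instruction format,
    # not the encoding of any real first-generation computer.
    OPCODES = {0b00: "LOAD", 0b01: "ADD", 0b10: "STORE", 0b11: "HALT"}

    def decode(instruction):
        opcode = (instruction >> 6) & 0b11    # top 2 bits: the operation
        operand = instruction & 0b111111      # bottom 6 bits: the address
        return f"{OPCODES[opcode]} {operand}"

    word = 0b01001010                         # what the machine sees: just bits
    print(bin(word), "->", decode(word))      # prints: 0b1001010 -> ADD 10

The raw pattern of bits means nothing to most humans, while the decoded mnemonic at least hints at what the machine will do.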

Other characteristics of this generation's computers were:

  • Consumed a lot of electricity
  • Very costly
  • Generated a lot of heat
  • Huge in size

Second Generation

Since the first-generation machines had problems with vacuum tubes, which kept burning out during processing, the transistor, invented at Bell Laboratories in 1947, became the solution to those problems and the key to the second computer generation.

A transistor is a semiconductor device used to amplify or switch electronic signals and electrical power. (Wikipedia)

Transistor

The great advantage of this innovation over the vacuum tube was that computers drastically shrank in size, power consumption, and heat output, and somewhat in price (prices fell, but computers were still fairly expensive). Machines of this generation started to look like the computers we use today, but they still used punched cards for input, along with components such as tape storage, disk storage, and printers.

Another gain of this generation was that programs no longer needed to be written in binary, thanks to the introduction of high-level programming languages. Writing programs became much easier because these languages read more like a human language such as English, and the same program could then run on another machine.

The first two high-level programming languages were COBOL (Common Business-Oriented Language), used mostly for business, and FORTRAN (Formula Translation), used by scientists and engineers.
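As a rough illustration of how a high-level language reads compared with the raw bits above (written in Python rather than COBOL or FORTRAN, purely to keep every sketch in this series in one language), the same "add two numbers" idea becomes almost plain English:

    # The "add" idea expressed in a high-level language:
    # readable by people, and portable across different machines.
    def add(first_value, second_value):
        return first_value + second_value

    print(add(32, 10))   # prints 42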

Third Generation (Mid-1960s to Mid-1970s)

Looking back at the second generation, anyone who wanted to run a program had to wait their turn until the machine was free. The process went like this: the user handed their punched cards to the computer operator, who fed the cards into the machine, had the task processed, and returned the results to the user.

To deal with this problem, time sharing was introduced, which enabled several users to use the same computer simultaneously, each from a terminal display with a keyboard for input.
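The heart of time sharing is giving each user's job a short slice of the machine in turn. The toy Python loop below is only a sketch of that round-robin idea, not how real time-sharing operating systems were implemented:

    # Toy round-robin scheduler: each job gets one short slice of the
    # machine per turn until it finishes (illustration only).
    from collections import deque

    jobs = deque([("alice", 3), ("bob", 1), ("carol", 2)])   # (user, slices still needed)

    while jobs:
        user, remaining = jobs.popleft()
        print(f"running one slice for {user}")
        if remaining > 1:
            jobs.append((user, remaining - 1))   # not finished: back of the queue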

The key innovation of this generation was the Integrated Circuit (IC).

An integrated circuit is a set of electronic circuits on one small flat piece (or "chip") of semiconductor material, normally silicon. (Wikipedia)

Integrated Circuit.

More than 5,000 transistors could be placed on a single integrated circuit. This technology came with the promise of drastically reducing the cost of computers; by then, small companies and universities could afford one.

Many different types of programming languages were also invented, and IBM (International Business Machines), one of the leaders in this field, started to sell software and hardware. Once software and hardware were sold separately, the software industry was born.

This generation also introduced open architecture: people could personalize their computers, and accessories could be produced by different companies, turning the computer into an open market.

Fourth Generation (1975)

Things were getting exciting in the technology field by this time. With integrated circuits, components kept getting smaller; enough transistors (about 5,000) could now be placed on a single chip of silicon to build an entire processing unit.

Dr. Ted Hoff, an Intel Corporation engineer, had to design integrated circuits for a calculator maker. The problem was that the circuit had to be redesigned over and over again for every new model. The solution was to build a tiny computer on a single chip, resulting in the world's first microprocessor, the Intel 4004.

A microprocessor is a computer processor which incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit (IC). (Wikipedia)

Microprocessor Intel 4004

During this period companies did not take this big innovation, the microprocessor, seriously; other people thought it was just a toy. In the mid-1970s two young men named Steve (Jobs and Wozniak) had the vision that they could make a computer so simple and intuitive that everyone could use it. To fund this dream, Jobs sold his Volkswagen van to raise enough capital, and in April 1976 they founded Apple Computer with their Apple I.

The market was open, and then came the Apple II, which was a tremendous success: it had a keyboard, monitor, floppy disk drive, and operating system, and was based on the MOS Technology 6502 microprocessor.

IBM did not want to be left behind, so it contracted Microsoft Corporation to produce the operating system, called MS-DOS. By then, Apple and IBM were the leaders of the computer industry.

The computers produced at this time were called PCs (Personal Computers). They were not easy to use: to run a program, the user had to type commands on a keyboard at a command-line interface (CLI).
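To picture what "typing a command" meant, here is a toy command loop in Python. It is only a sketch of the CLI idea, not MS-DOS or any real shell:

    # Toy command-line loop: read a command from the keyboard and act on it.
    while True:
        command = input("> ").strip().lower()
        if command == "exit":
            break
        elif command == "hello":
            print("hello, user")
        else:
            print(f"unknown command: {command}")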

PC-Personal Computer

Xerox's Palo Alto Research Center (PARC) developed the first graphical user interface (GUI) in the 1970s. Users could interact with programs using a mouse (invented earlier by Douglas Engelbart and refined at PARC) by clicking on icons that represented functions. PARC's technologies influenced the computers that followed, culminating in Apple's Macintosh in 1984.

A lot changed in the software industry as well. Programming languages were becoming ever higher-level, object-oriented languages were introduced, and programs could make use of reusable components.
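As a small illustration of "reusable components" (sketched in Python to keep one language across this series, although the object-oriented languages of that era were Smalltalk and, later, C++), a class is written once and then reused by different programs:

    # A reusable component: written once, reused wherever it is needed.
    class Logger:
        def __init__(self, name):
            self.name = name

        def log(self, message):
            print(f"[{self.name}] {message}")

    # Two different "programs" reuse the same component instead of rewriting it.
    billing = Logger("billing")
    payroll = Logger("payroll")
    billing.log("invoice created")
    payroll.log("salaries paid")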

Fifth Generation

Each computer generation was marked by some hardware evolution: the first generation by the vacuum tube, the second by the transistor, the third by the integrated circuit, and the fourth by the microprocessor.

For this generation, one of the targets is to use the concept of parallel processing to support Artificial Intelligence (AI).
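To give a flavor of what parallel processing means in practice, here is a minimal Python sketch using the standard multiprocessing module. It simply splits independent work across several processor cores; it is not how any particular AI system is built:

    # Minimal parallel-processing sketch: split independent work across CPU cores.
    from multiprocessing import Pool

    def heavy_task(n):
        # Stand-in for some expensive computation.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with Pool(processes=4) as pool:              # four worker processes in parallel
            results = pool.map(heavy_task, [10_000, 20_000, 30_000, 40_000])
        print(results)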

Artificial intelligence concerns any device that perceives its environment and takes actions that maximize its chance of success at some goal. (Wikipedia)

This generation is still in progress. Natural language processing, voice recognition, image processing, virtual reality, and supercomputers are some examples of what is being built in this generation.

Supercomputer

A great deal of research is being done in the field of AI, using technologies like superconductors and parallel processing. There are other emerging technologies that will completely transform the computer, such as quantum computing, molecular computing, and nanotechnology.

Before you go…

This was the second part of our journey through computer evolution. The next post is about how the computers of the future are going to look: The Computer of the Tomorrow, Part 3.

If you enjoyed this article, let me know in the comments or by hitting the heart button to help other people see it.
