Computer Generations: The Fabulous Five

The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage. He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based. In its most basic form, a computer is any device that helps humans perform various kinds of computations or calculations. Every computer supports some form of input, processing, and output, and this, in a nutshell, is what computing is all about: we input information, the computer processes it according to its built-in logic or the program currently running, and it outputs the results.
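The input → processing → output cycle described above can be sketched in a few lines of modern code. This is purely illustrative (the function and variable names are hypothetical, not taken from any historical machine):

```python
# A minimal sketch of the input -> processing -> output cycle.

def process(numbers):
    """The 'processing' step: apply the program's logic to the input."""
    return sum(numbers)

# Input: data supplied to the machine.
data = [2, 3, 5]

# Processing: the computer applies the currently running program's logic
# (here, simple addition).
result = process(data)

# Output: the result is reported back to the human operator.
print(result)  # prints 10
```

Whether the machine is built from vacuum tubes, transistors, or microprocessors, this same three-step cycle is what every generation of computer performs.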

First Generation Computers (1940 – 1950) – Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. The ENIAC (Electronic Numerical Integrator and Computer) is an example of a first-generation computing device. It was digital, although it did not operate on binary code, and it could be re-programmed to solve a complete range of computing problems.

In 1937 the first electronic digital computer was built by Dr. John V. Atanasoff and Clifford Berry; it was called the Atanasoff-Berry Computer (ABC). In 1943 an electronic computer named Colossus was built for the military. Other similar computers of this era included the German Z3, the LEO, the Harvard Mark I, and the UNIVAC.

Second Generation Computers (1955 – 1960) – Transistors

The second generation of computers came about thanks to the invention of the transistor, which was more reliable than the vacuum tube. Transistor computers consumed far less power, produced far less heat, and were much smaller than first-generation machines, albeit still big by today's standards. The transistor allowed computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though transistors still generated a great deal of heat that could damage the computer, they were a vast improvement over the vacuum tube.

The first transistor computer was created at the University of Manchester in 1953. In this generation, magnetic cores were used as primary memory, with magnetic tape and magnetic disks as secondary storage devices. The most popular transistor computer was the IBM 1401. IBM also created the first disk drive, the IBM 350 RAMAC, in 1956.

Third Generation (1964 – 1971) – Integrated Circuits

The invention of the integrated circuit brought us the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. With this invention, computers became smaller and were able to run many different programs at the same time. It also started the ongoing process of integrating an ever larger number of transistors onto a single microchip.

As a result of these successive improvements, the computer has come to be used in all areas of life. In 1980 the Microsoft Disk Operating System (MS-DOS) was born, and in 1981 IBM introduced the personal computer (PC) for home and office use. Three years later Apple gave us the Macintosh computer with its icon-driven interface, and the 1990s gave us the Windows operating system. The computer is a very useful tool that will continue to see new development as time passes.

Fourth Generation (1971 – Present) – Microprocessors

The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, placed all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip. The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.

This generation saw the use of time sharing, real-time systems, networks, and distributed operating systems. High-level languages such as C, C++, and dBASE were also used in this generation. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

Fifth Generation (Present and Beyond) – Artificial Intelligence

Fifth-generation computing devices, based on artificial intelligence (AI), are still in development, though some applications, such as voice recognition, are in use today. AI is an emerging branch of computer science that explores ways of making computers think like human beings. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. AI includes robotics, neural networks, game playing, expert systems that make decisions in real-life situations, and natural-language understanding and generation.

Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Read more:

http://www.historyofcomputer.org/

http://people.bu.edu/baws/brief%20computer%20history.html

http://www.webopedia.com/DidYouKnow/Hardware_Software/2002/FiveGenerations.asp

http://www.tutorialspoint.com/computer_fundamentals/computer_generations.htm
