The term ‘computer generation’ refers to each phase of development in the field of computers relative to the hardware used. There are five generations:
| GENERATION | TIME PERIOD | PROCESSING UNIT |
| --- | --- | --- |
| First Generation | From around 1940 to around 1956 | Vacuum Tubes |
| Second Generation | From around 1956 to around 1963 | Transistors |
| Third Generation | From around 1963 to around 1971 | Integrated Circuits (ICs) |
| Fourth Generation | From around 1971 to around 1990 | Microprocessors |
| Fifth Generation | From around 1990 onwards | Artificial Intelligence |
First Generation Computers – Vacuum Tubes
The first-generation computers used vacuum tubes for circuitry. The vacuum tube was invented in 1904 by John Ambrose Fleming, whose patent described the first practical electron tube, known as the ‘Fleming valve’.
The features of the computers from the first generation were:
- Magnetic drums were used as memory.
- Punched cards and paper tape were used for storage.
- Electrical failures were common, so these computers were unreliable.
- They consumed large amounts of electricity and produced a great deal of heat, so air conditioning was required.
- They were extremely large, often filling an entire room, and were therefore not portable.
- Programming was done in machine and assembly languages.
- They processed data at slow operating speeds.
The major computers from the first generation are as follows: ENIAC (Electronic Numerical Integrator and Computer), EDVAC (Electronic Discrete Variable Automatic Computer), EDSAC (Electronic Delay Storage Automatic Calculator) and UNIVAC I (built by the UNIVAC division of Remington Rand and operated through control panels with switches).
Second Generation Computers – Transistors
A transistor is a small semiconductor device that can amplify or switch electronic signals; its name is a blend of “transfer” and “resistor.” The scientists William Shockley, Walter Houser Brattain and John Bardeen developed the transistor at Bell Labs in 1947. Like vacuum tubes, transistors could be used as amplifiers or as switches, but they had several major advantages: they were a fraction of the size of vacuum tubes (typically about as big as a pea), drew far less power, and were far more reliable. The transistor was one of the most important breakthroughs in the history of computing, and it earned its inventors the world’s greatest science prize, the 1956 Nobel Prize in Physics.
The features of the computers from the second generation were:
- Magnetic core was used as memory.
- Hard disks and magnetic tape were used for storage.
- Much less heat was produced than by the first-generation computers, though cooling was still required.
- The computers were smaller than those of the first generation.
- Programming was done in assembly and machine languages.
- Electricity consumption was lower than in the first generation.
- They were well suited to scientific and bulk data-processing tasks.
- Frequent maintenance was required.
- The input and output devices used were teletypewriters and punched cards.
The major computers from the second generation are as follows: IBM 1400 series, IBM 1700 series, and Control Data 3600.
Third Generation Computers – Integrated Circuits (ICs)
Although transistors were a great advance on vacuum tubes, one key problem remained. Machines that used thousands of transistors still had to be hand wired to connect all these components together. That process was laborious, costly, and error-prone. Thus, in 1958, Jack Kilby invented the “monolithic” integrated circuit (IC): a collection of transistors and other components that could be manufactured all at once, in a block, on the surface of a semiconductor. Kilby’s invention was another step forward, but it also had a drawback: the components in his integrated circuit still had to be connected by hand. While Kilby was making his breakthrough in Dallas, unknown to him, Robert Noyce was perfecting almost exactly the same idea in California. Noyce found a way to include the connections between components in an integrated circuit, thus automating the entire process. Commercial ICs were launched in 1961 and led to a massive increase in the speed and efficiency of these machines.
The features of the computers from the third generation are:
- Core memory and DRAM chips were used as memory.
- Hard disks and floppy disks were used for storage.
- Less human labor was required for assembly.
- The computers became smaller, more reliable and faster.
- Programming was done in high-level languages such as FORTRAN and BASIC.
- Power consumption was lower than in the previous generations.
- The input and output devices used were keyboards and printers.
- The size of the main memories increased substantially.
The major computers from the third generation are as follows: IBM 360 series, IBM 370/168 series, ICL 1900 series, ICL 2900, Honeywell Model 316, Honeywell 6000 series, ICL 2903, and CDC 1700.
Fourth Generation Computers – Microprocessors
As the 1960s wore on, integrated circuits became increasingly sophisticated and compact. Soon, engineers were speaking of large-scale integration (LSI), in which hundreds of components could be crammed onto a single chip, and then very large-scale integration (VLSI), in which the same chip could contain thousands of components. The logical conclusion of all this miniaturization was that, someday, someone would be able to squeeze an entire computer onto a chip. The fourth generation of computers is marked by the creation of microprocessors and Very Large Scale Integration (VLSI) circuits. The first microprocessor was developed at Intel, led by Marcian “Ted” Hoff.
Fourth-generation computers became more compact, reliable and affordable. By 1974, Intel had launched a popular microprocessor known as the 8080, and computer hobbyists were soon building home computers around it. The first was the MITS Altair 8800, built by Ed Roberts. With its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and laptops. Even so, it sold by the thousands and gave rise to the personal computer (PC) revolution.
The Altair inspired a Californian electronics wizard named Steve Wozniak to develop a computer of his own. “Woz” is often described as the hacker’s “hacker”: a technically brilliant and highly creative engineer who pushed the boundaries of computing largely for his own amusement. In the mid-1970s, he was working at Hewlett-Packard in California. After seeing the Altair, Woz used a 6502 microprocessor (made by an Intel rival, MOS Technology) to build a better home computer of his own: the Apple I. When he showed off his machine, his friend Steve Jobs persuaded him to build a business around it. Woz agreed and, famously, the two set up Apple Computer in a garage belonging to Jobs’ parents. After selling around 175 Apple I machines at the devilish price of $666.66, Woz built a much better machine called the Apple ][ (pronounced “Apple Two”). Launched in April 1977, it was the world’s first easy-to-use home “microcomputer.” Soon home users, schools, and small businesses were buying the machine in their tens of thousands, at $1,298 a time.
Two things turned the Apple ][ into a really credible machine for small firms: a disk drive unit, launched in 1978, which made it easy to store data; and a spreadsheet program called VisiCalc, which gave Apple users the ability to analyze that data. In just two and a half years, Apple sold around 50,000 of these machines, quickly accelerating out of Jobs’ garage to become one of the world’s biggest companies. Dozens of other microcomputers were launched around this time, including the TRS-80 from Radio Shack (Tandy in the UK) and the Commodore PET.
Apple’s success selling to businesses came as a great shock to IBM and the other big companies that dominated the computer industry. In 1980, IBM finally realized it had to do something and launched a highly streamlined project to save its business. One year later, it released the IBM Personal Computer (PC), based on an Intel 8088 microprocessor, which rapidly reversed the company’s fortunes and stole the market back from Apple.
The PC was successful essentially for one reason. All the dozens of microcomputers that had been launched in the 1970s, including the Apple ][, were incompatible. All used different hardware and worked in different ways. Most were programmed using a simple, English-like language called BASIC, but each one used its own flavor of BASIC, which was tied closely to the machine’s hardware design. As a result, programs written for one machine would generally not run on another without a great deal of conversion. Companies that wrote software professionally typically wrote it for just one machine and, consequently, there was no software industry to speak of.
In 1976, Gary Kildall, a teacher and computer scientist, had figured out a solution to this problem. Kildall wrote an operating system (a computer’s fundamental control software) called CP/M that acted as an intermediary between the user’s programs and the machine’s hardware. All those machines could then run identical user programs, without any modification at all, inside CP/M, making the different microcomputers compatible at a stroke. By the early 1980s, Kildall had become a multimillionaire through the success of his invention: the first personal computer operating system.
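The intermediary role an operating system plays can be sketched in a few lines. This is a toy modern illustration, not how CP/M was actually written (CP/M was implemented in assembly and PL/M); all class and function names here are invented for the example:

```python
class DiskDriver:
    """Hardware-specific layer: each machine supplies its own implementation."""
    def read_sector(self, n):
        raise NotImplementedError

class MachineADisk(DiskDriver):
    # One vendor's hardware behaves one way...
    def read_sector(self, n):
        return f"machine-A sector {n}"

class MachineBDisk(DiskDriver):
    # ...another vendor's hardware behaves differently.
    def read_sector(self, n):
        return f"machine-B sector {n}"

class TinyOS:
    """The CP/M-like intermediary: one stable interface on every machine."""
    def __init__(self, disk):
        self.disk = disk

    def load(self, sector):
        return self.disk.read_sector(sector)

def run_program(os_layer):
    # A "user program" that runs unchanged on any machine,
    # because it only ever talks to the OS layer.
    return os_layer.load(0)
```

Because `run_program` never touches the hardware directly, the identical program works on both machines; only the driver beneath the OS layer changes.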
Naturally, when IBM was developing its personal computer, it approached him hoping to put CP/M on its own machine. Legend has it that Kildall was out flying his personal plane when IBM called, and so missed out on one of the world’s greatest deals. But the truth seems to have been that IBM wanted to buy CP/M outright for just $200,000, while Kildall recognized his product was worth millions more and refused to sell. Instead, IBM turned to a young programmer named Bill Gates. His then-tiny company, Microsoft, rapidly put together an operating system called DOS, based on a product called QDOS (Quick and Dirty Operating System), which it acquired from Seattle Computer Products. The IBM PC, powered by Microsoft’s operating system, was a runaway success.
Yet IBM’s victory was short-lived. Cannily, Bill Gates had sold IBM the rights to one flavor of DOS (PC-DOS) and retained the rights to a very similar version (MS-DOS) for his own use. When other computer manufacturers, notably Compaq and Dell, started making IBM-compatible (or “cloned”) hardware, they too came to Gates for the software. IBM charged a premium for machines that carried its badge, but consumers soon realized that PCs were commodities: they contained almost identical components—an Intel microprocessor, for example—no matter whose name they had on the case. As IBM lost market share, the ultimate victors were Microsoft and Intel, who were soon supplying the software and hardware for almost every PC on the planet. Apple, IBM, and Kildall made a great deal of money—but all failed to capitalize decisively on their early success.
Fortunately for Apple, it had another great idea. One of the Apple II’s strongest suits was its sheer “user-friendliness.” For Steve Jobs, developing truly easy-to-use computers became a personal mission in the early 1980s. He launched a project to build an easy-to-use computer, code-named PITS (Person In The Street). That project became the Apple Lisa, launched in January 1983, the first widely available computer with a GUI desktop. It paved the way for a better, cheaper machine called the Macintosh, which Jobs unveiled a year later, in January 1984. With a memorable launch ad inspired by George Orwell’s novel 1984 and directed by Ridley Scott (director of the dystopian movie Blade Runner), Apple took a swipe at IBM’s monopoly, criticizing what it portrayed as the firm’s domineering, even totalitarian, approach: Big Blue was really Big Brother. Apple’s ad promised a very different vision: “On January 24, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984’.” The Macintosh was a critical success and helped to create the new field of desktop publishing in the mid-1980s, yet it never came close to challenging IBM’s position.
Ironically, Jobs’ easy-to-use machine also helped Microsoft to dislodge IBM as the world’s leading force in computing. When Bill Gates saw how the Macintosh worked, with its easy-to-use picture-icon desktop, he launched Windows, a graphical front end that ran on top of his MS-DOS software. Apple saw this as blatant plagiarism and filed a $5.5 billion copyright lawsuit in 1988. Four years later, the case collapsed, with Microsoft effectively securing the right to use the Macintosh “look and feel” in all present and future versions of Windows. Microsoft’s Windows 95 system, launched three years later, had an easy-to-use, Macintosh-like desktop with MS-DOS running behind the scenes.
The salient features of the fourth-generation computers are:
- Microcomputer series were developed (such as the IBM PC and Apple).
- Portable computers were developed.
- Semiconductor memory chips were used as main memory.
- Hard disks, floppy disks, CDs, DVDs, Blu-ray discs, flash memory, and cloud storage were used for storage.
- The computers became smaller, more reliable and faster. Their storage capacities increased, and there were great developments in data communication.
- Computer costs came down rapidly, so Personal Computers (PCs) became common.
- Programming was done in high-level languages such as C, C++ and dBASE.
- The input and output devices increased in number.
Input devices – keyboards, mouse, joysticks, voice input, etc.
Output devices – printers, plotters, speakers, etc.
- No air conditioning was required to control the heat released, because the computers came with fans for heat dissipation.
Some of the major computers from the fourth generation are as follows: Intel x86 processors (80286, 80386, 80486), the Pentium (P5) and later dual-core and quad-core chips, PowerPC, AMD processors, the Apple Macintosh, IBM and Dell PCs, and several RISC (Reduced Instruction Set Computer) machines.
Fifth Generation Computers – Artificial Intelligence
Today, computers aren’t what they used to be: they’re much less noticeable because they’re much more seamlessly integrated into everyday life. Some are “embedded” into household gadgets like coffee makers or televisions. Others travel around in our pockets in our smartphones—essentially pocket computers that we can program simply by downloading “apps” (applications).
The fifth generation of computers marks the shift to a more technologically advanced and efficient era of computing. Artificial Intelligence is the focus of this generation, in which modern applications such as voice recognition and advanced robotics are being developed. The goal of future development is computers that can learn and self-organize while responding to natural-language input. Such computers will be able to classify information, search large databases quickly, and plan on the basis of their own reasoning and decision-making.
The features of the fifth generation of computers are:
- Parallel processing – many processors are grouped to function as one large processor.
- Superconductors – a superconductor is a conductor through which electricity travels without any resistance, allowing faster transfer of information between the parts of a computer.
- Artificial Intelligence makes everyday activities easier.
- Intelligent systems can control the route of a missile and defend against attacks.
- Word processors can recognize speech and transcribe it.
- Programs can now translate documents from one language to another with ease.
- AI has led to machines capable of tasks beyond human capabilities. Robots have been developed to do jobs currently done by humans.
- While AI is the focus of the fifth generation and beyond, personal computers, phones, and other recent technology continue to improve. For these, the input/output devices and the memory remain largely the same.
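The parallel-processing idea in the list above can be illustrated with a short sketch in modern Python: one job is split across several worker processes that run at the same time. The function names here are invented for the example, and the process pool stands in for the grouped processors:

```python
from multiprocessing import Pool

def square(n):
    # Each worker process computes its share of the job independently.
    return n * n

def parallel_squares(numbers, workers=4):
    # Distribute the inputs across several processes, then gather
    # the results back in order, as if one large processor did the work.
    with Pool(processes=workers) as pool:
        return pool.map(square, numbers)

if __name__ == "__main__":
    print(parallel_squares(range(8)))
```

For a tiny job like this, the process-startup overhead outweighs any speedup; the benefit appears only when each piece of work is substantial.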