Evolution of Computer Generations: From Microchips to AI
The Third Generation of Computers: Integrated Circuits
The invention of the integrated circuit defined the transition to the third generation. In the mid-1960s, the integrated circuit, or microchip, was invented; it later led to Intel's development of the microprocessor. From then on, it became possible to pack many tiny transistors and other electronic components onto a single chip containing a complete circuit: an amplifier, an oscillator, or a logic gate. With these integrated circuits, it became much easier to assemble complicated devices, such as radio and television receivers and computers. The most remarkable aspects of this third generation are:
- In 1964, IBM announced the first machine built with integrated circuits, which was named the System/360.
- The introduction of software that is still used on large mainframe computers today.
The Fourth Generation of Computers: The Rise of the Microprocessor
The invention of the microprocessor marked the leap to the fourth generation. From 1970 to 1981, two improvements in computer technology marked the beginning of this generation:
- The replacement of magnetic-core memory with silicon-chip memory.
- The placement of many more components on a chip, a product of the micro-miniaturization of electronic circuits.
The small size of microprocessor chips made the personal computer (PC) possible. Integration technologies now allow hundreds of thousands of electronic components to fit on a single chip. Manufacturers could thus build a small computer rivaling the first-generation machines that had occupied an entire room.

Microcomputers, or PCs, made their appearance. The term "PC" comes from the "IBM PC", the model IBM released in 1981, which became the archetypal computer for "personal" use. Other companies soon standardized on the design and produced clones, called "PC compatibles": machines using the same type of processors as IBM's, sold at a lower cost, and able to run the same programs. There are other types of microcomputers, such as the Macintosh, that are not IBM-compatible, but in many cases they are also called "PCs".
The Fifth Generation of Computers: Artificial Intelligence
We are in the fifth generation of computers. In October 1981, the computer world was shaken by an announcement from Japan: a research and development initiative aimed at producing a new generation of computers during the 1990s, which was given the name "fifth-generation computers".

The computers of this generation were meant to solve very complicated problems, some of which require all the experience, intelligence, and reasoning ability of a person. They were to handle large subsets of natural language and operate over large knowledge bases. Despite their complexity, they were designed to be operated by people who are not computer experts. To achieve these ambitious goals, the machines would have not a single processor but a large number of them, clustered into three main subsystems: a knowledge-base system, an inference mechanism, and an intelligent user interface.

After ten years, the project ended without meeting its goals, and such computers are not being developed today. Even so, significant progress was made in telecomputing, in reducing the size and cost of equipment, in programming techniques, and in the development of machine intelligence, process control, and robotics. We can therefore say that the fifth generation of computers is characterized by:
- Artificial intelligence: They have functions that are characteristic of humans, such as simulating human vision, recognizing and imitating human speech, analyzing data, and drawing conclusions.
- Storage also changed in this fifth generation. Now, information is stored on diskettes, CDs, DVDs, and flash memory.
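The "inference mechanism over a knowledge base" idea described above can be illustrated with a tiny sketch. The facts and the single rule below are invented for the example (they are not from any real fifth-generation system), and real systems of that era used logic-programming languages rather than Python, but the principle of deriving new facts from stored ones is the same:

```python
# Minimal sketch of rule-based inference over a knowledge base.
# Facts are stored as (relation, subject, object) tuples; the rule
# "grandparent(X, Z) if parent(X, Y) and parent(Y, Z)" derives new facts.
# All names here are illustrative assumptions, not from the original text.

facts = {
    ("parent", "alice", "bob"),
    ("parent", "bob", "carol"),
}

def infer_grandparents(kb):
    """Apply the grandparent rule to every pair of parent facts."""
    derived = set()
    for rel1, x, y in kb:
        for rel2, y2, z in kb:
            if rel1 == "parent" and rel2 == "parent" and y == y2:
                derived.add(("grandparent", x, z))
    return derived

print(sorted(infer_grandparents(facts)))
# prints [('grandparent', 'alice', 'carol')]
```

A full inference engine would repeat this derivation step until no new facts appear (forward chaining), or work backward from a query (backward chaining), which is how Prolog-style systems of the period operated.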