Evolution of Operating Systems

History

Zeroth Generation (Late 1940s)

The first computer systems had no operating system. Users had direct access to the bare machine and coded every instruction by hand in machine language.

First Generation (1950s) – Vacuum Tubes and Plug-in Boards

In the earliest years of this period, there were still no operating systems. Vacuum tube technology made the machines enormous and very expensive, and users had to write all their programs in binary code.

Early First Generation

The operating systems of the 1950s were designed to smooth the transition between jobs. Before they appeared, considerable time was lost between the completion of one job and the start of the next. This marked the beginning of batch systems, in which jobs are gathered into groups, or batches. While a job was running, it had total control of the machine; when it finished, control returned to the operating system, which cleaned up and then read in and started the next job.

In the early 1950s, the introduction of punched cards for entering machine-language programs improved the process, as plug-in boards were no longer necessary.

It was also in this period that the General Motors Research Laboratory implemented the first operating system, for the IBM 701. Systems of the 1950s typically ran only one job at a time, and the transition between jobs was streamlined to maximize system utilization. This is known as single-stream batch processing, because programs and data were submitted in groups, or batches.

The introduction of the transistor in the mid-1950s significantly changed the landscape.

Late First Generation

Transistor-based machines became reliable enough to be installed in specially equipped locations, although only large universities, corporations, and government agencies could afford them.

To run a job (program), a programmer first wrote it on paper (in Fortran or assembly language) and then punched it onto cards. The programmer carried the card deck to the system's input area and handed it to an operator. When the computer finished the job, an operator retrieved the output from the printer and delivered it to the designated output area.

Second Generation (Early 1960s)

Operating systems in this era were characterized by the development of shared systems with multiprogramming, multiprocessing, and time-sharing principles.

In multiprogramming systems, several user programs reside in main storage simultaneously, and the processor rapidly switches between them.
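
As a rough modern illustration (a toy model, not period code), the following Python sketch interleaves several jobs the way a multiprogrammed processor switches among resident programs; run_multiprogrammed and make_job are names invented for this example.

    from collections import deque

    def run_multiprogrammed(jobs):
        """Toy round-robin: each job yields after every unit of
        simulated work, and the 'processor' switches to the next."""
        ready = deque(jobs)
        while ready:
            job = ready.popleft()
            try:
                next(job)          # give the job one time slice
                ready.append(job)  # not finished: back of the queue
            except StopIteration:
                pass               # job completed

    def make_job(name, units):
        for step in range(1, units + 1):
            print(f"{name}: step {step}/{units}")
            yield                  # give up the processor

    run_multiprogrammed([make_job("A", 3), make_job("B", 2), make_job("C", 4)])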

Multiprocessing systems utilize multiple processors within a single computer system to increase processing power.
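
Today the same idea is exposed directly to programmers; here is a minimal sketch using Python's standard multiprocessing module (the square function is invented for the example).

    from multiprocessing import Pool

    def square(n):
        return n * n

    if __name__ == "__main__":
        # Four worker processes; on a multiprocessor they can run in parallel.
        with Pool(processes=4) as pool:
            print(pool.map(square, range(10)))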

Device independence also emerged. In first-generation systems, a user writing data to a tape had to specifically reference a particular tape drive. In the second generation, the user program only specified that a file would be written to a tape drive with a certain number of tracks and density.
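
A minimal sketch of the idea, assuming a hypothetical allocation routine: the program describes the kind of tape it needs, and the system chooses any matching physical drive.

    from dataclasses import dataclass

    @dataclass
    class TapeDrive:
        unit: int          # physical unit number, hidden from user programs
        tracks: int
        density: int       # recording density, bits per inch
        free: bool = True

    def allocate_tape(drives, tracks, density):
        """Hypothetical allocator: the caller names characteristics,
        never a particular physical drive (device independence)."""
        for drive in drives:
            if drive.free and drive.tracks == tracks and drive.density == density:
                drive.free = False
                return drive
        raise RuntimeError("no suitable tape drive available")

    drives = [TapeDrive(0, 7, 556), TapeDrive(1, 9, 800), TapeDrive(2, 9, 800)]
    drive = allocate_tape(drives, tracks=9, density=800)
    print(f"writing file to unit {drive.unit}")   # unit chosen by the system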

Timesharing (shared) systems were developed, allowing users to connect directly to the computer through terminals. Real-time systems also appeared, in which computers were used for industrial process control; such systems are characterized by their immediate response times.

Third Generation (Mid-1960s to Mid-1970s)

This generation began in 1964 with the introduction of IBM’s System/360 family of computers. These computers were designed as general-purpose systems.

Systems were typically large and bulky, aiming to be all things to all users. They were multimode systems, with some supporting batch processing, timesharing, real-time processing, and multiprocessing concurrently.

They were also expensive on a scale never seen before, and many development efforts ran over budget and finished well behind schedule.

They also created more complex computing environments, a complexity that initially challenged users.

Fourth Generation (Mid-1970s Onwards)

Fourth-generation systems represent the current state of technology. Many designers and users still find them challenging, even after their experience with third-generation operating systems.

With the expanded use of computer networks and online processing, users can access geographically remote computers through various terminals.

Security systems have become much more robust, as information now travels through various vulnerable communication lines. Encryption has gained significant attention, as it’s become necessary to encrypt sensitive data so that even if exposed, it’s useless to anyone but the intended recipients.
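
As a minimal modern illustration of the idea (using the third-party Python cryptography package, not anything from this era), symmetric encryption makes intercepted data useless without the key.

    from cryptography.fernet import Fernet   # pip install cryptography

    key = Fernet.generate_key()        # secret shared only with the recipients
    cipher = Fernet(key)

    token = cipher.encrypt(b"sensitive payroll records")
    print(token)                       # ciphertext: useless if intercepted
    print(cipher.decrypt(token))       # only a key holder recovers the data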

The percentage of the population with computer access has dramatically increased since the 1980s and continues to grow rapidly.

The concept of virtual machines is now widely used. Users are no longer concerned with physical details of the computer system they’re accessing. Instead, they interact with a virtual machine created by the operating system.

Database systems have become crucial. Our world is information-oriented, and databases ensure that this information is accessible in a controlled manner to authorized users.

Types of Operating Systems

Batch Processing

Batch processing, called a batch sequence on microcomputers, involves executing a list of operating system commands one after another without user intervention.
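
A tiny sketch of this microcomputer sense of the term, with placeholder commands: each command runs to completion before the next begins, with no user intervention.

    import subprocess

    # Hypothetical batch sequence: placeholder commands, run strictly in order.
    commands = [
        "echo step 1: compile",
        "echo step 2: run tests",
        "echo step 3: package",
    ]

    for cmd in commands:
        # check=True aborts the sequence if a command fails.
        subprocess.run(cmd, shell=True, check=True)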

On larger computers, it refers to collecting programs and data sets from users, executing them one or a few at a time, and allocating the machine's resources among them.

Batch processing can also refer to storing transactions over a period before sending them to a master file, usually done overnight as a separate operation.
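
A sketch of this transaction-batching sense, with hypothetical account data: updates accumulate during the day and are applied to the master file in a single overnight pass.

    # Master file of account balances (hypothetical data).
    master = {"acct-1": 100.0, "acct-2": 250.0}

    # Transactions stored over the day, applied later as one batch.
    transactions = [("acct-1", -30.0), ("acct-2", 75.0), ("acct-1", 10.0)]

    for account, amount in transactions:      # the overnight pass
        master[account] = master.get(account, 0.0) + amount

    print(master)   # {'acct-1': 80.0, 'acct-2': 325.0}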

Batch Operating Systems

Batch operating systems process programs in groups (batches) rather than individually.

These systems loaded a program into memory from tape and ran it. When the program ended, it jumped to a fixed memory address where the operating system resumed control, loaded the next program, and ran it.
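
That control flow can be sketched as follows (a toy model, not real monitor code; job_tape and the load/run helpers are invented for the illustration).

    def batch_monitor(job_tape):
        """Toy resident monitor: load a job, run it, regain control, repeat."""
        for program in job_tape:       # the input tape, modeled as a list
            memory = load(program)     # read the program into memory
            run(memory)                # transfer control to the job
            # When the job ends, control 'jumps back' here: the monitor
            # cleans up and immediately starts the next job.

    def load(program):
        return program                 # stand-in for reading from tape

    def run(memory):
        print(f"running {memory}")     # stand-in for executing the job

    batch_monitor(["payroll", "inventory", "billing"])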

This significantly reduced the time between jobs.

Early batch systems were characterized by grouping similar jobs together; later systems batch jobs by other criteria. The defining characteristic of a batch system is the lack of interaction between the user and the job while it is running.

The job is prepared and submitted. The output appears later.

Batch operating systems process large amounts of work with minimal or no interaction between users and running programs.

They perform the work common to all jobs in a single stream, avoiding the wait times that arise when two or more jobs are processed serially by hand.

These are the oldest and most traditional systems, introduced around 1956 to increase program-processing capacity.

Well-planned batch systems can achieve very high throughput: the processor is used efficiently, and the operating system itself can remain simple because jobs execute sequentially.

Examples of successful batch operating systems include SCOPE, used for intensive scientific processing on the CDC 6600, and UNIVAC EXEC II, used for academic processing on the UNIVAC 1107.