Digital computer: Evolution, Components, & Features



Development of the digital computer

Blaise Pascal of France and Gottfried Wilhelm Leibniz of Germany invented mechanical digital calculating machines during the 17th century. The English inventor Charles Babbage, however, is generally credited with having conceived the first automatic digital computer. During the 1830s Babbage devised his so-called Analytical Engine, a mechanical device designed to combine basic arithmetic operations with decisions based on its own computations. Babbage’s plans embodied most of the fundamental elements of the modern digital computer. For example, they called for sequential control—i.e., program control that included branching, looping, and both arithmetic and storage units with automatic printout. Babbage’s device, however, was never completed and was forgotten until his writings were rediscovered over a century later.


Of great importance in the evolution of the digital computer was the work of the English mathematician and logician George Boole. In various essays written during the mid-1800s, Boole discussed the analogy between the symbols of algebra and those of logic as used to represent logical forms and syllogisms. His formalism, operating on only 0 and 1, became the basis of what is now called Boolean algebra, on which computer switching theory and procedures are grounded.
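To make the connection between Boole's formalism and switching circuits concrete, the following is a minimal illustrative sketch in Python (the function names and the Python rendering are illustrative additions, not Boole's notation or part of the Britannica text). It treats 0 and 1 as the only values, defines the three basic operations, and checks one of De Morgan's laws over every input pair, which is how truth tables are used in logic design.

    # Minimal sketch of Boolean algebra on {0, 1}, the formalism described above.
    # Function names are illustrative; Boole's own work used ordinary algebraic notation.

    def AND(a, b):   # conjunction: 1 only when both inputs are 1
        return a & b

    def OR(a, b):    # disjunction: 1 when either input is 1
        return a | b

    def NOT(a):      # negation: swaps 0 and 1
        return 1 - a

    # Tabulate the four possible input pairs, as a truth table would,
    # and verify De Morgan's law: NOT(a AND b) == (NOT a) OR (NOT b).
    for a in (0, 1):
        for b in (0, 1):
            assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
            print(a, b, AND(a, b), OR(a, b), NOT(a))

Any two-valued switching element (a relay, a vacuum tube, a transistor) can realize these operations physically, which is why Boolean algebra underlies the circuit designs described in the sections that follow.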

John V. Atanasoff, an American mathematician and physicist, is credited with building the first electronic digital computer, which he constructed from 1939 to 1942 with the assistance of his graduate student Clifford E. Berry. Konrad Zuse, a German engineer acting in virtual isolation from developments elsewhere, completed construction in 1941 of the first operational program-controlled calculating machine (Z3). In 1944 Howard Aiken and a group of engineers at International Business Machines (IBM) Corporation completed work on the Harvard Mark I, a machine whose data-processing operations were controlled primarily by electric relays (switching devices).

Since the development of the Harvard Mark I, the digital computer has evolved at a rapid pace. The succession of advances in computer equipment, principally in logic circuitry, is often divided into generations, with each generation comprising a group of machines that share a common technology.

In 1946 J. Presper Eckert and John W. Mauchly, both of the University of Pennsylvania, constructed ENIAC (an acronym for electronic numerical integrator and computer), a digital machine and the first general-purpose electronic computer. Its computing features were derived from Atanasoff’s machine; both computers included vacuum tubes instead of relays as their active logic elements, a feature that resulted in a significant increase in operating speed. The concept of a stored-program computer was introduced in the mid-1940s, and the idea of storing instruction codes as well as data in an electrically alterable memory was implemented in EDVAC (electronic discrete variable automatic computer).

The second computer generation began in the late 1950s, when digital machines using transistors became commercially available. Although this type of semiconductor device had been invented in 1948, more than 10 years of developmental work was needed to render it a viable alternative to the vacuum tube. The small size of the transistor, its greater reliability, and its relatively low power consumption made it vastly superior to the tube. Its use in computer circuitry permitted the manufacture of digital systems that were considerably more efficient, smaller, and faster than their first-generation ancestors.

The late 1960s and ’70s witnessed further dramatic advances in computer hardware. The first was the fabrication of the integrated circuit, a solid-state device containing hundreds of transistors, diodes, and resistors on a tiny silicon chip. This microcircuit made possible the production of mainframe (large-scale) computers of higher operating speeds, capacity, and reliability at significantly lower cost. Another type of third-generation computer that developed as a result of microelectronics was the minicomputer, a machine appreciably smaller than the standard mainframe but powerful enough to control the instruments of an entire scientific laboratory.

The development of large-scale integration (LSI) enabled hardware manufacturers to pack thousands of transistors and other related components on a single silicon chip about the size of a baby’s fingernail. Such microcircuitry yielded two devices that revolutionized computer technology. The first of these was the microprocessor, which is an integrated circuit that contains all the arithmetic, logic, and control circuitry of a central processing unit. Its production resulted in the development of microcomputers, systems no larger than portable television sets yet with substantial computing power. The other important device to emerge from LSI circuitry was the semiconductor memory. Consisting of only a few chips, this compact storage device is well suited for use in minicomputers and microcomputers. Moreover, it has found use in an increasing number of mainframes, particularly those designed for high-speed applications, because of its fast-access speed and large storage capacity. Such compact electronics led in the late 1970s to the development of the personal computer, a digital computer small and inexpensive enough to be used by ordinary consumers.

By the beginning of the 1980s integrated circuitry had advanced to very large-scale integration (VLSI). This design and manufacturing technology greatly increased the circuit density of microprocessor, memory, and support chips—i.e., those that serve to interface microprocessors with input-output devices. By the 1990s some VLSI circuits contained more than 3 million transistors on a silicon chip less than 0.3 square inch (2 square cm) in area.

The digital computers of the 1980s and ’90s employing LSI and VLSI technologies are frequently referred to as fourth-generation systems. Many of the microcomputers produced during the 1980s were equipped with a single chip on which circuits for processor, memory, and interface functions were integrated. (See also supercomputer.)

The use of personal computers grew through the 1980s and ’90s. The spread of the World Wide Web in the 1990s brought millions of users onto the Internet, the worldwide computer network, and by 2019 about 4.5 billion people, more than half the world’s population, had Internet access. Computers continued to become smaller and faster, and by the early 21st century they were ubiquitous, first in smartphones and later in tablet computers.

The Editors of Encyclopaedia Britannica

