
The main stages in the development of computer technology


Computer science is a field of science and technology that combines the means of automating mathematical calculation and information processing across a wide range of human activities.

There are four periods in the history of computer science:

  1. Pre-mechanical period (from ancient times to the middle of the 17th century).

A simple and quite efficient counting device, known as early as the 5th–4th centuries BC, was the abacus: a board of wood, bronze, stone, ivory, or colored glass with strip-shaped grooves into which counting pips or balls were placed.

Calculations were made by moving the pebbles or balls in a certain order.

The great Renaissance artist and inventor Leonardo da Vinci sketched a 13-digit adding device based on ten-toothed wheels.

  2. Mechanical period (from the middle of the 17th century to the end of the 19th century).

As trade and shipping developed, the volume of calculation grew greatly. To predict the tides, scientists observed the Moon and compiled enormous tables recording changes in its position, which were used to check the accuracy of various formulae for the Moon's motion. To ease and speed up such checking, work began on computational mechanisms.

The first calculating machine capable of performing addition and subtraction was built in 1642 by the 18-year-old French mathematician and physicist Blaise Pascal. The Pascaline contained a set of vertically arranged wheels with the digits 0 to 9 inscribed on them. When one such wheel made a complete revolution, it engaged the neighboring wheel and advanced it by one mark.
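The wheel-and-carry mechanism described above is, in essence, decimal carry propagation. A minimal sketch of it in Python (the function name and the digit-list representation are illustrative choices, not part of any historical design), with the least significant wheel first:

```python
def turn_wheel(wheels, position, steps):
    """Advance one wheel by `steps` marks. A full revolution past 9
    carries one mark into the neighboring wheel, as in Pascal's design."""
    wheels = list(wheels)
    wheels[position] += steps
    i = position
    # Propagate the carry left-to-right through the higher wheels.
    while i < len(wheels) and wheels[i] > 9:
        carry, wheels[i] = divmod(wheels[i], 10)
        if i + 1 < len(wheels):
            wheels[i + 1] += carry
        i += 1
    return wheels

# Turning the units wheel of 5 (stored as [5, 0, 0]) by 7 marks
# yields 12: the units wheel shows 2 and the tens wheel advances to 1.
print(turn_wheel([5, 0, 0], 0, 7))  # -> [2, 1, 0]
```

Adding 1 to 99 (`turn_wheel([9, 9, 0], 0, 1)`) shows the carry rippling through several wheels at once, the case that made Pascal's mechanical design difficult.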

A calculating machine that mechanically performed all four operations of addition, subtraction, multiplication, and division was first built around 1670 by the great German scientist Gottfried Wilhelm Leibniz. Machines of this type came to be called "arithmometers."

In 1834 Charles Babbage, a professor at Cambridge University, designed the Analytical Engine, whose operation was based on programs stored on punched cards and manual data input. Because the technology of the day was not advanced enough, Babbage was never able to complete the machine.

The first programs for the Analytical Engine were written by Ada Augusta Lovelace, daughter of the English poet George Gordon Byron.

  3. Electromechanical period (from the late 19th century to the 1940s).

The turn of the 19th and 20th centuries saw a transition from mechanical to electromechanical computers. In 1937 in the USA, the mathematician Howard Aiken made the first successful attempt to create a universal digital machine. Officially named the Automatic Sequence Controlled Calculator, it became better known as the Mark I. Aiken worked on the first version of the machine until 1944, when it was completed in cooperation with IBM. The machine was program-controlled: programs were set up on panels and switches.

  4. Electronic period (from the 1940s to the present).

In 1936 the famous English mathematician Alan Turing introduced the concept of an abstract computing machine. In a paper published that year in the Proceedings of the London Mathematical Society, Turing showed that, in principle, any algorithm can be carried out by such a discrete automaton.
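Turing's abstract machine consists of a finite set of states, an unbounded tape, and a transition table mapping (state, symbol) pairs to (new symbol, head move, new state). A toy sketch in Python (the rule table and helper below are illustrative assumptions, not Turing's own notation) runs such an automaton on a binary-increment program:

```python
def run(tape, rules, state="start", pos=0, blank="_", max_steps=1000):
    """Execute a transition table on a tape until the machine halts."""
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, blank)
        new_symbol, move, state = rules[(state, symbol)]
        tape[pos] = new_symbol
        pos += 1 if move == "R" else -1   # move the head one cell
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Binary increment: walk right to the end of the number,
# then move left, turning 1s to 0s until a 0 (or blank) absorbs the carry.
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run("1011", INCREMENT))  # -> 1100
```

The point of Turing's result is that this tiny interpreter, not the particular rule table, is all the machinery an algorithm ever needs: any computation can be expressed as such a table.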

In 1937 John Atanasoff, an American physicist of Bulgarian descent and an associate professor at Iowa State College, formulated the principles of an automatic computer built on vacuum-tube circuits for solving systems of linear equations.

John Mauchly put forward his own design for a military computer in August 1942. In May 1943 the project was commissioned by the U.S. Army's Ballistic Research Laboratory. The work was carried out at the University of Pennsylvania under the direction of Mauchly and the electronics engineer J. Presper Eckert, who set out to build a computer on vacuum tubes instead of electromechanical relays. They named their machine ENIAC, the Electronic Numerical Integrator and Computer. ENIAC was put into service on February 15, 1946. In it, the storage of information and the arithmetic and logical operations, performed in the decimal number system, were carried out by circuits containing some 18,000 vacuum tubes. The machine computed a hundred thousand times faster than a human and a thousand times faster than the most advanced mechanical calculators of the time: the use of electronic tubes brought a qualitative leap in computing speed.

ENIAC was intended for compiling ballistic tables for U.S. coastal defense guns. Its first practical use, however, was solving problems for the top-secret atomic bomb project at the Los Alamos Laboratory in New Mexico. Later it was used mainly for military purposes: producing artillery firing tables and tables for the precise dropping of bombs from aircraft.
