Antikythera mechanism - ancient Greek computing device (100 BC)

The history of the first computing mechanism begins in ancient Greece. The mechanism, consisting of 37 bronze gears and four dials and intended, scientists believe, to calculate the movements of celestial bodies, was found in 1901 aboard a sunken ancient ship near the Greek island of Antikythera. The find dates to approximately 100-150 BC. This ancient astronomical computer calculated the positions of the five planets known at the time and performed mathematical calculations.

The surviving fragments of the Antikythera mechanism are kept in the National Archaeological Museum in Athens. Unfortunately, we will never know who invented this device that was so far ahead of its time.

Computing device idea

Computer (from the English "computer", meaning "calculator") - a device that performs a given sequence of operations, most often numerical calculations and data manipulation.

Electronic computer - a computer whose functionality is based on electronic components: vacuum tubes, semiconductors, resistors, and capacitors.

The history of the invention of the first computer arguably begins with the ideas of the famous Italian inventor. Back in the 15th century, Leonardo da Vinci sketched in his diaries an adding device based on toothed wheels (although Leonardo never got beyond the drawings: the technology of his time was too primitive to realize his ideas).

Only two centuries later did the brilliant mathematician Blaise Pascal manage, with great difficulty, to bring his project of a mechanical adding machine, the "Pascaline," to life.

The history of the invention of computers divides into distinct eras: counting with pebbles and bones evolved into the ancestor of the modern counting frame; the era of gears and levers gave humanity Pascal's mechanical calculator; later the world saw Babbage's Difference Engine; and finally, having mastered electricity, man was able to build the electronic computer.

What is a computer and what is not? von Neumann machine

John von Neumann laid down the fundamental principles by which modern computers are still built today. The von Neumann architecture is the well-known principle of storing commands and data together in computer memory. In other words, both the data and the program code that operates on that data reside in the same memory (RAM).

A typical diagram of a von Neumann computing machine is presented below. It consists of four main components:

  1. Arithmetic logic unit
  2. Control unit
  3. RAM
  4. I/O device
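The stored-program principle described above can be illustrated with a tiny sketch. The instruction names and memory layout here are invented for illustration only; the point is that code and data share one and the same memory array:

```python
# A minimal sketch of the von Neumann idea (not any real instruction set):
# program instructions and data live side by side in ONE memory list,
# and the control unit fetches both from the same address space.

def run(memory):
    pc = 0          # program counter
    acc = 0         # accumulator inside the ALU
    while True:
        op, arg = memory[pc]                   # fetch an instruction
        if op == "LOAD":    acc = memory[arg]  # read a data cell
        elif op == "ADD":   acc += memory[arg]
        elif op == "STORE": memory[arg] = acc  # write a data cell
        elif op == "HALT":  return memory
        pc += 1

# Cells 0-3 hold code, cells 4-6 hold data -- the very same list.
mem = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None),  # program
    2, 3, 0,                                                # data
]
run(mem)
print(mem[6])  # 2 + 3 -> 5
```

Because the program is just another pattern of values in memory, it can itself be loaded, copied, or modified like data, which is exactly what distinguishes a stored-program computer from a hard-wired calculator.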

When asking who invented the first computer, it is necessary to distinguish between mechanical computing devices and electronic computers. The first electronic digital computer is considered to be the ABC (Atanasoff-Berry Computer), developed by physicist John Atanasoff and Clifford Berry at Iowa State College between 1937 and 1942. So, officially, the history of the invention of the first computer dates from 1942.

The era of mechanical calculators

The abacus - ancient ancestor of the counting frame


The very first computing device was the abacus, an invention more than two thousand years old. The abacus was a wooden board with grooves along which pebbles were moved. The same principle of operation can be seen in modern counting frames, distant relatives of the abacus.

Pascal's first mechanical calculator

Pascal's mechanical computer. The laurels of inventor of the first working mechanical calculating machine belong to the French mathematician, physicist, and inventor Blaise Pascal (June 19, 1623 - August 19, 1662). His mechanical adding machine could perform the four basic arithmetic operations. During his short life, Pascal produced about 50 of these mechanical calculators.

Charles Babbage was an English mathematician and the creator of the Analytical Engine, a prototype of the modern computer. The idea of the Analytical Engine rested on the same principles as a modern digital computer: an input-output device, memory cells, and an arithmetic unit. Babbage's mechanical computer performed algebraic calculations, i.e., it operated on variables.

Konrad Zuse's electromechanical computer Z1

In 1938, the German engineer Konrad Zuse, using his own funds, constructed the first mechanical programmable digital machine. It was driven by an electric motor and sat on two tables pushed together, occupying an area of about 4 square meters. Had the wartime bombing that destroyed the Z1 not occurred, the history of the invention of the first computer might be counted from 1938.

In the same year, Zuse began creating a more advanced model, the Z2, based on telephone relays. In 1941, Zuse created the Z3, a prototype of the modern computer. The Z3 could be programmed in binary code, performed calculations on floating-point numbers, had a data storage device, and could read programs from punched tape. Zuse planned to build the next-generation Z machine using vacuum tubes, but because of the German military campaign he was denied funding.

After the war, Zuse continued to develop computer technology in his own company, Zuse KG, which was later bought by Siemens. Konrad Zuse was not only a brilliant inventor but also a talented artist.

Computer Colossus

Computer "Colossus" - a top-secret British project

During World War II, German radio operators used a special encryption algorithm to transmit secret data.

To speed up the decryption of German messages, the British engineer Tommy Flowers, together with the department of Max Newman, created the Colossus decryption machine in 1943.

The Colossus computer used a large number of vacuum tubes, and information was fed in from punched tape. The work of Flowers and Newman went unrecognized because it remained classified for a long time: Winston Churchill personally signed the order to break the deciphering machines up into pieces. Because of this strict secrecy, Colossus went unmentioned in histories of computing for decades.

John Atanasoff's first electronic computer ABC

In 1942, John Atanasoff, together with Clifford Berry, completed the first electronic digital computer, the ABC. This electronic machine was not programmable, but it was the world's first computer without moving parts (relays, cam mechanisms, and so on). Today, and by law, the title of inventor of the first computer based on electronic components belongs to John Atanasoff.

For a long time it was believed that the invention of the first computer belonged to Eckert and Mauchly, but after lengthy litigation, in 1973 federal judge Earl Larson invalidated their patent, recognizing John Atanasoff as the inventor of the first electronic computer.

The Eckert-Mauchly computer ENIAC

In 1946, John Mauchly and John Presper Eckert, together with staff of the Moore School of Electrical Engineering at the University of Pennsylvania, developed a large electronic computer for military purposes: ENIAC, the Electronic Numerical Integrator and Computer. ENIAC was built on vacuum tubes, which greatly accelerated data processing. The machine weighed 27 tons. All calculations were performed in the decimal system. To change the program being executed, ENIAC had to be rewired. Its enormous (for the time) computing power was used first for military purposes and later for weather forecasting.

What are computers made of?

At the heart of any computer are an arithmetic logic unit (ALU, the processor), memory for storing intermediate results, and an input-output device. The first computer components were implemented with relays and vacuum tubes. Later, with the advent of transistors and integrated circuits, the size of computers decreased dramatically while computing power, on the contrary, increased.

Vacuum triode - the basis of the first electronic computers

The first computers used vacuum triodes (radio tubes), invented by Lee de Forest in 1906. A triode consists of three elements sealed under vacuum in a glass envelope: a cathode, an anode, and a grid between them. A voltage is applied between the anode and cathode, and the current flowing between them can be changed by applying different potentials to the grid. Thus the triode can be switched between two states: on and off. The triode (today, the transistor) is a gate, the discrete unit of a computer from which more complex logic circuits are built.
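How more complex logic is built from a single on/off element can be shown with a short sketch. Here each gate plays the role of a triode-based switch; NAND is chosen because every other Boolean gate can be composed from it (the function names are purely illustrative):

```python
# Treating a triode (or transistor) as an on/off switch, all logic can be
# composed from one primitive gate. NAND is a classic choice.

def nand(a, b):
    return 0 if (a and b) else 1   # off only when both inputs are on

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# Half adder: adds two bits, the kind of circuit early ALUs were built from.
def half_adder(a, b):
    return xor_(a, b), and_(a, b)   # (sum, carry)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chaining half adders (plus carry handling) yields multi-bit addition, which is essentially how an arithmetic logic unit emerges from thousands of such switches.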

Besides radio tubes, passive electronic components were also widely used: resistors and capacitors. It was the radio tubes, however, that failed most often. This follows from the very design of these vacuum devices: any radio tube has a service life, and a rather short one (compared with a semiconductor transistor, for example). Over time the tube's cathode loses emission and the tube becomes unusable.

RAM of the first computers

The first RAM was implemented on ferrite rings assembled into a matrix. This memory stored information as the direction of magnetization of small ferrite cores: the magnetization direction of one ring stores one bit. This method of storing data remained common until the mid-1970s.
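The matrix addressing idea can be sketched as follows. This is a simplified model, not a circuit-accurate simulation: each ring sits at the crossing of one row wire and one column wire, and selecting a row and a column touches exactly one ring. One historical detail is modeled: reading a core was destructive, so the controller had to write the value back:

```python
# Simplified model of core-memory addressing: one ferrite ring per
# (row, column) crossing, one bit per ring.

class CoreMatrix:
    def __init__(self, rows, cols):
        # 0/1 = the two possible magnetization directions of each ring
        self.bits = [[0] * cols for _ in range(rows)]

    def write(self, x, y, bit):
        self.bits[x][y] = bit          # magnetize a single ring

    def read(self, x, y):
        bit = self.bits[x][y]
        self.bits[x][y] = 0            # sensing the ring resets it...
        self.write(x, y, bit)          # ...so the controller restores it
        return bit

mem = CoreMatrix(4, 4)   # 16 rings -> 16 bits of storage
mem.write(2, 3, 1)
print(mem.read(2, 3))    # 1
```

A real core plane of this era held thousands of rings threaded by hand, but the addressing principle was exactly this row-and-column selection.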

History of the invention of computers: the present day

After the invention of the semiconductor transistor (1947) and the integrated circuit (1952), computer design reached a qualitatively new level. Thanks to their small size, high switching speed, and low power consumption, semiconductor devices and integrated circuits made it possible to build high-speed computers for every area of application.

IBM can be called the inventor of the first personal computer, or more precisely of the open IBM PC architecture: a modular design with expansion slots and support for software and hardware from various companies. The IBM PC standard remains the dominant architecture on which modern computers are built.

The first IBM personal computer, the IBM PC 5150, set a new standard in the microcomputer industry.

Moore's law and the future of computers

Gordon Moore's law is an empirical observation (which held remarkably well until recently) predicting a doubling of the number of transistors on a processor die approximately every 24 months. Thanks to the efforts of giants of the CPU and GPU industry such as Intel and Nvidia, we live in an amazing era of virtualization and of computer games whose graphics are indistinguishable from Hollywood films.
The transistor count of Intel processors is approaching two billion, yet the chip itself fits on a fingernail. By combining computing cores on a single die, and the processors themselves on a common motherboard, developers have achieved fantastic computing power. Special effects and virtual reality, the modeling of complex biological processes, astronomy and astrophysics are just a few of the fields where powerful modern computers help humanity develop rapidly and understand the world around us.
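The doubling rule above is easy to turn into a formula: N(t) = N0 * 2^((t - t0) / 2), with t in years. As a rough sanity check, the sketch below starts from the commonly cited figure of 2,300 transistors in the Intel 4004 (1971) and extrapolates; the exact counts are illustrative, not a claim about any specific product line:

```python
# Moore's law as a formula: transistor count doubles roughly every
# 24 months. Baseline: ~2,300 transistors in the Intel 4004 (1971).

def transistors(year, n0=2300, t0=1971, doubling_years=2):
    return n0 * 2 ** ((year - t0) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{transistors(year):,.0f}")
```

Forty years of doubling every two years is a factor of 2^20, about a million: from a few thousand transistors in 1971 to the billions quoted in the text.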

Few people know that the mathematical foundations of computer science and computing technology appeared in the Russian Empire. Who invented the first Russian computer, what BESM is, who benefits from the machine replacing the proletariat, and why there is not a single significant computer manufacturer in the country: T&P publish a chapter from Loren Graham's book "Can Russia Compete?", published by Mann, Ivanov and Ferber.

Russians were also pioneers in the development of computing devices, electronic computers, and the mathematical foundations of computer science. In the last years of the Russian Empire, Russian engineers and scientists took important steps toward the development of computing devices. During the Soviet period, a whole group of mathematicians, among them Vladimir Kotelnikov, Andrei Kolmogorov, Israel Gelfand, and others, made significant contributions to information theory. Soviet scientists and engineers created the first digital electronic computer in continental Europe. When American and Soviet engineers began collaborating on space exploration, in some cases the Soviet engineers "calculated" problems much faster than their American counterparts. In subsequent years, however, interest in computers shifted increasingly into the commercial sphere, and the Soviet Union could not withstand the competition. Soviet computer scientists and engineers were forced to abandon their own developments and adopt IBM standards. Today, not a single significant computer manufacturer from Russia is represented on the international market.

“Few in the West know that two years earlier, the Russian logician Viktor Shestakov put forward a similar theory of relay circuits based on Boolean algebra, but he did not publish his work until 1941.”

Russians showed scientific activity quite early in the development of computing devices, information theory, and computers. Even before the 1917 revolution, Russian engineers and scientists had made significant progress in this area. The Russian naval engineer and mathematician Alexei Krylov (1863-1945) was interested in applying mathematical methods to shipbuilding; in 1904 he created an automatic device for solving differential equations. Another young engineer, Mikhail Bonch-Bruevich (1888-1940), also working in St. Petersburg, studied vacuum tubes and their use in radio engineering. Around 1916, he invented one of the first two-position relays (the so-called cathode relay) based on an electrical circuit with two cathode tubes.

One of the pioneers of information theory in the West was Claude Shannon. In 1937, he defended his master's thesis at the Massachusetts Institute of Technology, demonstrating that relay circuits, combined with the binary number system, could be used to solve problems in Boolean algebra. The results of Shannon's work form the basis of the theory of digital circuits for computers. But few in the West know that two years earlier, in 1935, the Russian logician Viktor Shestakov had put forward a similar theory of relay circuits based on Boolean algebra; he did not publish his work until 1941, four years after Shannon. Neither Shannon nor Shestakov knew anything of each other's work.

The first electronic computer in continental Europe was created in secrecy in 1948-1951 in a place called Feofaniya near Kyiv. Before the revolution there was a monastery here, surrounded by oak forests and flowering meadows rich in berries and mushrooms, where wild animals and birds were found. In the early Soviet years, a psychiatric hospital occupied the monastery buildings; converting religious institutions into research or medical facilities was a fairly common practice in the Soviet state. During World War II, all of the hospital's patients were killed or disappeared, and the buildings were destroyed. In spring and autumn the road to the place was so damaged that it was impassable, and even in good weather one bounced over the potholes. In 1948, the dilapidated buildings were handed over to the electrical engineer Sergei Lebedev for the creation of an electronic computer. In Feofaniya, Lebedev, 20 engineers, and 10 assistants developed the Small Electronic Calculating Machine (MESM), one of the fastest computers in the world at the time, with many interesting characteristics. Its architecture was completely original and did not resemble that of the American computers, which were then the only machines in the world superior to it.

“He usually took his papers and the candle into the bathroom, where he spent hours writing ones and zeros.”

Alisa Grigorievna Lebedeva, on the life of her husband Sergei Lebedev, founder of Soviet computer technology, in Moscow in 1941 during the German air raids.

Sergei Lebedev was born in 1902 in Nizhny Novgorod (later renamed Gorky; its historical name was restored not long ago). His father was a school teacher who was often transferred from place to place, so Sergei spent his childhood and youth in various cities, mainly in the Urals. When his father was transferred to Moscow, Sergei entered the Moscow Higher Technical School, known today as Bauman Moscow State Technical University. There Lebedev became interested in high-voltage engineering, a field that required good mathematical training. After graduating he stayed on to teach while doing research in the Laboratory of Electrical Networks. Lebedev was an avid mountaineer and later named one of his computers after Elbrus, the highest peak in Europe, which he had successfully climbed.

In the late 1930s, Lebedev became interested in the binary number system. In the fall of 1941, when Moscow was plunged into darkness to escape the fascist air raids, his wife, a musician, recalled that "he usually took his papers and a candle to the bathroom, where he spent hours drawing ones and zeros." Later in the war he was transferred to Sverdlovsk (now Yekaterinburg), where he worked for the military industry. Lebedev needed a computer capable of solving differential and integral equations, and in 1945 he created Russia's first electronic analog computer. By then he already had the idea of creating a digital computer based on the binary number system. Interestingly, as far as we know, he was at that time unfamiliar with the work in this area of either his compatriot Shestakov or the American Claude Shannon.

Mastering the first personal computers at the Department of Electrical Systems and Networks of St. Petersburg State Polytechnic University

In 1946, Lebedev was transferred from Moscow to Kyiv, where he began work on a computer. In 1949, Mikhail Lavrentyev, a leading mathematician and member of the Academy of Sciences of the Ukrainian SSR who was familiar with Lebedev's work, wrote to Stalin asking him to support work in computing technology, emphasizing its importance for the country's defense. Stalin instructed Lavrentyev to create a laboratory for modeling and computing technology, and Lavrentyev invited Lebedev to head it. Lebedev now had both funding and status. At the same time, Stalin's order demonstrated the role of political power, indeed of one man, in advancing technology in the Soviet Union.

Lebedev developed MESM just three or four years after the creation of the world's first electronic computer, ENIAC, in the USA and simultaneously with the British EDSAC. By the early 1950s, MESM was being used to solve problems in nuclear physics, space flight, rocketry, and electrical power transmission.

In 1952, following the creation of MESM, Lebedev developed another computer - BESM (short for Large (or High-speed) Electronic Computing Machine). It was the fastest computer in Europe, at least for some period, capable of competing with the world's best developments in this field. It was a triumph. BESM-1 was produced in a single copy, but subsequent models, especially BESM-6, were produced in hundreds and used for different purposes. Production of BESM-6 was discontinued in 1987. In 1975, during the joint Soyuz-Apollo space project, Soviet specialists processed the parameters of the Soyuz orbit on BESM-6 faster than the Americans.

But after such a promising start in computing, Russia today lags behind the industry leaders. The reason for this failure can only be understood by analyzing the history of the industry, taking into account the social and economic factors that shaped it. In the leading Western countries, the field of computing after World War II was driven by three main forces: the scientific community, the state (in its military applications), and the business community. The roles of science and government were especially important at the initial stage; the role of business emerged later. Computing in the Soviet Union was successful as long as the development of these devices depended primarily on scientific achievement and government support, and government support was unlimited whenever computing served air defense or nuclear weapons research. Then, however, business became the main driving force in the West. Symbolic of this transition were General Electric's decision in 1955 to purchase IBM 702 computers to automate payroll and other paperwork at its Schenectady plant, and Bank of America's decision in 1959 to automate its processes (using the ERMA computer created at the Stanford Research Institute).

"The concept of cybernetics contradicted Marx's theory of dialectical materialism, and computer science was characterized as a particularly harmful attempt by Western capitalists to make more profit by replacing workers"

These decisions marked the beginning of the large-scale computerization of banking and business. In the 1960s and 1970s, electronic computers became commercial products, bringing the cost reductions and ease-of-use improvements the market demanded. The Soviet Union, with its planned economy and centralized, non-competitive market, could not keep up with these technological improvements. As a result, in the 1970s the USSR abandoned its initially impressive attempt to chart its own independent course in computing and adopted IBM standards. From then on, in computer technology the Russians found themselves, and remain, in the position of catching up; they never again became leaders. Sergei Lebedev died in 1974. Another leading scientist and developer of the first Soviet computers, Bashir Rameev, deeply regretted the decision to adopt the IBM architecture until his death in 1994. The Soviet computing industry was not brought down by a lack of knowledge; it was brought down by the irresistible force of the market.

Another factor, although not decisive in this particular case, was ideology. In the 1950s, Soviet ideologists were deeply skeptical of cybernetics, calling it "the science of obscurantists." In 1952, a Marxist philosopher branded the field a "pseudoscience," questioning the claim that computers could help explain human thought or social activity. In another article published a year later, entitled "Who Does Cybernetics Serve?", an anonymous author writing under the pseudonym "Materialist" argued that the concept of cybernetics contradicted Marx's theory of dialectical materialism and characterized computer science as a particularly harmful attempt by Western capitalists to extract more profit by replacing wage workers with machines.

Although such ideological accusations could in theory have hampered the development of computing technology in the USSR, computer development, given the military-industrial complex's interest in it, continued at the same pace. As one Soviet scientist in the field told me in 1960, "We were doing cybernetics, we just didn't call it cybernetics." Moreover, in the late 1950s and early 1960s, cybernetics made a 180-degree turn in the Soviet Union and began to be extolled as a science serving the purposes of the Soviet state.

In 1961, a collection was even published entitled "Cybernetics at the Service of Communism," and cybernetics departments opened in many Soviet universities. A more serious political threat to the development of computing in the USSR arose with the advent of the personal computer. The Soviet leadership liked computers while they were huge installations in central government, military, and industrial departments, but was far less enthusiastic when computers moved into private apartments and ordinary citizens could use them to disseminate information uncontrollably. In an attempt to control the transfer of information, the state had long prohibited ordinary citizens from owning printers and copying machines. A personal computer with a printer was the equivalent of a small printing press. But what could the Soviet authorities do about it?

The most heated debates among members of the Soviet leadership over computers occurred in the mid and late 1980s. In 1986, I discussed this problem with the leading Soviet scientist in this field, Andrei Ershov. He was frank, agreeing that the Communist Party's desire to control information was hindering the development of the computer industry. Then he said the following: “Our leadership has not yet decided what a computer looks like: a printing press, a typewriter or a telephone, and much will depend on this decision. If they decide that computers are like printing presses, they will want to continue to control the industry the same way they currently control all printing presses. Citizens will be prohibited from buying them; they will only be in institutions. On the other hand, if our leadership decides that computers are like typewriters, they will be allowed to be owned by citizens, the authorities will not seek to control every device, although they may try to control the dissemination of information that is produced with their help. And in the end, if management decides that computers are like telephones, most citizens will have them and they can do whatever they want with them, but online data transmission will be checked from time to time.

“Today in Russia there is not a single computer manufacturer that is a significant player in the international market, despite the fact that the Russians can rightfully claim that they were among the pioneers in the field.”

I am convinced that eventually the government will have to allow citizens to own and control personal computers. Moreover, it will become obvious that personal computers are not like any previous communication technology: not like printing presses, not like typewriters, not like telephones. On the contrary, they are a completely new type of technology. The time will soon come when any person anywhere in the world will be able to communicate almost continuously with any other person anywhere in the world. This will be a real revolution - not only for the Soviet Union, but for you too. But here its consequences will be most significant.”

This statement clearly shows what a complex problem computers posed for the Soviet state. The issue, however, quickly lost its relevance: five years after this conversation with Ershov, the Soviet Union collapsed, and with it control over communication technologies ceased (though this did not extend to control over the media, in particular television). The computer industry of modern Russia has never made up the lag it accumulated in the last years of the Soviet state. As we have seen, this lag was caused more by an inability to compete in the market than by political control, although the latter played a role. Today there is not a single Russian computer manufacturer that is a significant player on the international market, even though Russians can rightfully claim to have been among the pioneers of computing technology.

Before Apple: a Soviet engineer creates a personal computer and receives a patent for it!

Did you know that the world's first personal computer was created not by Steve Jobs and Steve Wozniak in a Palo Alto garage, but by a modest Soviet design engineer, Arseny Anatolyevich Gorokhov, at the Omsk Research Institute of Aviation Technologies?

Let's rewind time.

The 1950s. Computers are huge, bulky, and expensive. MIT's "Whirlwind" of 1951, the first machine to output data to a screen, has just 512 bytes of RAM and occupies a two-story building. Another American machine, the Univac, has a magnetic metal tape drive and a high-speed printer, but weighs 13 tons and costs about $1.5 million. The Bendix G-15, released in 1956, is billed as a mini-computer; in fact it weighs 450 kg and costs at least $50,000. Not one of these machines deserves to be called personal.

The 1960s. Computers become faster, more powerful, and more compact. The first commercial computer equipped with a keyboard and monitor, the PDP-1, is released in the USA. The new device is the size of three refrigerators, and its price is tens of times lower than that of a conventional large computer. A big step forward, but not enough for widespread adoption: only 50 units were sold in total.

The first claimant to the title of "home" computer is the Honeywell Kitchen Computer, introduced in the US in 1969. It weighed about 65 kg, cost $10,600, and was a pedestal with a built-in cutting board and a panel of lights and buttons. It performed only one function: storing recipes. Working with the "kitchen computer" required two weeks of training, because recipes were displayed in binary code. Nobody was willing to buy such an expensive "cookbook."

The 1970s. With the creation of the first microprocessor, the era of personal computers begins. Inventors compete to build their own models. The American entrepreneur Edward Roberts is the first to realize the potential of the 8-bit Intel 8080 microprocessor, released in 1974, and builds a microcomputer around it: the Altair 8800. Thanks to a wholesale deal with Intel ($75 per microprocessor against a retail price of $360), Roberts sets a record price for his invention: just $397! An advertisement on the cover of the respected magazine Popular Electronics in 1975 does its job, and in the first month the developers sell several thousand Altair 8800s. The order arrives as a surprise to buyers, however: the kit consists of a set of parts and a box for the case. Users have to solder, test, and write machine-language programs themselves. (Which is not entirely a bad thing: it was on the Altair 8800 that Microsoft founders Bill Gates and Paul Allen tested their famous Basic interpreter.)

Be that as it may, Roberts' computer is a godsend for hobbyists, while "mere mortals" are still left without the technology. To their rescue in 1976 come Steve Wozniak and Steve Jobs, who decide to sell their Apple I, assembled for personal use in a garage in Palo Alto, California. The new computer costs $666.66, and its main advantage is that, unlike the Altair 8800 and many other machines of the time, the Apple I is offered fully assembled. All that is needed is a case, a keyboard, and a monitor. Those, too, would be included two years later with the mass-produced Apple II, which added color and sound. Such is the history of the personal computer.

Stop, stop, stop... But what about the Soviet scientist and the Research Institute of Aviation Technologies?!

Oh yes! I completely forgot. The history of the personal computer has its dark pages, too.

Here is how it was. Back in 1968, eight years before the first Apple, the Soviet electrical engineer Arseny Anatolyevich Gorokhov invented a machine entitled "a device for specifying a program for reproducing the contour of a part." So, at least, it is described in the patent, copyright certificate No. 383005, dated May 18, 1968. The name is no accident: the device was intended primarily for producing complex engineering drawings. The inventor himself prefers to call his device a programmable "intellector."

According to the drawings, the "intellector" had a monitor, a separate system unit with storage, a device for solving problems autonomously and communicating personally with the computer, a motherboard, memory, a video card, and more: everything except a computer mouse.

Omsk electromechanical engineer Arseny Gorokhov 45 years ago invented a device that is now called a Personal Computer

According to the website Omsk Time, it is unfortunately impossible to see the world's first personal computer today: the institution where it was created, the classified ("mailbox") Omsk Research Institute of Aviation Technologies, has been closed for several years. The author of the invention still has the patent, with its description of the "programmable intellect device," and an entry in the Russian book of records DIVO: 45 years ago, in 1968, the Omsk electromechanical engineer Arseny Gorokhov invented a device that is now called a personal computer.

Today Gorokhov uses his personal computer mainly as a typewriter. By his account, it was new five years ago, and an "upgrade," that is, a modernization, is expensive; his pension would not cover it.

The components of a modern computer (monitor, system unit, keyboard) were present in Gorokhov's "intellector" as well, though under different names. The device was intended primarily for producing complex engineering drawings. Gorokhov also developed his own "software": a way of conversing with the machine without thick stacks of punched cards and a team of programmers. But things went no further than the All-Union patent: the invention was never given the green light, and in 1975 people learned that the term "personal computer" had been given to the world by the American company Apple.

Arseny Gorokhov's 40 author's certificates and patents, accumulated over three decades, brought him mostly moral satisfaction from his work. The material traces remain in the patent records: 20 rubles for each invention not put into series production. Had a new product been allowed to make its way into a "series," the author would have received a thousand times more. But the inventor did not always manage to crack the mysterious "law of luck." Now Gorokhov calculates his probable profits from the opposite direction: not "how much he received, but how much he could have."

"It is not oil that is Russia's future, but inventors" is the leitmotif of Gorokhov's latest article, "A System for the Accelerated Development of Inventions," published in the last, 12th, 2003 issue of the magazine Intellectual Property. It is a pity, he writes, that Russia has no practice like that of the United States, where the President meets with the head of the Patent Office twice a year. Increasingly, instead of a sense of pride one has to fall back on irony, says the author. The prospects are fading.

Now on the inventor's desk are a new form of the periodic table and preparations for spatial television. There has simply been no one interested in the ideas, apart from the occasional visiting journalist.

On the invention of the cell phone, see the article "The Mystery of the Cell"...

At the end of the 19th century, Herman Hollerith in America invented counting-and-punching (tabulating) machines, which used punched cards to store numerical information.

Each such machine could execute only one specific program, manipulating punched cards and the numbers punched on them.

Counting-and-punching machines performed perforation, sorting, summation, and the printing of numerical tables. They could solve many typical problems of statistical processing, accounting, and the like.

H. Hollerith founded a company producing counting-and-punching machines, which was later transformed into IBM, now the world's most famous computer manufacturer.

The immediate predecessors of computers were relay computing machines.

By the 1930s, relay automation had developed considerably, which made it possible to encode information in binary form.
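The idea that a bank of two-state relays encodes a number in binary can be sketched as follows (a minimal illustration in modern Python, not a model of period hardware):

```python
# A relay has two states: energized (1) or released (0).
# A row of relays therefore encodes an integer in binary.

def encode(value, width=8):
    """Return the relay states (most significant first) for a value."""
    if not 0 <= value < 2 ** width:
        raise ValueError("value does not fit in the relay bank")
    return [(value >> bit) & 1 for bit in reversed(range(width))]

def decode(states):
    """Read the number back off the relay states."""
    result = 0
    for bit in states:
        result = result * 2 + bit
    return result

print(encode(37))          # [0, 0, 1, 0, 0, 1, 0, 1]
print(decode(encode(37)))  # 37
```

Eight relays suffice for any number from 0 to 255; each additional relay doubles the range, which is why binary coding suited machines built from simple on/off elements.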

During the operation of a relay machine, thousands of relays switch from one state to another.

In the first half of the 20th century, radio engineering developed rapidly. The main elements of radio receivers and transmitters at that time were vacuum tubes.

Vacuum tubes became the technical basis for the first electronic computers.

The first computer, a general-purpose machine built on vacuum tubes, was constructed in the USA in 1945.

This machine was called ENIAC (Electronic Numerical Integrator and Computer). Its designers were J. Mauchly and J. Eckert.

The counting speed of this machine exceeded the speed of relay machines of that time by a thousand times.

The first electronic computer, ENIAC, was programmed by plugging and switching: the program was set up by connecting individual blocks of the machine with wires on a patch panel.

This complex and tedious procedure for preparing the machine for work made it inconvenient to use.

The basic ideas on which computer technology developed for many years were laid down by the great American mathematician John von Neumann.

In 1946, J. von Neumann, H. Goldstine, and A. Burks published the report "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument."

This article set out the principles of the design and operation of computers. Chief among them is the stored-program principle, according to which both the data and the program are placed in the machine's common memory.

A fundamental description of the structure and operation of a computer is usually called a computer architecture. The ideas presented in the article came to be known as the "von Neumann computer architecture."
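The stored-program principle can be illustrated with a toy machine whose instructions and data share a single memory. The three-field instruction format below is purely hypothetical, chosen only for this sketch:

```python
# Toy von Neumann machine: program and data live in the same memory.
# Hypothetical instructions: (opcode, operand_address, dest_address).

def run(memory):
    acc = 0   # accumulator
    pc = 0    # program counter: instructions are fetched from memory
    while True:
        op, addr, dest = memory[pc]
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[dest] = acc
        elif op == "HALT":
            return memory
        pc += 1

# Memory cells 0-3 hold the program; cells 4-6 hold the data.
memory = [
    ("LOAD", 4, None),    # acc = mem[4]
    ("ADD", 5, None),     # acc += mem[5]
    ("STORE", None, 6),   # mem[6] = acc
    ("HALT", None, None),
    2, 3, 0,              # data: 2, 3, and a result cell
]
run(memory)
print(memory[6])  # 5
```

Because the program sits in ordinary memory cells, it can be loaded, replaced, or even modified like any other data, which is exactly what freed machines from ENIAC-style rewiring.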

In 1949, the first computer with Neumann architecture was built - the English EDSAC machine.

A year later, the American EDVAC computer appeared. Both machines existed only as single copies. Serial production of computers began in the developed countries in the 1950s.

In our country, the first computer was created in 1951. It was called MESM (Small Electronic Calculating Machine). Its designer was Sergei Alekseevich Lebedev.

Under the leadership of S.A. Lebedev, the serial tube computers BESM-1 (Large Electronic Calculating Machine), BESM-2, and M-20 were built in the 1950s.

At that time, these machines were among the best in the world.

In the 1960s, S.A. Lebedev led the development of the semiconductor computers BESM-3M, BESM-4, M-220, and M-222.

The BESM-6 machine was an outstanding achievement of that period. It was the first domestic computer, and one of the first in the world, with a speed of 1 million operations per second. Lebedev's subsequent ideas and developments contributed to the creation of more advanced machines of later generations.

Electronic computing technology is usually divided into generations.

Generational changes were most often associated with changes in the element base of computers and with the progress of electronic technology.

This has always led to an increase in the computing power of the computer, that is, speed and memory capacity.

But this is not the only consequence of generational change. With such transitions, significant changes occurred in the computer architecture, the range of tasks solved on a computer expanded, and the method of interaction between the user and the computer changed.

The first generation of computers comprises the tube machines of the 1950s. The counting speed of the fastest first-generation machines reached 20 thousand operations per second (the M-20 computer).

Punched tapes and punched cards were used to enter programs and data.

Since the internal memory of these machines was small (it could hold several thousand numbers and program commands), they were mainly used for engineering and scientific calculations not related to the processing of large volumes of data.

These were rather bulky structures containing thousands of tubes, sometimes occupying hundreds of square meters and consuming hundreds of kilowatts of electricity.

Programs for such machines were written in machine-command languages. This was quite labor-intensive work.

Therefore, programming in those days was available to few.

In 1947, the first semiconductor device capable of replacing the vacuum tube was created. It was called the transistor. Transistors were quickly adopted in radio engineering.

Second generation of computers

In the 1960s, transistors became the element base of second-generation computers.

The transition to semiconductor elements improved the quality of computers in every respect: they became more compact, more reliable, and less power-hungry.

The speed of most machines reached tens and hundreds of thousands of operations per second.

The volume of internal memory increased hundreds of times compared with first-generation computers.

External (magnetic) memory devices developed considerably: magnetic drums and magnetic tape drives.

Thanks to this, it became possible to create information, reference and search systems on a computer.

Such systems are associated with the need to store large amounts of information on magnetic media for a long time.

In the second generation, high-level programming languages began to develop actively. The first of them were FORTRAN, ALGOL, and COBOL.

Writing a program no longer depended on the machine model; programming became simpler, clearer, and more accessible.

Programming as an element of literacy has become widespread, mainly among people with higher education.

The third generation of computers was created on a new element base: integrated circuits. Using very sophisticated technology, specialists learned to mount quite complex electronic circuits on a small wafer of semiconductor material less than 1 cm² in area.

These circuits were called integrated circuits (ICs).

The first ICs contained dozens, then hundreds, of elements (transistors, resistors, and so on).

When the degree of integration (the number of elements) approached a thousand, they began to be called large-scale integrated circuits (LSI); later, very-large-scale integrated circuits (VLSI) appeared.

Third-generation computers began to be produced in the second half of the 1960s, when the American company IBM launched the IBM-360 system of machines. These were IC-based machines.

A little later, machines of the IBM-370 series, built on LSI, began to be produced.

In the Soviet Union, production of the ES series of computers (Unified System of Computers), based on the IBM-360/370 models, began in the 1970s.

The transition to the third generation was associated with significant changes in computer architecture.

It became possible to run several programs on one machine simultaneously. This mode of operation is called multiprogramming.
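Multiprogramming, with the processor switching among several resident programs, can be sketched using Python generators standing in for programs (a purely illustrative model, not how third-generation operating systems were actually implemented):

```python
# Each "program" yields control back to the simulated operating system
# after every step; the OS interleaves the ready programs round-robin.

def program(name, steps):
    for i in range(steps):
        yield f"{name}: step {i}"

def scheduler(programs):
    trace = []
    while programs:
        prog = programs.pop(0)
        try:
            trace.append(next(prog))
            programs.append(prog)   # not finished: back of the queue
        except StopIteration:
            pass                    # program finished, drop it
    return trace

trace = scheduler([program("A", 2), program("B", 2)])
print(trace)  # ['A: step 0', 'B: step 0', 'A: step 1', 'B: step 1']
```

The point of the exercise: no program runs to completion before another starts; the processor's time is shared, which is what lets one machine serve several jobs (or users) at once.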

The operating speed of the most powerful computer models has reached several million operations per second.

On third-generation machines, a new type of external storage device appeared: magnetic disks.

Like magnetic tapes, disks can store an unlimited amount of information.

But magnetic disk drives work much faster than magnetic tape drives.

New types of I/O devices came into wide use: displays and plotters.

During this period, the areas of computer application expanded significantly. Databases, the first artificial intelligence systems, computer-aided design (CAD) systems, and automated control systems (ACS) began to be created.

In the 1970s, the line of small (mini) computers developed powerfully. The PDP-11 series of machines from the American company DEC became a de facto standard here.

In our country, the SM series of computers (System of Small Computers) was created on the basis of this model. These were smaller, cheaper, and more reliable than the large machines.

Machines of this type are well suited to controlling various technical objects: production plants, laboratory equipment, vehicles. For this reason they are called control machines.

In the second half of the 70s, the production of minicomputers exceeded the production of large machines.

Fourth generation of computers

Another revolutionary event in electronics occurred in 1971, when the American company Intel announced the creation of the microprocessor.

The microprocessor is a large-scale integrated circuit capable of performing the functions of the main unit of a computer: the processor.

A microprocessor is a miniature brain that works according to a program stored in its memory.

Initially, microprocessors were built into various technical devices: machine tools, cars, airplanes. Such microprocessors automatically control the operation of this equipment.

By connecting a microprocessor to I/O devices and external memory, engineers obtained a new type of computer: the microcomputer.

Microcomputers are fourth generation machines.

A significant difference between microcomputers and their predecessors was their small size (comparable to a household TV set) and their relatively low cost.

This was the first type of computer to appear in retail sale.

The most popular type of computer today is personal computers.

The emergence of the phenomenon of personal computers is associated with the names of two American specialists: Steve Jobs and Steve Wozniak.

In 1976, their first production PC, Apple-1, was born, and in 1977, Apple-2.

The essence of what a personal computer is can be briefly formulated as follows:

A PC is a microcomputer with user-friendly hardware and software.

The PC hardware kit includes:

    a color graphics display,

    mouse-type pointing devices,

    a joystick,

    a convenient keyboard,

    convenient compact disks (magnetic and optical).

Software allows a person to easily communicate with the machine, quickly learn the basic techniques of working with it, and benefit from the computer without resorting to programming.

Communication between a person and a PC can take the form of a game with colorful pictures on the screen and sound.

It is not surprising that machines with such properties quickly gained popularity, and not only among specialists.

The PC is becoming as common a household appliance as a radio or TV set. PCs are produced in huge quantities and sold in stores.

Since 1980, the American company IBM has become a trendsetter in the PC market.

Its designers managed to create an architecture that became the de facto international standard for professional PCs. The machines of this series were called IBM PC (Personal Computer).

In the late 80s and early 90s, Macintosh machines from Apple Corporation became very popular. In the USA they are widely used in the education system.

The emergence and spread of the personal computer in its significance for social development is comparable to the advent of book printing.

It was PCs that made computer literacy a mass phenomenon.

With the development of this type of machine, the concept of "information technology" emerged, which most areas of human activity can no longer do without.

There is another line in the development of fourth-generation computers: the supercomputer. Machines of this class have speeds of hundreds of millions and billions of operations per second.

The first supercomputer of the fourth generation was the American machine ILLIAC-4, followed by CRAY, CYBER, and others.

Among domestic machines, this class includes the ELBRUS multiprocessor computing complex.

Fifth-generation computers are the machines of the near future. Their main quality should be a high intellectual level.

Fifth-generation machines will realize artificial intelligence.

Much has already been practically done in this direction.

In 1673, the German scientist Gottfried Leibniz built a calculating machine that could perform all four arithmetic operations: addition, subtraction, multiplication, and division. It later became the prototype of the adding machine.

In the 19th century, the adding machine continued to be improved. Further development increased the accuracy and reliability of calculations.

In Russia, the first calculating machine was developed by P. Chebyshev in 1878. The device could add and subtract multi-digit numbers.

The St. Petersburg mechanic Odner added to the adding machine a gear with a variable number of teeth. This design ensured fast and accurate execution of arithmetic operations and gave impetus to the widespread adoption of adding machines, which remained in production for another hundred years.

Only in the 1930s was the refined adding-machine model "Felix" designed.

In the 19th century, the English mathematician Charles Babbage developed the main principles that were to form the basis of computer design:

  • arithmetic device;
  • memory;
  • input/output device;
  • control device.

But he was unable to realize his plans, since mechanical engineering at that time was at too low a level. However, Ada Lovelace, who worked with Babbage, created programs for this machine. She is rightfully considered the world's first programmer.

Modern history of computers

In the 1940s, a programmable calculating machine based on electromechanical relays was created. But it did not become widespread, because the development of computers based on vacuum tubes had begun.

Officially, 1946 is considered the year of creation of the first computer, when American scientists completed the first machine, ENIAC. The principles of computer construction were formulated by John von Neumann, who is considered the creator of the first electronic computer.

But some researchers attribute the creation of the first computer to the German inventor Konrad Zuse, who built the programmable binary computer Z1. Work on the machine was completed in 1938.