All computers, beginning with Babbage’s unconstructed “analytical machine”
and ending with the Cray, are based on the very same principles. From
the logical point of view a computer consists of bits (variables, taking the
values 0 and 1), and a program, that is, a sequence of operations, each
using some bits. Of course, the newest computers work faster than old ones,
but progress in this direction is limited. It is hard to imagine that the size
of a transistor or another element will ever be smaller than $10^{-8}$ cm (the
diameter of the hydrogen atom) or that the clock frequency will be greater
than $10^{15}$ Hz (the frequency of atomic transitions), so that even the
supercomputers of the future will not be able to solve computational problems
having exponential complexity. Let us consider, for example, the problem
of factoring an integer number x into primes. The obvious method is to
attempt to divide x by all numbers from 2 to $\sqrt{x}$. If x has n digits (as
written in the binary form), we need to go through approximately $2^{n/2}$
trials. There exists an ingenious algorithm that solves the same problem in
approximately $\exp\bigl(c\,n^{1/3}(\log n)^{2/3}\bigr)$ steps (c is a constant). But even so,
to factor a number of a million digits, a time equal to the age of the Universe
would not suffice. (There may exist more effective algorithms, but it seems
impossible to dispense with the exponential.)
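The “obvious method” above can be sketched in a few lines. The following function (a minimal illustration, not part of the original text) factors x by trial division, trying every candidate divisor up to $\sqrt{x}$; for an n-bit x this is on the order of $2^{n/2}$ division attempts in the worst case, which is exactly the exponential cost discussed above.

```python
from math import isqrt

def factor(x: int) -> list[int]:
    """Factor x by trial division: try each divisor from 2 up to sqrt(x)."""
    factors = []
    d = 2
    while d <= isqrt(x):        # at most ~sqrt(x) = 2^(n/2) candidates for an n-bit x
        while x % d == 0:       # strip out every copy of the divisor d
            factors.append(d)
            x //= d
        d += 1
    if x > 1:                   # any remainder greater than 1 is itself prime
        factors.append(x)
    return factors

print(factor(360))              # prints [2, 2, 2, 3, 3, 5]
```

This is fast for small numbers, but the worst case (x prime, or a product of two primes of similar size) forces the loop to run all the way to $\sqrt{x}$, which is why the method becomes hopeless already for numbers of a few hundred digits.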
There is, however, another way of speeding up the calculation process
for several special classes of problems. The situation is such that ordinary
computers do not employ all the possibilities that are offered to us by nature.
Babbage began his work on the “analytical machine” project in 1833. In contrast
to calculating devices that already existed at the time, his was supposed to be a universal computer.
Babbage devoted his whole life to its development, but was not successful in realizing his dream.
(A simpler, nonuniversal machine was partially constructed. In fact, this smaller project could
have been completed: in 1991 the machine was produced in accordance with Babbage’s design.)