Technological progress comes from pushing hard at the limits of what is currently possible, not from merely following trends others have set. In computing, a good illustration of this principle is the life and work of the 19th-century computer pioneer Charles Babbage (1791-1871), who spent most of his adult life trying to build a digital computer. Babbage first conceived such a machine in 1834. He called it the Analytical Engine.
Designed to contain tens of thousands of cogwheels, the Analytical Engine was never built, mainly because the precision engineering industry of Babbage's day couldn't furnish sufficient numbers of exactly-machined cogwheels at a reasonable cost and within practical time constraints.
All the same, by thinking far beyond what was feasible at the time, Babbage designed a machine that was to all intents and purposes a computer. The Analytical Engine had a memory, a processor and even a punched-card system for data storage and programming. Today's computers work unimaginably faster than anything Babbage could have foreseen, but ultimately they are all, in essence, Analytical Engines.
The future
How will computers work in the future - in around 2020, for example - and where is computing going?
According to the famous law formulated in 1965 by Gordon Moore, the co-founder of Intel, the number of transistors that can be packed onto a single chip doubles roughly every eighteen months to two years.
Experience gained over the past 20 years of building computers suggests that Moore's Law holds good. Extrapolating it forward to 2020, for example (a year that, like all once-remote future dates, will arrive with astonishing haste), suggests that by then transistors will have shrunk to the atomic scale. This is another way of saying that microprocessors will have become about as small and compact as they are ever likely to get.
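To see how quickly that doubling compounds, here is a minimal sketch in Python; the starting year, transistor count and doubling period are illustrative assumptions rather than figures from this article:

```python
# Minimal sketch of how Moore's Law compounds over time.
# The starting year, transistor count and doubling period below are
# illustrative assumptions, not figures taken from this article.

start_year = 2000
start_transistors = 42_000_000    # roughly a desktop CPU of around 2000
doubling_period_years = 1.5       # the popular "every eighteen months" figure

for year in range(start_year, 2021, 5):
    doublings = (year - start_year) / doubling_period_years
    transistors = start_transistors * 2 ** doublings
    print(f"{year}: ~{transistors:,.0f} transistors per chip")
```

Whatever starting figures are chosen, the exponential shape of the curve is the same, which is why the extrapolation reaches physical limits so quickly.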
On the face of it, when microprocessors reach the atomic scale, that will be the end of the evolution of computers. The machine that Charles Babbage first imagined in 1834 will have reached a dead end.
Or will it? Babbage's dream of an Analytical Engine only became a reality long after its inventor's death, when the new technology of electronics provided a way to build a machine that did everything Babbage envisaged, and more.
Similarly, a growing circle of computer scientists has come to believe that another new technology, quantum computing, may provide a way to build a completely new generation of computers once conventional electronic computing reaches the point of diminishing returns.
Tiny bits
Quantum computing exploits the curious effects described by quantum mechanics, the branch of physics that studies the behaviour of energy and matter at the atomic level. These effects are in fact present in our everyday lives, but they are not observable (or at least not obvious) at macroscopic scales.
However, when we are dealing with processes that take place at atomic and subatomic levels, the consequences of quantum mechanics become very important indeed, and they can open up opportunities that matter even in the everyday, macroscopic world.
Quantum mechanics is a complex science, but at its heart is a fairly straightforward principle: at the atomic level, energy is carried in discrete little packets (or 'quanta') rather than in continuous streams. Light, for example, travels in this way; the quanta of light are known as photons.
But light doesn't only travel in quanta: it also travels as a wave, and this gives rise to a phenomenon known as wave-particle duality. There is no need here to go into the complex physics involved; what matters is what it implies for quantum computing.
An important implication is that, according to quantum mechanics, under certain conditions at the subatomic scale, photons, electrons and other quantum-scale entities do not behave only as particles or only as waves but can exhibit properties of both. This is another way of saying that, in a sense, they can do two things at once.
In a classical computer, a 'bit' is the fundamental unit of information: a 1 or a 0 stored in, or processed by, a physical system such as the magnetisation of a spot on a hard disk or the charge on a capacitor. In a quantum computer, however, the fundamental unit of information is not confined to just one of two states.
This fundamental unit of information is known as a 'quantum bit' or 'qubit'. Unlike a classical bit, it is not restricted to the values 0 and 1, and this richer behaviour arises directly from the principles of quantum mechanics.
The qubit can exist not only in a state corresponding to the logical 0 or 1 of a classical bit, but also in a blend, or 'superposition', of these classical states. In other words, a qubit can exist as a zero, a one, or as both 0 and 1 at once, with a numerical coefficient (an amplitude) attached to each state; the square of each amplitude's magnitude gives the probability of finding that state when the qubit is measured.
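As a rough illustration of this idea, here is a minimal sketch in Python (using plain numbers rather than any particular quantum-computing library) of a single qubit held in an equal superposition of 0 and 1, and the measurement probabilities that follow from its amplitudes:

```python
import math

# A single qubit can be written as amplitude_0 * |0> + amplitude_1 * |1>,
# where the amplitudes are (in general complex) numbers whose squared
# magnitudes must add up to 1.

# An equal superposition of 0 and 1: both amplitudes are 1/sqrt(2).
amplitude_0 = 1 / math.sqrt(2)
amplitude_1 = 1 / math.sqrt(2)

# The probability of each measurement outcome is the squared magnitude
# of the corresponding amplitude.
prob_0 = abs(amplitude_0) ** 2
prob_1 = abs(amplitude_1) ** 2

print(f"P(measure 0) = {prob_0:.2f}")                  # 0.50
print(f"P(measure 1) = {prob_1:.2f}")                  # 0.50
print(f"Total probability = {prob_0 + prob_1:.2f}")    # always 1.00
```

Real quantum hardware manipulates these amplitudes with quantum gates before a measurement is made; the sketch above only shows how amplitudes relate to probabilities.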
Important probabilities
If this sounds complicated, it is; but then current computer science would have sounded complicated even to 19th-century men of genius such as Charles Babbage or Michael Faraday. Quantum computing is still in its infancy: we are only beginning to understand how to harness qubits so that they do useful work inside a practical computer system.
The fact that qubits don't deal in certainties but in probabilities doesn't mean that they can't be harnessed in important ways. To take just one example: computers that play chess (arguably one of the most successful current applications of artificial intelligence) rely heavily on analysing the probability that an opponent will make a particular response.
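As a toy illustration of that kind of probability-weighted reasoning (a simplified sketch, not how any real chess engine is implemented), a program might score each candidate move by the expected value of the positions its opponent's likely replies lead to:

```python
# Toy sketch: choose the move with the best expected outcome, weighting
# each plausible opponent reply by an estimated probability.
# The moves, probabilities and scores below are invented purely for
# illustration and do not come from any real chess engine.

candidate_moves = {
    "e4": [(0.6, +0.3), (0.4, +0.1)],   # (probability of reply, score if it happens)
    "d4": [(0.5, +0.4), (0.5, -0.2)],
    "c4": [(0.7, +0.2), (0.3, +0.2)],
}

def expected_score(replies):
    """Expected value of a move over the opponent's weighted replies."""
    return sum(prob * score for prob, score in replies)

best_move = max(candidate_moves, key=lambda move: expected_score(candidate_moves[move]))
print(best_move, round(expected_score(candidate_moves[best_move]), 2))   # e4 0.22
```

The point is simply that useful decisions can be computed from probabilities rather than certainties.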
In fact, those who consider quantum computing to be the future of IT argue that the qubit liberates computing from the simple on/off engineering of conventional machines. Quantum computing may make entirely new types of application possible, as well as replicating conventional ones and allowing us to escape the limits implied by Moore's Law.
Just as the application of electronics to IT enabled the development of computers with levels of performance far beyond anything Charles Babbage could have dreamed of, quantum computing may yield advances in computing we can, at present, hardly imagine.
Quantum computing may even be the route to really powerful approximations of the higher functions of the human mind, which conventional computers are notoriously bad at replicating. There is already good reason to believe that quantum techniques will make possible levels of computer security well beyond anything currently achievable.
Overall, there are plenty of reasons to believe that quantum computing may be the future of computing. And, as anyone who has worked in computing over the past 20 years or so will testify, the future of computing has a habit of coming upon us with alarming speed.