The next generation of operating systems and software applications will scale new heights of capability, sophistication and complexity. Today, we call these breakthroughs expert systems, machine learning agents, or even artificial intelligence.
Though their names may change over the coming decades, one question remains: many voices, within and outside the industry, are asking whether we are approaching a singularity - a time when our computers become cleverer than we are.
In recent years, our understanding of machine intelligence has grown exponentially. There has been a parallel explosion in the science of our own human intelligence. Will there be a race between silicon circuits and biological neurons?
Cognitive neuroscientists and biogeneticists have begun to recognise that the challenges of designing software systems - and the feel for complexity that coding brings - could shed light on their own research.
Chemists, endocrinologists, and immunologists have joined in this convergence of thinking, as the leaders in all these fields have begun to realise that learning, thinking, intelligence and consciousness are not just functions of the brain and mind in isolation, but an integral part of the entire nervous system.
Even the gut has recently emerged as a surprisingly powerful information processor. Information is the raw material of computers and brains. Intelligence concerns how both these systems process that information. So, what can programmers and cognitive neuroscientists learn from each other about information and intelligence?
The origins of human intelligence are buried deep in our evolutionary past. We are born with a nervous system comprising billions of neurons. Most are in the brain; the rest link our muscles, organs and sensory receptors. Streams of electrochemical signals control and coordinate many of the background functions, like breathing and digestion, that keep us alive.
These extended ‘autonomic’ functions bear a strong resemblance to the ‘operating systems’ we have designed for our computers. Intelligence emerged as the mind’s way of processing the information from the sensory organs that monitor the world.
The more intelligent an organism, the greater the complexity with which it can respond to internal and external cues to beat the competition. It can be argued that intelligence was, and still is, the central product and driver of evolution.
A key component of intelligence is memory. We are just beginning to understand how the brain grows its massive neural memory banks. When we are born we can do almost nothing, but can learn to do almost anything. A baby can hear and make sounds but must learn every single word.
The first task is to recognise and identify patterns of light, to recognise the images of parents, food, partners and threats; then patterns of touch, taste and smell; then eventually to understand patterns of sounds to hear, speak and to communicate with words.
The mature brain of an adult has grown new links and structures which hold the neural patterns to decode, write, say and think many thousands of words, perhaps in more than one language, plus all our manifold other skills. These bear strong parallels to computer applications.
Two types of intelligence
By trial, error and endless repetition, we build two types of intelligence: intelligence as process - a system that handles information more efficiently than the competition - and intelligence as product - information that is simply better than the competition’s. ‘Say from whence you owe this strange intelligence?’ Macbeth asked the witches; the second sense survives in names like the National Criminal Intelligence Service.
Computer processing is far simpler than human intelligence. Computers can process the information that we provide to them, but they cannot grow new hardware or deal with new concepts - at least for now; we have to add both manually.
However, computers can process some kinds of information far more quickly and more efficiently than our brains, and they are only growing faster. The information that computers generate can be shared between them to form a computational network, but that information can also add to human knowledge: the two networks are linked.
Computing has come full circle. Colossus was designed and built in the 1940s specifically to identify patterns in encrypted German signal traffic. Every baby’s first tasks are to recognise patterns of light and sound, to identify images and words. There is nothing new about pattern identification as the foundation of intelligence.
Computers are already better than brains
In many ways, computers already have powers far greater than those of the brain. Information is everywhere: from meteorology to medicine, economics to ecommerce, we are surrounded by data and it is becoming more abundant by the second.
The genius of the latest breakthroughs in artificial intelligence is that computers, unlike humans, can harness a variety of mathematical tools to identify patterns in this data and turn it into information.
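At its simplest, ‘identifying a pattern in data’ can be sketched in a few lines. The example below is a toy illustration only - an ordinary least-squares line fit, computed by hand on invented readings - not a description of any particular AI system:

```python
# Toy illustration of finding a pattern in data: fit a straight
# line to noisy readings by ordinary least squares.

def fit_line(xs, ys):
    """Return (slope, intercept) of the best-fit line through the points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of x and y, and variance of x (both un-normalised)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Invented noisy readings that secretly follow y = 2x + 1
xs = [0, 1, 2, 3, 4, 5]
ys = [1.1, 2.9, 5.2, 6.8, 9.1, 10.9]
slope, intercept = fit_line(xs, ys)
print(round(slope, 2), round(intercept, 2))  # close to 2 and 1
```

The fit recovers the hidden rule from the noise - the raw numbers become information. Modern machine learning differs in scale and sophistication, not in kind.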
Much of human invention, imagination, prediction and thinking comes down to recognising that existing information, viewed from a different perspective or combined in innovative ways, can solve previously impenetrable problems.
Add AI as a source of new information and we could experience a massive paradigm shift, a bona fide singularity.
However, this is where another story starts…
Information is only useful to us if we understand its meaning. A list of statistics or facts, by itself, often tells us little. Seeing the grass waving on the savannah was of no value to our ancestors unless they could determine very quickly whether it indicated a predator, prey or a potential mate.
This observation is the basis of major breakthroughs in convergent technologies in recent months. We have known for some time about hormones (like adrenaline and testosterone), neurotransmitters (such as dopamine) and other long- and short-range messenger molecules.
Together, chemical and electrical signals generate an almost limitless variety and combination of sensations, impressions and emotions that give meaning to music, drama, art, and, particularly, to words, numbers, shapes, relationships, formulae and algorithms.
The language of words has provided us with the essential skills of communication, but the electrical symphony of neural firing and the cocktail of hormones of the endocrine system do so much more. Are these systems simple support networks, the architecture for the programs of cognition - or do they interact in a deeper, richer way, entwined with thought and emotion down to the molecular scale?
Consider a speech or poem. List the words in alphabetical sequence and they mean little. List them as the poem or speech does, and they can kindle love, rouse war, and incite titanic emotion. The whole is far greater than the sum of the individual words.
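The point can be demonstrated in a couple of lines. The sentence below - a fragment of a famous wartime speech - is chosen only as an example; any line would do:

```python
# The same words, alphabetised: no word is lost, yet the
# information carried by word order - the meaning - is destroyed.
line = "we shall fight on the beaches"
words = line.split()
print(sorted(words))    # ['beaches', 'fight', 'on', 'shall', 'the', 'we']
print(" ".join(words))  # the original line, meaning intact
```

Both outputs contain exactly the same six words; only their order differs. The meaning lives in the arrangement, not in the parts.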
It is this ability to think in sensations, impressions and emotions that enables us to be creative, innovate, imagine, predict possible futures and solve problems: to think something new.
Doubtless, over the coming years we will learn how to design programs that can simulate anger, hunger and probably many more nuanced sensations. But without a self-organising, self-developing system needing no external power sources or tech support, computers will only ever be able to imagine what we can imagine our computers will be able to imagine. In essence, they are limited by what we can design into their programs.
Intelligence for both humans and computers is partly about the speed and efficiency with which the system processes information. Intelligence in computers is the ability to recognise patterns in ever larger datasets and so identify new information. Human intelligence is about inventing innovative ways of processing known and new information to create new knowledge.
Computers are, arguably, the most powerful tool we have ever invented. Working in tandem with them, each concentrating on what it does best, we will produce ever more exciting revelations and hope to change the course of our civilisation. But whether the singularity is impending or distant - attainable or unreachable - is as yet unclear.
Charles Ross' obituary is printed in ITNOW, Volume 61, Issue 1, Spring 2019, Page 28.