Discoveries from brain-computer convergence

In the first of four articles on the implications of the convergence of computing, biogenetics and cognitive neuroscience, Charles Ross and Max Jamilly look at how computers can help us extend our individual powers of learning and understanding and move us towards prosthetic brains.

The article ‘Convergence of Computing, Biogenetics and Cognitive Neuroscience’ in the autumn edition of ITNOW stimulated considerable reaction. The Cybernetics Machine specialist group (CMSG) was relaunched in January 2016 to begin to explore this convergence. As Peter Marcer says, ‘Artificial and human intelligence and their relation to the genetic code offer enormous opportunities for new research and innovation.’

In conjunction with the Brain Mind Forum of the Real Time Club, the CMSG are launching a number of student groups with the University of Oxford, King’s College London and the New College of the Humanities (at the time of going to press) to help students across a wide spectrum of subjects explore how computing will affect or even dominate their careers.

Computing professionals have begun to open their minds to the exciting and relevant possibilities of brain-mind convergence. In March 2015 at a Royal Institution lecture, sponsored by the Real Time Club, Professor Daniel Dennett, arguably the world’s leading philosopher of cognitive neuroscience, sparked an interdisciplinary debate on the novel potential of our computers: ‘How can we communicate directly with our computers as partners to extend everyone’s personal ability to learn, understand, master complex topics and even learn to be more intelligent?’

The computing profession is proud of the recent developments in artificial intelligence (AI), in particular high-profile successes like beating chess masters and winning the quiz game Jeopardy. Dramatic as these achievements are, we are still only using our most advanced computers as we have used all our inventions down the centuries - as tools. They do not improve our brains or memories.

Famous tools of antiquity like the plough and the wheel did not enhance mankind’s muscles. Neither did the engines that created the industrial revolution. We can travel ever faster in our cars and fly in our aeroplanes, but we have not evolved faster running muscles or sprouted wings. Microscopes and telescopes allow us to see ever more, but have not improved our personal eyesight. Few would argue that we are more intelligent than Aristotle, Plato, Socrates, Pythagoras, Archimedes, Eratosthenes, Hero and many others; or stronger than the first Olympians.

However, for the first time in civilisation we have the prospect of designing our computer systems to directly improve our own personal mental faculties, to help everyone to think, to be creative, and make better decisions. This is a potential paradigm shift.

There has been a growing trend in the media to disparage computers, depicting AI as an evil force that could take over the world - the dreaded ‘singularity’. Yet this revolution in our understanding of how computers and our brains can interact together presents a new and exciting future.

For now, let us carefully compare and contrast computing and cognitive neuroscience. Each can learn a remarkable amount from the other. The brain-mind and the computer are the only two information processing systems known in the world. Cooperation between the two disciplines can help open a new epoch, not just in computing, but potentially in our civilisation.

Our best guess is that the brain-mind evolved to enable all the organs of the body to operate as a single collaborating, coordinated, synchronised whole. It has evolved, therefore, to process information only as a means to this end. The brain-mind is very much an extension of our complex bodies.

By contrast, our computers have been specifically designed to do one job and one job only: to process information. Computers have no other task and can do nothing else. However, they do this task superbly, and we have only just begun to appreciate their potential.

The brain-mind

Over the past century we have begun to learn a great deal about the brain. We are born with tens of billions of neurons: several hundred million connecting up every organ, muscle and gland in our bodies, with concentrations around the major organs like the heart and in the gut; and some 86 billion in the brain itself.

There are many types of neuron, but they all follow the same basic architecture: a cell body containing the nucleus and the energy-generating mitochondria (with ATP as the short-term energy store), and two types of filament: the dendrites, which receive signals from all over the body, including the ears, eyes and other sensory organs; and the axons, which send signals to organs all over the body, such as the muscles and glands. These filaments connect to other neurons across tiny gaps known as synapses, and signals are carried across these gaps by neurotransmitters.
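
For readers coming from computing, the description above can be pictured as a very small data structure. The Python sketch below is purely illustrative, not a claim about biology: the class names, the threshold and the ‘strength’ field are our own inventions, and real neurons are analogue and vastly more complicated.

```python
from dataclasses import dataclass, field

# Purely illustrative: a neuron reduced to the components described above.
# Names (Synapse, Neuron, fire) are our own; real neurons are analogue and
# far more complex than this sketch.

@dataclass
class Synapse:
    target: "Neuron"        # the neuron on the far side of the gap
    strength: float = 0.1   # stands in for neurotransmitter efficacy

@dataclass
class Neuron:
    name: str
    dendrites: list = field(default_factory=list)        # incoming signal values
    axon_terminals: list = field(default_factory=list)   # outgoing Synapses

    def receive(self, signal: float) -> None:
        self.dendrites.append(signal)

    def fire(self) -> None:
        # Sum the inputs; if they exceed a threshold, pass a signal on
        # across each synapse, scaled by its strength.
        if sum(self.dendrites) > 1.0:
            for syn in self.axon_terminals:
                syn.target.receive(syn.strength)
        self.dendrites.clear()
```

Even this toy makes the key point: a neuron is defined as much by its connections as by its internal machinery.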

The strength of these signals varies with the volume and mix of hormones secreted by the glandular system, which is thought to account for much of our emotional behaviour. A family of glial cells supports the neurons, and the whole system is permanently bathed in a mixture of hormones. There is growing speculation that the width of the synaptic gaps varies according to the electrochemical tension across them - individually, in networks and across the whole brain - which may have a direct impact on building memory structures, and possibly even account, in part at least, for aspects of concentration and consciousness.

Over this whole physical system flow streams of patterns of electrochemical signals. These patterns alert the brain to the events in the outside world monitored by the sensory organs.

The brain has evolved to respond to incomplete information as quickly as possible, sending streams of electrochemical signals to the muscles - for survival and reproduction - and to the glands to generate conscious sensations like hunger, fear, aggression and arousal. The electrochemical activity along the neuron filaments generates waves of electromagnetic fields, which are thought to play a key role in memory formation and learning.

Hardware and software

Unconscious body processes are all coordinated by the autonomic nervous system, which is continuously active, even under deep anaesthesia; if all electrochemical activity ceases, the heart stops. At this level the architecture of the brain is remarkably similar to that of our computers: physical structures that can do nothing without a continuous flow of signals.

The physical neurons, glia, neurotransmitters and hormones that we can see and touch are the hardware, or brain. The ephemeral streams of electrochemical signals are the software, or mind. The ancient Egyptians called the mind ‘Ka’. It is an ancient idea that there exists a duality between mind and brain. Our computers are showing us how this operates.

Did we invent maths or discover it? It could be that by inventing it, we discovered something that already existed. The same may be true of software: it was only when we had invented computer software that we saw its parallels with the mind, with biogenetics - and much else. The essence of software is that it enables one set of ‘hardware’ to carry out an infinite number of different tasks. For the same reason, the brain’s capacity is almost limitless. There is a strong argument that we are on the cusp of a major new branch of knowledge: the science of software.

Memory in computers and the brain-mind

Just as our computer hardware can do nothing without a program, or software, so the hardware of the brain can do nothing without streams of electrochemical signals: the software, the mind.

We are used to the concepts of programs held in memory being executed by a processor, which calls all the shots. However, the brain-mind has no processor: the whole brain-mind is its own processor and the complete system is entirely driven by the inputs from the sensory and other organs.
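
One way to make this contrast concrete for programmers is to imagine a system with no central fetch-execute loop at all: nothing runs until an input arrives, and each input simply wakes whatever is wired to respond to it. The Python sketch below is only an analogy; the event names and handlers are invented for illustration.

```python
# An input-driven 'system' with no central processor: behaviour is just
# handlers woken by whatever signals arrive. Event names are illustrative.
handlers = {}

def on(event):
    def register(fn):
        handlers.setdefault(event, []).append(fn)
        return fn
    return register

@on("sound")
def startle(payload):
    print("turn head towards", payload)

@on("hunger")
def seek_food(payload):
    print("seek food, urgency", payload)

def sense(event, payload):
    # Nothing 'runs' between inputs; each signal simply triggers
    # whichever structures are wired to respond to it.
    for fn in handlers.get(event, []):
        fn(payload)

sense("sound", "left")
sense("hunger", 0.7)
```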

At birth, we can do almost nothing, but we can learn to do almost anything. Initially, the autonomic system connects up all the organs, but little more. For instance, young babies can make noises and hear sounds, but no child has ever been born able to say a single word. Curiosity, imitation and endless repetition enable babies to learn the words of their parents and peers.

Neurons activated by hearing the sound ‘mother’, by seeing the image of ‘mother’, and perhaps by her smell, taste and touch, all generate electromagnetic fields. Where the fields of these dendrites and axons overlap, they attract glial cells, which form tentative links.

Repetitive usage strengthens these ‘glia bridges’, and in due course insulates them (myelinates them), building new robust neural structures that forever link the images, sounds, smells, tastes and touch of ‘mother’ and ultimately all our experiences with ‘mother’, including saying the word, decoding the sound we can hear, and the images we can write and read of ‘mother’. We literally grow the neural links and structures. We grow memory.

In July 2013, the Brain-Mind Forum carried out experiments at the Royal Institution which demonstrated that minimal records of every visual experience seen for at least 20 seconds are automatically stored by the brain, even if we are not always aware of them. In fact, we can now show that electrochemical energy flowing along active neurons causes links to form between them. ‘Neurons that fire together wire together’, as Donald Hebb’s principle is often summarised.

The information encapsulated in these new structures is ‘potential information’, or memory. Whenever any of these new structures is stimulated, it generates a pattern of electrochemical signals identical to the pattern that created it in the first place. Flowing through the neural networks, this ‘kinetic information’ stimulates muscular, hormonal and neural activity. We can write programs to emulate learning, but the brain-mind’s biological learning system is unique. A mature brain grows a trillion or more of these new links and structures.
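
As a small illustration of ‘writing programs to emulate learning’, the Python sketch below implements a toy version of Hebb’s rule: connection weights grow in proportion to how often two units are active at the same time. The learning rate, the number of units and the way activity is represented are all assumptions made for the example; it is a computational echo of the principle, not a model of the brain’s biological learning.

```python
import numpy as np

# Toy Hebbian learning: 'synaptic' weights between units grow in
# proportion to how often the units are active at the same time.
rng = np.random.default_rng(0)

n_units = 5
weights = np.zeros((n_units, n_units))   # connection strengths
learning_rate = 0.1                      # assumed value, for illustration only

for _ in range(100):
    # A random pattern of activity in which units 0 and 1 always fire together.
    activity = rng.integers(0, 2, size=n_units).astype(float)
    activity[1] = activity[0]
    # Hebb's rule: the change in weight[i, j] is proportional to
    # activity[i] * activity[j].
    weights += learning_rate * np.outer(activity, activity)
    np.fill_diagonal(weights, 0.0)        # ignore self-connections

# The strongest 'wiring' emerges between the units that habitually fired together.
print(weights[0, 1], weights.mean())
```

After the loop, the largest weight sits between the two units that always fired together, which is the whole content of the slogan.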

No computer can grow additional hardware, yet on the horizon we can begin to envisage how the technologies of three-dimensional printing and synthetic biology could enable us to grow neural ‘brains’, linking them to our nervous systems like prosthetic limbs. That is what this series of articles and the student clubs are all about.

Representation of information

Computer technology has also been of immense use to cognitive neuroscience by clarifying how information is represented in the brain, a subject that has puzzled people since the beginning of recorded history. It comes as some surprise, even to some computer professionals, that there has never been a single word, image or sound in any computer ever built.

Computers can only process patterns of digital signals, often called ‘bits’. This digital standard has its roots in the telegraph codes of the nineteenth century, such as Morse code. Every time our word-processing programs use a word, they create a new record of it in the computer’s memory systems.
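
To make the point concrete: when a document stores the word ‘mother’, what is actually laid down is a fresh copy of a bit pattern, every time. The short Python snippet below simply shows those bits under one common encoding (UTF-8); it illustrates the general idea rather than any particular word processor’s file format.

```python
word = "mother"

# Encode the word as bytes under UTF-8, then show the underlying bit pattern.
encoded = word.encode("utf-8")
bits = " ".join(f"{byte:08b}" for byte in encoded)

print(encoded)  # b'mother'
print(bits)     # 01101101 01101111 01110100 01101000 01100101 01110010

# Each time a document stores the word, another copy of this pattern is written.
document = ["mother", "mother", "mother"]
print(sum(len(w.encode("utf-8")) for w in document), "bytes for three copies")
```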

It comes to many as an even bigger surprise that there are no letters, numbers, pictures or sounds, let alone words in any of our heads either. The brain-mind has no such coding system for words, or anything else, and we do not have the luxury of replicating a new record of every word every time we use it.

Neural signals are analogue; their meaning depends on their strength, frequency and position as much as on their pattern. In addition, the whole signalling system is heavily influenced by the environment.

What we learn to recognise as a word (or an image, sound, taste and so on) starts life as the pattern of conscious sensations we experience on first encounter. Every subsequent experience adds to these networks of patterns. Every word is linked to the patterns of sensations of every word associated with it, and it is through these links that its meaning develops for us.

As we noted above, every word is linked to the neural instruction patterns to ‘say’ that word, to manipulate the finger muscles to ‘write’ it, and to drive the visual system to recognise or ‘read’ it; to how it is spelled, and to its phonemes and syllables, and so on. This is multiplied by the use of every word in multiple phrases.

For instance, if we learn a poem, we do not separate the words we need into a separate ‘record’, but grow a neural circuit linking all the word structures in the correct sequence. In effect the poem is the neural network that accesses the string of words that enables us to recite the poem, write it, or just think about it.
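
The nearest computing analogue is therefore not a separate stored record of the poem, but a chain of links between word structures that already exist, traversed in order. The Python sketch below models that idea with an ordinary dictionary mapping each word to the next; it is a deliberately crude analogy, not a claim about how neurons implement sequence recall.

```python
# A toy analogy: recall a line of verse by following links between
# existing 'word structures' rather than reading back a stored record.
line = "shall i compare thee to a summers day".split()

# Build the 'circuit': each word points to the word that follows it.
# (A dictionary only works here because no word repeats; the brain has
# no such restriction.)
next_word = {a: b for a, b in zip(line, line[1:])}

# Recitation is then just traversal of the links.
word = line[0]
recited = [word]
while word in next_word:
    word = next_word[word]
    recited.append(word)

print(" ".join(recited))  # shall i compare thee to a summers day
```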

Summary

Perhaps the greatest breakthrough in cognitive neuroscience that computing has helped to engineer is the definition and differentiation of the brain and the mind as hardware and software, and the implications that follow from it. The second breakthrough is an improved, although still modest, understanding of what information is and how it is represented in the brain-mind.

Thirdly, we have come to appreciate the sheer complexity of inputs affecting the function of the brain and mind. Cognitive neuroscience traditionally concentrates on the brain-mind as an information processor, and has increasingly come to think of the brain in isolation, like a computer processor. However, the human brain and mind are interwoven with several other complex organ systems and they continuously interact. 

There is the immune system with its own reciprocally linked but independent information and memory systems. While the neural networks transmit signals, the cardiovascular system operates the transport system delivering vital supplies like nutrients, oxygen and hormones, which directly affect neural activity.

Then there is the autonomic system, the hormonal system, the fuel and energy generating and storage system, not to forget the reproductive and reconstruction systems. The more we understand the brain-mind, the more we can develop ways of using our computers to help us extend our individual ability. In the next three articles we will explore the systems already being developed to help us learn and understand, think, solve problems and make better decisions, and the potential of what is possible.

Then we will explore the potential towards developing artificial prosthetic brains: the convergence of computers, engineering and biology. In the last article we will explore the future of human-machine symbiosis.

Brain-computer convergence four-part series:

  1. A key to the human brain: discoveries from brain-computer convergence
  2. Better together: how computers can be designed to augment human ability
  3. Towards artificial prosthetic brains: the convergence of computers, engineering and biology
  4. The future of human-machine symbiosis