It does not require a very sophisticated crystal ball to forecast that over the coming decades our community will progressively divide into those who instruct computers, and those who just carry out their instructions.
We can congratulate ourselves on being in the right profession on the right side of this divide. However, we should be wary of complacency. Computing, systems, software, communications and security are all advancing at an accelerating pace, and finding their way into ever more complex and wide-ranging applications. We all need to work hard to keep up.
It is so much more difficult for those on the other side, whose jobs, often safe and secure for generations, are modified out of recognition, or even swept away. BCS has a proud history of setting standards and promoting best practice. However, perhaps we are not doing such a good job of informing the community about the scale and speed of the changes that we can see so clearly roaring towards us.
We have all got used to automated factories and warehouses; online buying, banking, booking and ordering; control, scheduling and management systems; social media, GPS, and Google. The entire body of human knowledge, art, culture, commerce, sport and entertainment is instantly available online. Many jobs have disappeared, but we have the highest employment rate ever. What is not to like?
However, much closer than the horizon is a rather disturbing side effect. Incomes have stagnated: for forty years in the USA, twenty in Japan, ten or so in the UK. We are familiar with computers taking jobs, but progressively they are taking incomes. Already, many high-value careers in finance, law, accountancy, education and management are becoming obsolete.
Will these massive changes be a welcome opportunity to lead less stressful working lives as our computer partners take the strain, or will these AI systems be seen as predators destroying the fabric of our society?
History repeating itself
This has all happened before. Steam engines created the Industrial Revolution, transforming an agricultural economy, where farm workers had little money and less to spend it on, into the industrial economy where workers earned wages to purchase an ever-widening range of products and services.
The computing revolution is reversing this. It would be truly wonderful if our computers could produce all our goods and services, but where will the purchasing power come from to buy them?
One reason the government has not been able to balance the budget since the 2008 crash is that the revenues from income tax have consistently fallen short of expectations. The city of Utrecht is experimenting with paying every resident a living wage irrespective of whether they work or not. Is this the future?
One thing is clear. Every young person today will probably follow two, three or more quite different careers at various stages of their lives. Rather than making young people pay for a narrow education ‘designed by the priorities of a previous age’ (as Dr Tony Breslin, Chairman of Industry Qualifications, puts it), we should be investing in providing the widest possible range of educational opportunities and exploring how to deliver life-long learning to the whole community.
The Department for Education reports that 45 per cent of teenagers gain little from their school experience, while some 20 per cent leave school ‘functionally illiterate’; yet from these groups come many of the creative, inventive, entrepreneurial and adventurous risk-takers. A significant number make good programmers. All teenagers seem to be able to operate their iPhones and iPads equally well. What are we missing? What to do?
Help from the convergence of biogenetics, cognitive neuroscience and computing
One positive development is that our computers are contributing powerfully to many other disciplines. Our experience of designing software, operating systems and programming languages, both compiled and interpreted, is shedding light on how the human brain works. If we can better understand our brains, perhaps we can learn how to use our computers to improve our own individual abilities.
For many years we congratulated ourselves on inventing ‘software’, only to discover that hardware and software are the foundation of almost all natural systems. Biogeneticists have known for some time that one long string of deoxyribonucleic acid, DNA, is sufficient to create a complete human being.
Only recently have we discovered that some of the base pair components create our cells - the hardware; some the instructions that assemble all the individual types of cell; some the electrochemical signalling system - the operating software and the applications programs; some the energy generating systems. Not all that dissimilar to the architecture of neurons.
Similarly, cognitive neuroscientists are using our experience of designing programming languages to better understand the nature and meaning of information, the formation of memory (and possible causes of memory degradation in dementia), and the different types of intelligence.
What we are learning from the neuroscientists is helping us make great strides in AI systems that accurately identify patterns of information in vast data sets - the first steps towards intelligence. We have come full circle: Colossus was designed to identify patterns in a mass of enemy signals traffic.
Computer operating systems bear an uncanny resemblance to the autonomic nervous system. Compiled programs are like the library of skills we learn and then carry out largely unconsciously, on ‘autopilot’ - walking, swimming, driving a car - general intelligence.
Interpreted programs are comparable to the parts of a task we perform while concentrating on a particular application - driving along the correct road, making the right turns, signalling to others - specific intelligence.
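For readers less familiar with the compiled/interpreted distinction the analogy rests on, a minimal sketch in Python may help. It is purely illustrative: the ‘compiled’ route translates an expression once into bytecode and then runs it repeatedly without re-analysis (the practised skill), while the hypothetical `interpret` function examines each step as it meets it (conscious attention to the task at hand).

```python
# 'Compiled': translate the expression once, then execute the
# resulting bytecode repeatedly - like a skill run on autopilot.
code = compile("x * x + 1", "<expr>", "eval")
compiled_results = [eval(code, {"x": x}) for x in range(5)]

# 'Interpreted': examine and act on each token as it is met -
# like consciously attending to every step of a task.
def interpret(tokens, x):
    """Tiny interpreter for postfix expressions over +, * and x."""
    stack = []
    for tok in tokens:
        if tok == "x":
            stack.append(x)
        elif tok == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif tok == "*":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            stack.append(int(tok))
    return stack[0]

# Same expression, x*x + 1, written in postfix form.
interpreted_results = [interpret(["x", "x", "*", "1", "+"], x) for x in range(5)]
assert compiled_results == interpreted_results  # same answers, different routes
```

Both routes give the same answers; the difference - as in the analogy - is how much deliberate, step-by-step attention the execution requires.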
World Scientific are publishing a book in the New Year that uses the experience of analysing and designing computer software systems to propose the first comprehensive definitions of human intelligence.
Advances in medicine
The fast-advancing world of medicine is now heavily dependent on computer-controlled monitoring and operating equipment, and could do so much more if there were a single database of patient information across the National Health Service.
However, some hospital consultants still have to send their reports by snail mail to local GPs, who then have to scan them into their incompatible systems. The waste of opportunities is only exceeded by the wholesale waste of money.
Response of government
Government seems oblivious. Responding to the recent manifestos of both parties, an article in the New Scientist goes so far as to comment that the ‘political response to technological change is too often characterised by ignorance, incompetence and illiberality’.
We have to accept that government has not delivered a single major national IT system on time, to specification or within budget. One of the worst examples is the £12 billion wasted on the aborted NHS information system. Yet no one in government, or parliament, seems concerned. Our nation’s leaders appear to be in denial.
Yet are we, the computing profession, partly to blame? In 1968, the Home Office bought a computer for the Metropolitan Police without a budget for software. Oh, how we jeered, and drafted questions for friendly MPs to ask in the House.
However, with the benefit of hindsight, I doubt that we explained ‘software’ clearly enough. It was a completely novel concept; articles in the press at the time spelt the word ‘softwear’.
What should we do? Governments always have problems, the present one more than most, and it is never the right time. But time is of the essence, and our profession should press the government to appoint a senior cabinet minister to co-ordinate the actions that need to be put in place across every government department to meet this great challenge.
The future, as always, is in our own hands.
Charles Ross' obituary is printed in ITNOW, Volume 61, Issue 1, Spring 2019, Page 28.