'The talk is going to be about the cognitive computing era,' says Dr Guruduth Banavar, addressing the topic of his 2017 BCS/IET Turing Lecture. Over the past few years, he explains, we've witnessed the establishment of a new era in computing - the age of machine learning. And, as we move into this new age, the resulting technical, professional and societal changes will be profound.
Rounding off his summary, Dr Banavar asserts: 'It means having a very different relationship with machines. We'll need to start getting used to having machines with us, to having natural conversations with them, and get used to the idea that they'll be doing a lot of tasks in every part of our lives.'
Dawn of a third age
If you're a student of such things, you'll know that the Tabulating Systems Era began in the early 1900s and ran until the 1950s. The Programmable Systems Era - the if-then epoch - began in the 1950s and has served us well. It's the foundation of much of the digital world that surrounds us.
'That's all changing now,' Dr Banavar asserts, 'because we have such huge amounts of data. And we have a number of technologies that can learn from that data.' This means, he says, that professionals - be they lawyers, doctors, teachers or engineers - will soon be working in collaboration with computers.
Pointing to medicine as an example, Dr Banavar explains that IBM's machine learning platform, Watson, is already supporting clinicians in making diagnoses (see ITNOW September 2016 issue, pages 46-47). Watson does so by ingesting huge swathes of research papers, state-of-the-art knowledge and patient notes. It then presents probable diagnoses to the medical team. The aim, Dr Banavar says, is to augment professionals' knowledge and capabilities through machine and human teamwork.
A career in technology
Born in India, Dr Banavar spent the first half of his life there before moving to the United States. 'I did my graduate school in the US and, after my PhD, I joined IBM at the TJ Watson research centre,' he recalls. 'Since then, I've held a number of very interesting roles at IBM.'
At IBM he started out as a computer science researcher, looking at programming languages - what he describes as traditional computing disciplines. 'I also have a background in language technology. This includes natural language processing and logic programming. That's all relevant to what I'm doing today,' he explains.
The motivation to speak
So, why did Dr Banavar take on the challenge of speaking at the 2017 Turing Lecture? 'Turing is one of my heroes', he enthuses. 'His vision of what computers can do... things like the Turing Test. It measures the limits of what computers can do. These things have always been a guiding light and are very relevant to my work.'
'Certainly the Turing Test was formulated in an earlier time', he continues, 'and our understanding of what machines can do has changed. But, [at IBM] we're thinking a lot about a modern version of the Turing Test. How should we test a machine's capacities and its intelligence?'
Meeting a thinking machine
IBM's work, and a big part of Dr Banavar's career, has been focussed on Watson - a machine learning and natural language processing platform named after IBM's founder, Thomas John Watson Senior.
Watson gained prominence in the popular consciousness when it won the US game show Jeopardy!. The 2011 victory was an important proof of concept and, since then, Watson has developed many new skills.
'It can now understand images, the tone of somebody's language and even the personality type of the person who may have written a passage. Watson also now has many capabilities for speech-to-text and text-to-speech,' Dr Banavar explains. 'It's also developed several different decision-making capabilities such as trade-offs and the ability to evaluate different options.' These capabilities are provided to software developers through a set of APIs.
What can cognitive do?
Cognitive computing platforms like Watson, Dr Banavar is keen to point out, aren't intended to replace workers. Rather, the cognitive computing revolution is all about computers and humans working together.
'These systems aren't autonomous,' he says. 'They have a partnership with a professional who, ultimately, makes the decisions. Think about a doctor who, with one of these machines, can practice their daily medicine. Think about lawyers, teachers and engineers - every profession will have these kinds of systems that will help with expertise. Experts will become better than they were in the past.'
How does Watson work?
To achieve all of this, Watson is constantly ingesting huge amounts of knowledge. When the system is asked a question, it finds answers that are likely to be correct by exploring this ever-growing corpus of information.
'When a prescribed time limit has passed', Dr Banavar explains, 'the system stops all the algorithms that are looking for answers and presents all of the likely solutions in a probabilistic fashion. We then use a number of different scoring techniques, which look at the accuracy of the inference and the credibility of sources. At the end of that process is a set of answers with confidence levels.'
Depending upon the way Watson is being used, the machine can pick the answer about which it is most confident or, when it's working collaboratively, a person can look at the top answers and pick the one that is strongest.
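The scoring-and-confidence step described above can be illustrated with a minimal sketch. Everything here is hypothetical (the scorer functions, the weights and the candidate fields are invented for illustration, not IBM's implementation): several independent scorers rate each candidate answer, the weighted scores are combined, and the totals are normalised into confidence levels so the top answers can be ranked.

```python
def inference_accuracy(candidate):
    # Hypothetical scorer: a pre-computed [0, 1] rating of how well
    # the supporting chain of inference held up.
    return candidate["inference_score"]

def source_credibility(candidate):
    # Hypothetical scorer: a [0, 1] rating of source trustworthiness.
    return candidate["source_score"]

def rank_candidates(candidates, scorers, weights):
    """Combine weighted scorer outputs, then normalise so the
    combined scores behave like confidence levels summing to 1."""
    combined = []
    for cand in candidates:
        score = sum(w * s(cand) for s, w in zip(scorers, weights))
        combined.append((cand["answer"], score))
    total = sum(score for _, score in combined) or 1.0
    ranked = [(answer, score / total) for answer, score in combined]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

# Invented example candidates, as a stand-in for the output of the
# answer-finding algorithms that run until the time limit expires.
candidates = [
    {"answer": "diagnosis A", "inference_score": 0.9, "source_score": 0.8},
    {"answer": "diagnosis B", "inference_score": 0.6, "source_score": 0.9},
    {"answer": "diagnosis C", "inference_score": 0.3, "source_score": 0.4},
]
ranked = rank_candidates(
    candidates, [inference_accuracy, source_credibility], [0.7, 0.3]
)
top_answer, top_confidence = ranked[0]
```

In the autonomous mode Dr Banavar mentions, the system would act on `top_answer` directly; in the collaborative mode, the full `ranked` list with its confidence levels would be presented to the professional, who makes the final call.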
This approach to problem solving is very different from artificial intelligence, Dr Banavar explains. 'AI,' he says, 'tries to replicate everything that a human brain is able to do. Cognitive is about making machines that can augment human intelligence.'
'We believe,' he continues, 'that humans are intrinsically good at certain types of intelligence. And machines are good at other types of intelligence.' Humans, he explains, excel at value judgements while machines are good at ingesting huge amounts of data and reasoning. Cognitive, in summary, offers the best of what humans and machines can offer combined.
A child of the times
Watson and similar cognitive platforms are, Dr Banavar explains, very much born from necessity. 'There is a different kind of Moore's Law happening right now, a different kind of exponential growth and it's happening in the world of data', he says. 'In the last few years there's been a huge amount of data created, and I'm not talking about transactional data - the stuff that the previous programmable era of computing was concerned with.'
There are, he says, new types of data. We all now regularly create photographic, video and social media data. And that's the start. Think about the internet of things, says Dr Banavar. 'All the sensors and devices that are embedded in large real world systems like factories, transportation systems and energy grids. All of that information is available to us in digital form and that information is increasing exponentially.'
Beyond a need to work with data differently, cognitive is also being powered by another great shift in computing: computers themselves have changed. 'We now have more scalable platforms that come from cloud computing, we have high performance computing, more memory and GPUs,' he says.
Cognitive's third enabler, Dr Banavar explains, is how algorithms have developed and matured. 'We had neural network algorithms thirty or forty years ago,' he says. 'In recent years though we've come up with new hardware and data structures, and they have given rise to a whole series of new methods for using data and modelling data sets.'
Championing the Turing legacy
'The lectures cement Turing's contribution to computer science and also, more importantly, they inspire the next generation of computer scientists', says Professor Jim Norton - the current Turing Champion.
This year's lectures will visit four cities - London, Cardiff, Dublin and Belfast. 'We've gone to Northern Ireland for the last two years and we fill Belfast City Hall with more than 450 people,' he enthuses. 'The tech companies around the city - firms like Intel - bus young people to the event from across the province.'
The best questions the speaker receives, he says, are from young people, and that is just as Professor Norton likes it. 'That's where we want the lectures to be. And that's really having an effect on how people perceive computing and a career in computing.'