Every year, Turing regulars and newcomers alike meet to listen to inspiring speakers at the peak of their subject. These acknowledged authorities on topics related to selected aspects of Turing's work make a presentation at the annual IET-BCS Turing Lecture.
Over the years these talks have showcased Turing's work and its increasing impact on cognitive neuroscience; told how one of our speakers persuaded a reluctant IBM to introduce lower-case printing, leading to the eight-bit byte; and recounted how Bill Gates was convinced that Windows needed communication protocols.
The originator of object-oriented design has presented, and others have shown how computing and information engineering are making the world a better place and, to amend a well-known phrase, that ‘the future’s bright; the future’s machine intelligence’.
In the beginning there were the Turing Memorial Lectures, proposed by the Turing Trust under the chairmanship of Prof Donald Michie, to be delivered by acknowledged authorities on topics related to selected aspects of Turing's work. This became possible after the establishment of the Turing Institute in Glasgow. In association with the University of Strathclyde, the Institute hosted seven public lectures in the period 1985-93.
Some years after 1993 the Turing Institute agreed, in discussion with Prof Keith van Rijsbergen, that these lectures could be continued under a new title: The Turing Lectures. Each lecture was published as a paper in The Computer Journal. In 2001 the lecture became a joint BCS and IEE (later IET) affair, with broadened scope. In recent years it has been given to capacity audiences in England, Scotland and Wales, as well as, since 2003, being streamed to iet.tv audiences.
Edited versions of all eight surviving votes of thanks give an insight into Turing’s lasting legacy.
Setting aside Professor Brooks’ achievements, such as being the first and quite likely only computer scientist to travel to the South Pole, several notable threads emerge. First, Fred likes tackling practical problems. For example, correctly anticipating that the removal men would say Nancy’s beloved grand piano couldn’t possibly go up the stairs of their new home, he proved them wrong by building a full-size cardboard model.
He also persuaded a very reluctant IBM to introduce lower-case printing. So what, you may ask? Well, forty years ago this led to the eight-bit byte, and thus enabled text processing. And his practical judgement is further underlined by his IBM System/360 microcoding principles, which endure to this day.
A further thread is Fred’s conviction that computer scientists are here to help people become more effective: modern-day master tool-makers, so to speak. One such example is his virtual reality tool set. He helped design his local church that way, to great acclaim, and his elegant Chapel Hill computer science building too, convincing a reluctant architect that the lobby design wasn’t people-friendly. And he helped his research chemist friends to design increasingly complex molecules ten times faster.
But Fred’s much more than a brilliant engineer and computer scientist. As we learnt, he’s a captivating teacher, using down-to-earth analogies to explain the most complex of issues, typically limiting his notes to a single index card. And like his great Harvard PhD supervisor Howard Aiken, he mentors and motivates his research students. He’s great on committees, usually intervening only when things go pear-shaped - and then typically perfectly summarising what to do in a single sentence. Best practice collaboration, in fact.
By way of another example, there is a nice story from the System/360 days. Fred was working with John Fairclough, who led Hursley’s side of the project. In 1961, transatlantic calls cost a fortune. Fred was seen marching round Poughkeepsie complaining about an itemised Hursley phone bill running to more than eight hundred densely printed pages. Not about the awesome cost, but that there didn’t seem to be enough communication!
Fred, your lecture has informed and delighted us, adding lustre to the memory of the man who made computing possible. David Morriss would like to present you with a memento of this evening. As he does this, I propose - in the traditional words of this house - ‘that the best thanks of the BCS and IEE be accorded to Professor Fred Brooks, for delivering the seventh Turing Lecture.’
Clearly, Chris is always up for a challenge. As an undergraduate, and already virtually blind, he cycled everywhere, and has the scars to show for regular encounters with roadworks. He’s fearless, going white-water canoeing in the Ardèche. As his water-skiing record attests, he’s fiercely competitive. And having recently, reluctantly, given up international water-skiing, he has taken to snow instead.
He’s determinedly persuasive, able to convince Bill Gates that Windows needed communication protocols - and of course that they should be data connections. The first time Bill came to Enfield, he grilled Chris for ten hours. At ten o’clock, Chris politely excused himself, explaining he had to be up at six next day - to solo water-ski the Channel for charity.
Now, a few things to take away from this lecture. Let’s help to get those websites improved. How about registering as an ITCH (IT Can Help) volunteer? Or learn more about AbilityNet, the charity BCS co-founded, which helps 300,000 disabled people each year, to see whether you or your employer could help in cash or kind. And something for our two societies. Chris is concerned that though there’s plenty going on, it needs joining up. How about setting up a forum to help that, for example by getting inclusion principles into codes of practice?
There’s so much more to Grady Booch than influential publications, innumerable Google references and his thought-provoking blog. A blog which he updated only hours after coming round from last May’s massive heart re-plumbing operation, by dictating it to his Mayo Clinic intensive care nurse.
The OED defines a polymath as a great scholar and a person of much and varied learning. And a mentor as a wise counsellor and a faithful friend. Grady is all these things. He’s already created several potent software silver bullets, starting with object-oriented design. His patterns and services assembly work promise more.
That forthcoming Handbook on Software Architecture seems certain to harvest important lessons from legacy archaeology. And no one here tonight can be in any doubt about why Grady’s the world’s most sought-after software system designer.
Grady not only encourages people with ideas for making beautiful software easier to write, he then does everything possible to help them realise them. He has inexhaustible stamina. The day after crossing the pond to give this lecture, he completed three two-hour engagements before Boulder breakfast time. And today he held the first of four architecture handbook case studies meetings at 7:30, then met the press this afternoon.
Yet there’s one thing I don’t quite understand. And that relates to the h-index, the widely used measure of academic standing. Against that, Grady scores just three. Considering his massive impact, how come this modest score? There’s something wrong. And it’s not Grady.
By way of further example, read his October blog, laying out his stall on architecture and deliciously distinguishing between the snake-oil and service-oriented varieties of SOA. He had the Turing champion rise like a trout to a fly with his sneaky allusion to teaching Kate Middleton the subtleties of Ajax and the inference that she’s probably a bit weak on abstraction skills. And, for a while, he had the Turing staff worried, too. They’d asked about his audio-visual needs.
He replied that the stage needed to be a strong one, so as to accommodate the dancing elephant. And although he wasn’t bringing his acrobatic troupe with him this time, the curtains round the stage needed to be fire-proof, as he’d be setting off bangers to emphasize really important concepts.
Oh, and paramedics would be needed for the faint-of-heart swooning at the rather explicit pictures he threatened to use to illustrate the beauty of software. Well, Grady, if any of us swooned tonight it would have been because of your infectious enthusiasm and exciting ideas.
Google famous people from Ashby de la Zouch and up would come Robin Hood and Wilfred of Ivanhoe: one a legendary romantic; the other a Saxon who lost out to the Normans. Guess whom they missed! After Oxford and national service, James Martin became an IBM systems engineer, installing 305 RAMAC machines.
These weighed 30 tonnes (he recalls the one in Lloyds Bank Birmingham needing pit-props to stop the floor caving in), had an amazingly slow two mega-character drum memory, were programmed in machine code and - as James found out - had an internal shelf which could store things like the customer engineer’s whisky - and that in an era when alcohol on IBM premises was a dismissible offence! He went on to program Panamac and Boadicea, the world’s first international airline reservation systems.
In 1966, he joined the IBM Systems Research Institute. Rather like Cambridge’s Microsoft Research group does today, SRI staff lectured, researched and published at will - as James said, just like university faculty, except for pay and conditions! About ten years and several real-time bestsellers later, he was skiing with his friend John Collins, then a Lancaster lecturer.
Over a beer or three, they decided to take a year’s leave of absence. Around midnight, they strolled out into a spectacular aurora borealis. Buoyed up by this augury, not to mention the beers, they walked through the Vermont hills until dawn, planning.
A year later, when James decided to stay out, IBM reportedly asked a psychologist to persuade him he was making a great mistake. When that failed, they generously told him they wouldn’t hold this mid-life crisis against him. And although his next thirty years are now history, there is clearly still much more to come. Like his plans to write more books, produce another film and foster his Oxford 21st Century School.
And what can I say about the lecture? Marshall McLuhan famously said ‘there are no passengers on spaceship earth: we are all crew’. Tonight, though, we’ve also had a pilot on board. We’ve been given a Target Earth SWOT analysis, including practical ways to survive.
We’ve listened to a man who throughout his career has kept ahead of the wave, accurately predicting the technological future. This month alone has seen the publication of heavyweight UK and US reports on end-of-life-as-we-know-it issues, each underlining James’s prescience. He has laid down a challenge for us to help safeguard mankind’s future, one in which all of us can play our part.
System/360 designer Fred Brooks famously inaugurated the Manchester Turing event with a talk on telecommunication and design; Chris Mairs spoke about computers helping blind people, diverting to tell us about Virgin Rail’s well-meaning but ineffectual lavatory guidance.
Grady Booch struggled out of five feet of Colorado snow to treat us to his vision of the beauty and promise of software. And a year ago, James Martin ambitiously tackled the ten greatest challenges facing mankind. And tonight, another amazing lecture, a fitting tribute to Alan Turing, showing how computing and information engineering are making the world a better place, with much more to come.
Mike never stands still, whether as a demon on the squash court, giving a lecture or expanding his lab. His colleagues tell me Brady’s Law is to double his department every two years. And his professional activities are legion. He is a passionate and active supporter of Oxford’s tutorial scheme.
He has ongoing projects with ten clinical departments and six companies, demonstrates to first-year computing and second-year electronics labs and teaches five modules. For example, he lectured in Oxford this morning and will be on again at noon tomorrow.
Turning to this evening, it is clear an hour was simply not enough. For there was much we didn’t hear about. Like his pioneering ARPANET work, which was a prerequisite to Tim Berners-Lee’s World Wide Web. And, as Mike said, about 99 per cent of his 24 years in Oxford. But there was still plenty to enjoy.
We learnt about his rare triple ability to derive fundamental mathematical (uncertainty and complexity) theory, use it to model practical problems and turn the results into exciting products and services. We followed autonomous vehicles, visited Fawlty Towers, were coached on face recognition mathematics, discovered how to read Roman soldiers’ letters home, were introduced to visual nouns, learnt slightly queasily about colorectal tumour diagnosis and got a taster of great things yet to come.
It seems to me that Mike bears comparison with another luminary, a name I spotted when he was rebooting his system just now. The great physics Nobel Laureate Richard Feynman once said: ‘I’m an explorer, okay? I get curious about everything and I want to investigate all sorts of stuff. Find out more about the world.’ Like Feynman, Mike unstintingly gives credit to others, is a great enthusiast, a natural teacher and a team player par excellence.
And that goes a long way to explaining why so many smart people like to work with him, including 105 DPhils, so many now occupying distinguished and influential positions in industry, commerce and academe. His colleague Ron Daniel recently captured the essence of what Mike does: ‘the real function of a university is to produce people: students with stars in their eyes and a belief in what they can achieve’, adding that he does all that and much more.
After so gripping a lecture, giving us a taste of future computing breakthroughs with practical implications as profound as the transistor and the web, here is a little more about our speaker, things Chris’s bio and the IET YouTube clip didn’t mention.
Like his two outrageously ambitious childhood dreams. The first was to give the Royal Institution Christmas Lectures! The other was to go into space. And while he hasn’t managed that one yet, he’s working on it, with his training so far including weightless sky-diving and aerobatics, and his PhD under the man who conceived the boson particle.
Nor was there mention of his work as a Culham theoretical physicist. Given what you’ve heard this evening, you may be surprised that Chris still considers his insight into using magnetically confined plasma to turn nuclear fusion into power by far his best achievement.
But Chris likes to do things with a practical outcome and saw fusion was far out of reach. So he broke his vow to have absolutely nothing to do with computers and moved on to machine intelligence. Given that only this week British scientists made their latest confident claim that viable nuclear fusion power generation was now only 20 years away, that was surely the right decision.
Does Chris have any weaknesses? None that my moles could identify. Except perhaps such excessive enthusiasm to explain scientific phenomena over the family dining table as to get a regular ‘Oh Dad!’ reaction from his two sons. By the way, they were the ones on the front row in the Christmas Lecture clip who winced as the pendulum weight suspended from the roof of the Royal Institution’s Faraday lecture theatre thundered towards him.
What shall we remember from this evening’s lecture? A brilliant speaker taking us on a wonderful journey, explaining difficult things with clarity and infectious enthusiasm and showing us that the best is yet to come. To paraphrase a well-known saying ‘the future’s bright; the future’s machine intelligence’!
This year one of the world’s most revered and influential scientists has done us proud.
Along with his prodigious text and programming achievements, Don Knuth worked alongside Dijkstra and Backus in pioneering compiler writing. He’s written at least one novel. If those hexadecimal cheques Alan Bundy referred to really exist, they’re worth far more framed than cashed; and some of his inimitable handwritten replies fending off letters inviting him to speak are now part of my family heirlooms.
Sharp-sighted speed readers of that continuous loop projection of Fun and Games will have spotted that in 1957 Mad Magazine issue 33 published his schoolboy weights and measures article - on the whimsical Potrezebie system, with the thickness of issue 26 as its unit of length; and the software world owes a great debt to the mercifully anonymous author of the unspeakably awful IBM 650 manual programming examples that lured Don into his computer science career.
As a Case undergraduate, Don’s basketball performance analysis program helped the University team win the league championship - subsequently earning him a Walter Cronkite CBS interview and a Newsweek write-up; his TeX line-breaking algorithm ideas came to him after using graph theory to help his wife design their kitchen; and he confidently predicted finishing Volume 4 by 2003!
When it comes to computing, the Knuth household is dysfunctional. Don uses Macs, Jill PCs. And notwithstanding received wisdom as to who’s the in-house computer expert, if Jill mentions she has a problem Don mostly looks worried, says he needs to see the source code and suggests she looks elsewhere. Yet when he has printer problems, guess who fixes them.
As many of you will know, Don has a wonderful website. It does however make one highly misleading claim - that in 1990 he retired.
For a reliable mole told me that - in just one week last month: his publishers delivered the first copies of TAOCP Volume 4a and Fun and Games Volume 8; he had to defer taking a scheduled mystery international ‘phone call to fix an emergency dental appointment; that when that call eventually got through he learnt he’d won another prize - worth a half million dollars - most destined for charity; Jill and he did a Skype recording for a ceremony honouring their son as Teacher of the Year; he lectured at Stanford; and he worked on ideas for volumes 4b, 4c and 4d!
As regards tonight, we’ve met a true polymath. A brilliant conversationalist, responding to challenging questions on all manner of things with clarity, infectious enthusiasm and a wicked sense of humour. He’s informed and delighted us, adding lustre to the memory of the man who made computing possible.
Every year, Turing regulars and newcomers alike meet to listen to inspiring speakers absolutely at the peak of their subject. In that respect I would like to mention the excellent book: Alan Turing and his Contemporaries published by BCS, with all proceeds going to the restoration projects of the Computer Conservation Society.
I think that you will all agree that Ray, in discussing the heritage of Turing's work and its increasing impact on cognitive neuroscience, has given us a wonderful fresh insight into the work and intellectual reach of one of the most important pioneers of all time and one of my own personal heroes.
He has, for example, shed further light on how Turing's strongly Bayesian problem-solving approaches have advanced our understanding of the workings of the brain and the human mind. It has also been fascinating to hear something of Ray’s work on human decision making and its aberrations, particularly as they are expressed in psychopathology.
Ray has published over 400 peer-reviewed papers in a career spanning more than 20 years, and for the past ten years has been the world’s most cited practitioner in neuroscience and behaviour. As you heard earlier, he has deservedly received significant worldwide recognition for his contribution to knowledge.
Professor Samson Abramsky has MA degrees from Cambridge and Oxford and a PhD from the University of London. Prior to his current appointment, his first chair was at the Imperial College of Science, Technology and Medicine. His fields of scholarship include semantics, logic of computation, concurrency, domain theory, lambda calculus, semantics of programming languages, abstract interpretation and program analysis. He became a Member of Academia Europaea in 1993.
As computer systems have developed from stand-alone batch processing to distributed systems and on to today's ‘global computing’ on the internet, the classical notion of ‘computation’ has itself undergone a radical transformation.
Instead of a program working in isolation to produce an output from an input, we have complex systems of agents interacting with each other to achieve some global effect. In the course of this development, the notions of agent, interaction and information flow have taken centre stage.
On the one hand, this forces a re-examination of basic ideas, perhaps even in the foundations of logic itself; but in return, these newly emerging ideas may form part of the basis of a genuine science of information.
Brian Randell graduated in Mathematics from Imperial College, London in 1957 and joined the English Electric Company where he led a team which implemented a number of compilers, including the Whetstone KDF9 ALGOL compiler.
From 1964 to 1969 he was with IBM - mainly at the IBM T J Watson Research Centre in the United States - working on operating systems, the design of ultra-high speed computers and system design methodology.
In 1969 he took up his present position as Professor of Computing Science at the University of Newcastle upon Tyne, where in 1971 he set up the project which initiated research into the possibility of software fault tolerance and introduced the “recovery block” concept. Subsequent major developments included the Newcastle Connection and the prototype Distributed Secure System.
He was principal investigator on a succession of research projects in reliability and security funded by the then Science and Engineering Research Council, the Ministry of Defence and the European Strategic Programme of Research in Information Technology (ESPRIT).
Most recently he has been Project Director of DeVa, the ESPRIT Long Term Research project on Design for Validation, and of CaberNet, the ESPRIT Network of Excellence on Distributed Computing Systems Architectures. He has published nearly two hundred technical papers and reports and is author or editor of seven books.
As individuals, organisations and indeed the world at large have become ever more dependent on computer-based systems, so there has been an ever-growing amount of research into means for improving the dependability of these systems.
In particular, there has been much work on trying to gain increased understanding of the many and varied types of faults that need to be prevented or tolerated in order to reduce the probability and severity of system failures.
In this lecture Professor Brian Randell (University of Newcastle) will focus on the following key issues: the assumptions often made by computing system designers regarding faults; a survey of continuing issues related to fault tolerance; and some of the latest challenges facing researchers in this arena.
Nick Donofrio leads the strategy for developing and commercialising advanced technology across IBM's global operations. He is chairman of IBM's Corporate Technology Council and chairman of the board of governors for the IBM Academy of Technology.
He joined IBM in 1967 and spent the early part of his career in integrated circuit and chip development as a designer of logic and memory chips. He held numerous technical management positions and, later, executive positions in several of IBM's product divisions. He has led many of IBM's major development and manufacturing teams - from semiconductor and storage technologies, to microprocessors and personal computers, to IBM's entire family of servers.
Nick Donofrio is a strong advocate of education and vigorously promotes mathematics and science as the keys to economic competitiveness. He is particularly focused on advancing education, employment and career opportunities for under-represented minorities and women.
He is the holder of seven technology patents and is a member of numerous technical and science honor societies. He is a Fellow of the Institute of Electrical and Electronics Engineers, a member of the US National Academy of Engineering, a member of the Board of Directors for the Bank of New York, visiting Professor at Tsinghua University and a member of the Guangdong Economic Development Council, China.
A million times more bandwidth; a billion people conducting global transactions through a trillion Internet-connected devices; a secure, reliable and self-managing computing infrastructure; unprecedented processing power for solving previously-incalculable problems; new and profitable digital relationships. All are coming, and soon. What will they mean for enterprises, institutions and individuals? What will they do for content, communities and commerce?
For nearly a century, IBM has put advanced technology to work on humankind's toughest challenges. Today, the company's growing investments in technology, research and development are forging a new Internet era in which browsing and dot-com hysteria give way to bona fide, 21st-century business.
It's an environment in which an enterprise will tightly link every aspect of its infrastructure and connect itself with other enterprises, markets and industries. It's an environment that requires not just constant technology innovation, but in-depth knowledge of industries and their challenges.
It's an environment in which advanced technology will be integrated seamlessly into the core of every business operation. It's an environment that demands the complexities of technology be invisible, and the productivity and quality of our lives will consequently be vastly improved.
Nick Donofrio, IBM's senior vice president for technology and manufacturing, will outline that environment and describe the industry's capability to innovate, invent, integrate and capitalize on emerging technologies, to leverage new marketplace trends and to arm customers with competitive advantage. He will also encourage discussion on those topics, as well as the technology issues facing institutions and individuals around the world.
After graduating from Leeds in 1979 and a PhD at Bristol, Mark Welland worked at the Garching Max Planck Institute and the IBM T J Watson Research Laboratory. He pioneered atomic-resolution scanning tunnelling microscopy and invented the scanning probe microscope, two key enablers for one of today’s hottest engineering topics, nanotechnology. Welland joined the Cambridge Engineering department in 1986 and is today Professor of Nanotechnology.
His primary interest is to apply his understanding of nano-scale science to solving interesting problems - ranging from nano-particulate inhalation influence on tumour formation to the fabrication of novel optical, molecular, electronic and magnetic nano-structures into practical devices for next generation products.
Mark directs the new IRC for nanotechnology and the Cambridge interdisciplinary Centre for Nanotechnology. He advises several governments on nanotechnology policy and is Principal Investigator for numerous UK and EU grant awards. His editorial activity spans six international journals across a notably catholic spectrum, including ‘Lab on a Chip’.
He nevertheless still finds time to teach and to reshape Cambridge’s electrical and information sciences (EIST) course, and his long-running micro-electronics MPhil (the first of its kind) continues to prepare many students for their own doctorates. He has founded several companies, which manufacture products ranging from nano-actuators to bio-sensor protein-sequencing machines, and his magnetic storage discoveries have attracted substantial venture capital funding.
Computer chips of ever-increasing complexity, speed and data-storage density will ultimately rely on atomic-scale engineering - technology at the scale of 1 nanometre. Nanotechnology is of fundamental importance here in that it provides the tools for the fabrication of devices, structures and materials.
But nanotechnology has a lot more to offer. Firstly, there is the opportunity for disruptive technologies to offer an alternative or radically different approach to producing a computer chip or data storage medium.
The history of the evolution of the computer is littered with such innovations; the progression from thermionic valve to transistor being one example. The second, and ultimately greater opportunity, is to develop new types of devices, sensors and materials based on a combination of technologies straddling the life and physical sciences, and engineering.
Such developments are expected to make nanotechnology a pervasive technology with applications ranging from medical sensors for the diagnosis and treatment of disease to high efficiency solar cells. We are at the start of this revolution but already the signs of a dot-com-style boom are with us.
It is appropriate to take a realistic look at what exactly nanotechnology is, what it tells us about the behaviour of matter on the scale of a few atoms, and how this might realistically develop over the next decade.
Carol Kovac has responsibility for IBM's overall strategy for Life Sciences, including developing partnerships and directing IBM investments in this fast-growing emerging market. She spearheads the development of information technology solutions for the life sciences market, including the biotechnology, genomics, e-health, pharmaceutical, and agri-science industries.
Dr. Kovac's organisation brings together IBM strengths in such areas as e-business, supercomputing, data and storage management, data mining and knowledge management, as well as IBM's world-renowned research expertise in computational biology and parallel computing, to deliver leading-edge solutions for life sciences.
Dr. Kovac joined IBM in 1983 and prior to assuming her current position in 2000 held executive management positions at IBM Research, including vice president of technical strategy and division operations, and head of IBM Research efforts in computational biology.
The announcement of the human genome sequence in 2000 sparked an explosion of scientific discovery in biology and life sciences, which in turn has created the need for powerful new computing solutions. The new knowledge that is coming out of life sciences projects today will change the world as much as, or more than, the Internet, and will transform the pharmaceutical and healthcare industries and profoundly improve the practice of medicine.
Information technology is a key enabler to this revolution, to handle the enormous volumes of data and to create powerful new analytical tools for mining valuable information from these vast databases. Life sciences applications - now and in the future - are driving the roadmap for high performance computing as well as for collaborative, grid-based scientific computing environments.
The convergence of computing and biology promises to transform the process of drug discovery and development and speed the ability to create effective and safe new medicines and introduce them to the market.
Finally, the application of new technologies in biology and computing are already making their way into medicine, in an area we call 'information-based medicine'. In short, the convergence of computing and biology is paving the way for new scientific discovery, value creation in pharmaceuticals and healthcare, and enormous benefits to humankind in medicine.
Professor Fred Piper is widely regarded as an enthralling speaker. He has been Professor of Mathematics at the University of London since 1975 and has worked in security since 1979. He is currently Director of the Information Security Group (ISG) at Royal Holloway. The Royal Holloway ISG offers MScs in Information Security and Secure Electronic Commerce and has a PhD programme that has produced over 100 doctorates.
In 1985 Fred formed a consultancy company, Codes & Ciphers Ltd, and since then he has been consulted by more than 100 companies and governments across the world. The consultancy work has been varied and has included algorithm design and analysis, key management and security audits of large networks.
Fred has lectured worldwide on Information Security, both academically and commercially, with recent emphasis on the use of digital signatures and the role for public key infrastructures.
He has published more than a hundred research papers and is joint author of Cipher Systems (1982), one of the first books to be published on the subject of protection of communications, Secure Speech Communications (1985), Cryptography: A Very Short Introduction (2002), and an ISACA research monograph on Digital Signatures (1999). He has been a member of a number of DTI advisory committees and is a member of the Board of Trustees for Bletchley Park.
In 2002 he was awarded an IMA Gold Medal for “Services to Mathematics”. In 2002 he was also awarded the first honorary CISSP for a European. This was for ‘leadership in Information Security’. In 2003 Fred received an honorary CISM for ‘globally recognised leadership’ and ‘contribution to the Information Security Profession’.
Information technology dramatically affects the way business is conducted, the way we communicate and keep records, and how law enforcement and national security are handled. Our almost total reliance on IT means we are beginning to experience the serious impact of massive disruptions, such as losing the ability to communicate or do business.
We need security technologies to keep society working, as well as to protect against loss of privacy, alteration of critical information and unauthorised access to confidential information.
This lecture looks at some of the technical security mechanisms used for protecting our infrastructure by providing confidentiality for information; entity authentication over distributed computer networks and the detection of alteration to information.
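By way of illustration (this sketch is not from the lecture itself), the third of those mechanisms, detection of alteration, is commonly provided by a message authentication code such as HMAC: sender and recipient share a secret key, and any change to the protected message changes the tag. A minimal Python sketch, with an invented key and message:

```python
import hmac
import hashlib

# Invented values for illustration only.
key = b"shared-secret-key"
message = b"transfer 100 pounds to account 12345"

# The sender computes a tag over the message with the shared key.
tag = hmac.new(key, message, hashlib.sha256).digest()

# The recipient recomputes the tag; a match means the message is unaltered.
unchanged = hmac.compare_digest(
    tag, hmac.new(key, message, hashlib.sha256).digest())

# Any alteration, however small, produces a different tag.
tampered = b"transfer 900 pounds to account 12345"
altered = hmac.compare_digest(
    tag, hmac.new(key, tampered, hashlib.sha256).digest())

print(unchanged)  # True
print(altered)    # False
```

Note that a MAC gives integrity and authentication but not confidentiality; the message itself is still readable and would need encryption as well.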
It discusses some of the social and political problems that can result from their use and from the fact that the same technology can be used by law enforcers (to catch criminals) and law breakers (to avoid being caught), as well as by businesses (to protect their assets) and by individuals (to protect privacy and preserve confidential data).
Further, every advance intended to protect the 'good guys' from the ‘bad guys' can work in reverse. Clearly, there is a need to find a balance in trying to meet the rights and expectations of the various sectors of society.
The resilience of our national infrastructure depends on these security technologies. However, these technologies themselves require a second infrastructure which establishes trust and facilitates their secure implementation.
Any defect in this second infrastructure could have profound consequences, as evidenced when trust is abused in political or accounting processes. This leads to a discussion of the Human Rights convention and relevant recent legislation.
Cryptography, a subject to which Turing was no stranger, is essential to many security solutions and there will be reference to the work that he and his colleagues undertook at Bletchley Park.