The Art of Computer Programming

June 2011

Don Knuth

While he was over in the UK for a book tour and lecture series, Professor Donald Knuth made time to talk to BCS editor Justin Richards about his life and works. This interview also appears in the ebook Leaders in Computing.

Donald is author of the hugely respected The Art of Computer Programming book series and dozens of highly regarded academic papers on computer science.

You’re probably best known for your book series The Art of Computer Programming. In 1999, these books were named among the best twelve physical-science monographs of the century by American Scientist. How did these books originally come about and how do you feel about the American Scientist distinction?

The books came about because in the '60s, when I began, everyone was rediscovering things, because there was no one good source of what was known. I had enjoyed writing all the time - I was involved in newspapers and magazines at school and thought of myself as a writer - and I realised there was a need for someone to get down all the good ideas that had been published and that we were already forgetting.

This was back in the earliest days when the number of people actually studying computing was probably less than a thousand. I didn’t see it as affecting the world but I still thought it was pretty cool stuff and ought to be organised.

Then I thought about who else could write such a book, and everyone I thought of would probably only mention their own work. I was the only guy I knew who hadn't invented anything himself, so I figured I could be neutral and be a spokesman for the other people. That was really the original motivation; I saw that there was this need.

I started writing the book and naturally because I was trying to combine the ideas of many different people I would see where one person had analysed his method in one way while another, for a competing method, had analysed it another way. So I had to analyse method A according to author B and method B according to author A.

Therefore I ended up creating an original work just by analysing these things and pretty soon I realised there were a whole bunch of interesting scientific approaches here that hadn’t been part of my own education that were really coming together. Over and over again I was really seeing this way of thinking as necessary in order to get the story right.

So, to make a long story short, pretty soon I had my own axe to grind too; I started discovering things too and I couldn’t be an unbiased writer anymore. However, I still kept to the original idea of trying to summarise everybody’s ideas in the fairest, most reasonable way I could.

Now, to be put into that category of one of the best books of the century, that’s a little bit embarrassing as they rank me with Einstein and Feynman - I’m not in that league really, I just didn’t have as much competition - they had to have a token person in computer science! But still, I worked hard and I think it was necessary to comment on the research so far, but it’s a bit like comparing apples to oranges when they chose me to represent computing.

What is it about computer science that drew you to it?

I was born to be a computer scientist - I have a way of organising stuff in my head that just seems to make me a good programmer. I think everybody can learn to use computers, but only about one person in every 50 is a geek in the same way as I am. That means we can push the envelope and can resonate with the computer. The way we think helps to make it easier for us to know how to construct a machine.

Why do you think computer science is so important?

Computer scientists are important because of the way they affect communication and, I’m sorry to say it, but also finances. Unfortunately, the world measures what my colleagues and I do by how much our work affects Wall Street. I’m jealous of astronomers, for example, because people respect astronomers for doing astronomy because it’s interesting just for its own sake. I studied computer science because it’s interesting to study computer science.

The term IT doesn’t resonate with me so much - it’s the science that interests me. To me the IT is very nice, but it’s not something that I’m particularly good at. My wife can figure out what these icons mean and what to click before I can, but there are so many scientific challenges in order to get machines to do complicated, sophisticated things. The ideas are subtle, the questions are fascinating. There are many questions I never thought I’d know the answer to, but gradually we’ve learned how to solve them. For me I would do it even if there was no money in it.

So you have a passion for it?

Yeah, it’s like I wake up in the morning thinking I’ve got to write a program.

Do you have a muse?

Yeah, well some days she talks to me more than others. There was a period when I almost thought there was a muse dictating to me.

In your opinion what do you think is your greatest achievement in the field of computer sciences?

I guess the first thing I did well at was the theory that goes on behind how compilers work. I worked on the theory that underlies algebraic languages and, as I was writing The Art of Computer Programming (Chapter 10), I was describing what everyone else had done. Then I realised that there was a way to bring these things together, but I didn't know how to explain that in a book - it was too far out - so I published the theory in a paper. Other people figured out what I meant, and this became the theory of parsing that's used in all algebraic compilers now.

But I feel the biggest thing I developed was the mathematical approach to comparing algorithms in order to find out how good a method is. I worked out quantitative ways of saying that one program is going to be, say, 2.3 times better than another, and the mathematics that goes with it; it's called the analysis of algorithms. It's what I'm most proud of - developing an academic subject - but it's also key to the successful use of the machine.

When I came up with this approach I said to my publishers let’s rename the book and call it The Analysis of Algorithms and they said ‘we can’t, it will never sell’! But that’s really what my book is about - it summarises the work of all these people, but it also helps us decide, in a quantitative manner, how good each method is.
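As a small illustration of the kind of quantitative comparison the analysis of algorithms makes possible - a sketch added for illustration, not Knuth's own analysis - we can count the probes two search methods make on the same sorted data and express one method's advantage as a number:

```python
def linear_search(xs, target):
    """Scan left to right; returns (index, number of probes)."""
    for i, x in enumerate(xs):
        if x == target:
            return i, i + 1
    return -1, len(xs)

def binary_search(xs, target):
    """Repeatedly halve a sorted range; returns (index, number of probes)."""
    lo, hi, probes = 0, len(xs) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if xs[mid] == target:
            return mid, probes
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes

xs = list(range(1000))            # sorted data
_, lin = linear_search(xs, 999)   # worst case for a left-to-right scan
_, bi = binary_search(xs, 999)
print(lin, bi, lin / bi)          # 1000 10 100.0
```

On this input the halving method is exactly 100 times better by probe count - the sort of precise, quantitative statement about competing methods that the analysis of algorithms is designed to produce.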

You’ve said on your website, in response to the question - why don’t you do email? - ‘Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things.’ Can you explain your stand on email and what you meant about being on the bottom of things?

Someone has to not be tweeting all the time; someone has to be thinking about things which need a long attention span, trying to organise material and build up strong foundations instead of rushing off across the frontier. It takes a long time to put out something that has the right style; I have to really think about it, and if I'm going to do it right I have to spend a lot of time focussed on it. And I was being treated like an oracle - lots of people from around the world were asking my opinion about whatever - so after 15 years of email I decided that was a lifetime's worth.

A previous Turing Lecture speaker, Grady Booch, was very much an advocate of making coding simpler and, according to blurb regarding your winning the BBVA Foundation Frontiers of Knowledge Award in the Information and Communication Technologies category, you are too. Can you explain why you think code should be kept simple, compact and intuitively understandable?

I guess you have to go back to Einstein who said ‘keep it as simple as possible, but no simpler’. It’s an illusion to think there’s going to be some sort of ‘royal road’ and everything is going to be easy to understand, but almost never do I find something that can’t be simplified if you have the chance to rethink it again. So every once in a while people have to say ‘well, let’s chuck everything we have and start over again, in view of what we know now’.

There’s a project at Stanford that started a few years ago, called the Clean Slate Project, which said ‘let’s figure out a better way to do the internet’. Things just keep getting added on and accumulate, and you realise that there’s plenty of baggage which there’s no reason to have anymore.

It’s like the appendix in the human body, there was probably some purpose for it at one time, but not now. So I think there’s the potential, although I think maybe it’s not possible because the world is so dependent on it, for someone to come along and say let’s start again with the way programs are loaded into machines. It’s like when Linux came out, that was an attempt at the simplification of operating systems.

Another ideology that you share with Grady Booch is that you have both said that you can appreciate the beauty within coding and programming - what do you mean by that?

I’m thinking of it in several senses of the word ‘art’, but in general the word art means something that is done by human beings, and is not a part of nature. But then there is fine art, which brings aesthetics into it as well.

In many ways beauty is in the eye of the beholder, but you do something because it’s elegant and hangs together and is a pleasure to read as well as to write or to see someone else’s work; you feel that you’ve got it and you can take pride in it having achieved certain criteria.

Maybe Grady and I don’t agree on the criteria, I mean no two people agree on what’s the best kind of music in the world, but musicians certainly know what they like and what they don’t like and they know when they’ve done something well and that’s the way I look at a program.

I guess it’s down to personal opinion at the end of the day?

Yes, indeed. There’s no algorithm that you feed in and say isn’t this beautiful or what? Although people did try - there was a book that came out in the 1930s by one of America’s greatest mathematicians, George [David] Birkhoff, called Aesthetic Measure and it was filled with all kinds of formulas and there was a page filled with all kinds of Greek urns and next to each one would be a number which would say how beautiful the urn was.

He classified a whole bunch of different design systems; it’s kind of interesting, as number two or three in his list of 100 was the swastika - he was a kind of Nazi sympathiser. I guess it has a greater religious significance in Hinduism if it’s reflected left-to-right. I don’t believe there’s a way to measure beauty, but he did, and some people have tried.

So no one has written a program to work out the beauty of a program yet?

No, not really, although there’s software engineering that tries to do this, because they have to measure something - I don’t really know. You know that, as a writer/reporter with a lot of stories, you have to find quantitative numbers to accompany the text - x number of people have died in Cairo; you have to know whether it’s 300 or 315, that’s part of the news story. Quantifying things adds quality. I try to find reasons for numbers too, but software engineers are trying to measure how good a programmer is; their bosses know better!

I think numbers are there so people can do a mental comparison and think 20 people have died in that event and 50 in that event so by contrast the latter event must have been worse.

But it’s like comparing apples with oranges because when you do something to a number then you can start to play games and make the number high even though the thing isn’t right.

You can take education and an educated student and think, well, how are they going to do on this test, and out come the books on how to pass the test rather than how to learn science.

It’s all about how to get a good score on a science test. And that’s the problem with these numerical things; they don’t always capture the essence of it. Once you have a way to quantify something then, if your goal is to cheat you’ll figure out a way to cheat, when the goal really is to learn.

You’ve said in the past your work is ‘basically about finding a way to sort out the things that are going to last a long time (in computer science) instead of changing rapidly’ - what do you mean by that?

Every day I get about one journal in the mail, not including ITNOW (laughs), but including The Computer Journal. About eight of them arrive in my mail box every week. So there’s an enormous amount of material out there and it’s good stuff. So how am I going to decide what to put into The Art of Computer Programming?

I try to avoid the stuff that’s quickly going to become obsolete and concentrate on the stuff that’s going to have lots of applications. I try to find the facts that aren’t too hard to learn, that are going to be useful for everybody’s toolkit. What should all programmers of the next generation remember? I don’t pretend that I’m right about everything, but I try to sort out the ones that stand out to me, that are unforgettable and that our children should remember.

So I guess the building blocks of computer science and not so much all the more transient add-ons which tend to follow?

Yes, but there are still thousands of add-ons that are describable in a couple of paragraphs, and learnable. If something takes 10 pages to describe then it’s very hard to get it into my book. But if something only takes three pages, is intrinsically useful and I can see how it physically fits in with other stuff then it’s more likely to go in.

For example, we all learned how to add numbers together when we were young. If you think of all the uses to which that skill has been put - it’s incredible. We all use addition every day, in different ways and continue to do so every single day. But still you learned about adding - you learned the concept of adding. There are loads of little concepts like that which go in to my book and that’s what I’m looking for. They haven’t all been discovered yet.

Even with adding and computing there’s now ‘adding without carrying’, or nim-addition, which is something that was invented in England 100 years ago. It began as a game, which computers can do well, and we can combine this addition with ordinary addition. So one of the things in my new book is to explain to people why we might even want to be teaching fifth graders a new kind of addition, because it’s turning out to be quite useful. But it’s not so simple that you can say ‘everything I need to know I learned in kindergarten’; we keep learning little things that help us take giant steps as we go.
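The nim-addition Knuth mentions is binary addition with the carries thrown away, which works out to be the bitwise exclusive-or. A minimal sketch (the function name is mine):

```python
def nim_add(a, b):
    # Nim-addition adds each binary digit independently, mod 2,
    # discarding all carries - which is exactly bitwise XOR.
    return a ^ b

# Ordinary addition carries: 3 + 5 = 8.
# Nim-addition does not:     011 XOR 101 = 110, so 3 and 5 give 6.
print(nim_add(3, 5))  # 6

# Every number is its own nim-negative, since x XOR x = 0.
print(nim_add(7, 7))  # 0
```

Like ordinary addition it is associative and commutative, which is part of why it turns out to be useful in the game-playing contexts Knuth alludes to.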

In 1999, you were invited to give six public lectures at MIT on the general subject of relations between faith and science. Over a decade on, have your views on the relationship between science and spirituality changed and if so how?

I’m just glad to see that people think there’s more to life than things we can understand and it just seems that at the time I gave those lectures it was just coming out of the closet saying ‘well, computer science is wonderful, but it’s not everything and don’t expect me to give you any answers - Let me explain why I think it’s good to still have some things that are mysterious.’

I think there is a tendency, as we discover more and more science, to think that now we know everything. But as we think about it more and more, we’re just getting started, I think. Things are changing incredibly fast, but I can still see that in 100 years’ time there will be much more to learn.

So there’s still plenty of room for humility, but we have still learned an awful lot of stuff we can be proud of.

I had this invitation to MIT and I thought, well, once in my life if I ever wanted to reflect on this, this was going to be the time and the place to do it. I don’t pretend to be an expert on it; I just didn’t think people were spending enough time thinking about it. I was glad to see how many people responded to it.

Were the lectures well attended?

Well, that’s the thing - it was standing room only! It was a big lecture hall too. There were six lectures. After the first one it was carried on the Dr. Dobb’s webcast, and it was downloaded an incredible number of times over the next five or six years.

So it was definitely meeting some kind of a need. I wasn’t necessarily providing the answers, but I was providing some of the questions that I thought were part of our life - why not discuss these things in public? I was very pleasantly surprised at the numbers who came.

A few years back I gave a talk at Google and again it was standing room only, and again it was about this very subject. And it was a question and answer talk like I’m giving for the Turing Lecture. That’s the thing I enjoy, somehow responding to what people ask me more than having a canned thing.

I was going to ask you how you cope with a challenge like that - for a lot of people not knowing what they were going to be asked would be extremely daunting...

(Laughs) - It’s not so hard - if I make a mistake, so what. It’s not stressful compared with, say, if you think about Prime Minister’s questions - I think Barack Obama could do something like that, but I don’t think George Bush could have!

I suppose George Bush must have had his strengths as well as weaknesses..?

Yeah, I suppose so, although I don’t really want to get caught up in a political discussion - I don’t have an algorithm figured out yet for politics!

In 1975 you published a booklet called Mariages Stables, which is a gentle introduction to the analysis of algorithms, using the theory of stable marriages as a vehicle to explain the basic paradigms of that subject. Why did you use that analogy?

It’s a gimmick, but once again it was a series of six lectures that I gave in Montreal. And the theme of those lectures was not faith and science, but looking at the analysis of algorithms and I based it around a mathematical problem I called stable marriages, but you could also think of it as a game between boys and girls where they each have their own preferences and each boy ranks the girls and each girl ranks the boys. And we ask ourselves ‘how can we pair them up according to a number of criteria so that their relationship would be a stable one?’ It’s unstable when there are people who prefer each other rather than their own current partners. There’s always a way to match the boys and girls up in a stable way.

There is good mathematics which can explain why that is true, but also different ways to do it and I can say what method is better than the others and I can introduce the analysis of algorithms into that. And, I guess, the thing I’d most want to be remembered for (going back to that earlier point) is this methodology of understanding the way these algorithms work. So here was a cute question about matching girls and boys up which we could solve through concrete mathematical problem solving.
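The fact that a stable matching always exists can be demonstrated constructively with the Gale-Shapley proposal algorithm, which is the kind of method whose behaviour the analysis of algorithms quantifies. A minimal sketch (the data and function name are mine, for illustration):

```python
def stable_match(boys_pref, girls_pref):
    """Gale-Shapley: free boys propose in preference order; a girl
    accepts a proposal, but trades up if a better boy proposes later.
    Returns a dict girl -> boy containing no unstable pair."""
    free = list(boys_pref)                     # boys not yet matched
    next_choice = {b: 0 for b in boys_pref}    # next girl each boy will try
    # Each girl's ranking of each boy; lower number means preferred.
    rank = {g: {b: i for i, b in enumerate(prefs)}
            for g, prefs in girls_pref.items()}
    engaged = {}                               # girl -> boy
    while free:
        boy = free.pop()
        girl = boys_pref[boy][next_choice[boy]]
        next_choice[boy] += 1
        current = engaged.get(girl)
        if current is None:
            engaged[girl] = boy                # girl was free: accept
        elif rank[girl][boy] < rank[girl][current]:
            engaged[girl] = boy                # girl prefers the newcomer
            free.append(current)               # her old partner is free again
        else:
            free.append(boy)                   # rejected: boy tries again
    return engaged

boys = {'adam': ['alice', 'beth'], 'bob': ['alice', 'beth']}
girls = {'alice': ['bob', 'adam'], 'beth': ['bob', 'adam']}
print(stable_match(boys, girls))  # {'alice': 'bob', 'beth': 'adam'}
```

Counting how many proposals the average boy must make before the loop terminates is exactly the sort of question Knuth's lectures answered.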

So basically it just helps people visualise your ideology in their heads?

Yeah. So I could see the average boy is going to have to propose a certain number of times so it’s not going to take long before we have a statistical probability for an outcome. You can ask a number of questions so the girls get the best deal or the guys do. Good mathematics comes up in the process of trying to deal with this one problem.

So it was something I could talk about over the course of six lectures and interact with the audience there. The book (of the lectures) was in French, even though I gave the lectures in English and I don’t know French; they decided to make the transcripts of the lectures in French. So this book came out in French and was translated into English 20 years later. And so I now know what it’s like to write a book in a language that I can’t even speak!

It seems that you do like a challenge, having done a series of six lectures, not once, but twice...

It’s true, once in every twenty years I’m up for a challenge! It’s something different from sitting at home and having to write The Art of Computer Programming. And it’s almost 50 years now that I’ve been gathering material for that.

You’ve been asked to do the Turing Lecture this year - is that something you’re pleased about?

I’m at a turning point in my life right now where I’m celebrating the conclusion of two big projects, each of which has taken many years to do. So it’s the perfect time for me to be giving this lecture, and I told them two years ago that two years from now would be a good time for me - my defences would be down and I’d be retooling, ready for the next big project.

I’ve finished Volume 4A of The Art of Computer Programming - I just got my first copy less than two weeks ago - which is something that I’m proud of, and I’ve also finished the 8th volume of my collected papers. All of the papers I’d written over the years were packed into books by topic. For example, one was on typography; I did research beforehand on software for making books beautiful. Another one collects the papers I did on the analysis of algorithms. Still another has the papers I wrote for general, non-specialist audiences about computer science.

Plus the last volume of these eight is about fun and games; I was saving it for dessert because these were the papers that I had the most fun writing and that I did purely for the love of it. I love this book on fun and games - I’m not entirely sure why.

On the very same day I finished that book and sent it off to a publisher, The Art of Computer Programming book was also sent off to another publisher and then I received my free copies during the same week - it makes me very happy to have them, in both cases. To have completed them, to have gotten through them without getting sick, a World War breaking out, or anything like that, is very gratifying. It’s also nice to be able to draw a line under those projects too.

BCS is pursuing a professionalism in IT programme - what are your thoughts on this - do you think the industry should just police itself or does it need greater guidance?

I’m a writer of programs and I’m not really qualified to say. England has its long tradition of guilds and masons, and in the cases that I know of it seems to have led to good quality standards, but there were excesses in certain industries where people would just decide that you had to be employed whether or not you were providing a valuable service - which is bad. Now that communications are so much better, it’s much harder to cover up the bad practices that were associated with those so-called ‘feather-bedding’ arrangements.

Do you think the IT profession has a bad reputation being full of geeks and nerds?

That’s interesting because I was actually proud to be given the name ‘geek of the week’ by a British writer* who has a blog called ‘Geek of the week’ or something like that. I can’t remember his name though. This was about two years ago. They talk about ‘geek chic’ - the word is becoming more acceptable and people aren’t afraid to admit that they’re a geek. Nerd is a little different.

My sense of it is that now that people are identifying themselves as geeks that word has now brought us to an era where I can say it and people can understand the kind of peculiarities I have. It certainly wouldn’t have been that way several years ago.

People who work with words often can’t explain why some words rise and others fall, but this one definitely seems to be in ascendancy right now. I might be wrong about it, but one of the chapters in my book on fun and games is called ‘geek art’ and it just seemed to me to be the right title for it because it talks about the kinds of art I like to have around in my house.

What, for you, have been the most important developments within the computer science arena over the last five or six years?

To me it’s the fact that thousands and thousands of people are working together. At the end of each year you can ask yourself what the greatest breakthrough discovery of the year was and you can’t come up with anything really, but after five years the whole field has changed. And the reason is, it’s all incremental.

There are occasionally things that are recognised later as being major changes, such as the creation of the World Wide Web, but the year it happened no one would have recognised it as being so important. Actually it’s more like building the Great Wall of China, with each workman contributing bricks - a team effort in which so many people are involved. It’s all about pushing the envelope and learning from one another; that’s the way I perceive it.

Looking forwards, what do you perceive are the biggest issues / challenges computer science faces, now and in the near future?

The challenge of how we can go to sleep at night when there are so many things left to be done! As an American I’m a big admirer of (and also slightly jealous of) the British health care system, from what I know of it (my grandson was born here). But I think so much more could be done with better health care records and better ways of describing and visualising combinations of symptoms, combining statistical methods as well as visual methods; this would help doctors to understand things more clearly and quickly.

Not to mention the ways in which computers can help biologists to design better drugs. Everywhere you look there’s something to be done which needs a good programmer to help achieve it. There’s no shortage of challenges and little chance of ever running out of them.

I started off by saying that one person in 50 is a geek like me, but I think there might be a geek shortage in years to come. I might be wrong - the next generation could be 10 per cent geeks, but I doubt it. In order to do these things that computers need to do we’re going to need the people to program and run them who have these weird talents.

Do you think the old artificial intelligence chestnut will ever be solved?

I don’t think we’re anywhere near this singularity, but there will gradually be a coming together of men and machines.

I must say that my colleague John Hennessy, the President of Stanford [University], has said he thinks there’s going to be a computer meltdown in five years, like the financial meltdown, because we’ve become so reliant on computers. There’s definitely going to be a time when people have forgotten how to do things without machines, and that’s going to lead to some crashes.

You once said ‘Everything about computers today surprises me; there wasn’t a single thing that I could have predicted 30 years ago’. If you were to set yourself up as a sci-fi author what predictions do you think you’d make about life in 50 years time?

I’m glad you found that quote. Anyways, it’s true!

Pessimistically, I don’t see how we’re going to solve the energy situation unless there’s something like breeder reactors that dispose of waste properly. There’s something called the Jevons paradox dating back to England in the 19th century. [William] Stanley Jevons, I think his name was.

Someone worked out how to run the railroads ten times more efficiently than they had done in the past, and as a result they used 100 times more coal, because everyone then started to use the railroads to transport things. In other words, once you made something more efficient, people used it much more.

You don’t just say we need X amount of oil to do what we want, what happens is that if we had more oil we’d be doing more with it now. And what happens is our appetite is never satisfied. Hence I don’t see how our energy requirements will ever be met.

The optimistic scenario is that everyone enjoys doing analysis of algorithms and enjoys doing beautiful computer programs - wouldn’t that be a great future!

* The ‘Geek of the Week’ interview was actually done by Richard Morris on 26 November 2009.

Comments (5)

  • 1
    Oleg Gerassimenko wrote on 2nd Aug 2011

    Thanks for reminding me it's time to re-read 'The Art...'. Great book.

  • 2
    TX CHL Instructor wrote on 2nd Aug 2011

    I had the great pleasure of meeting Dr. Knuth back in the mid-90's when he spoke at a bookstore in San Jose at the same time I happened to be teaching a C++ course for Technology Exchange Company (now defunct). As a co-moderator of comp.lang.c++.moderated since its inception, I happened to see an announcement of the event before it went out to the public, and was able to modify my flight plans to stay over that Saturday to hear the talk. I even managed to take my old copy of Fundamental Algorithms and get it autographed by Dr. Knuth. I was glad that I got there early, because they had to close the doors because too many people showed up.

    He is a very entertaining speaker.

  • 3
    Dan Sutton wrote on 2nd Aug 2011

    Dr. Knuth's books are beautiful, and should be required reading for all programmers. So much of what he says, even in this short interview, resonates so deeply with me, as a programmer.

    Dr. Knuth is wrong to be taken aback at being compared with Feynman and Einstein: in reality, history will confirm that both he and Dijkstra occupy the same, rarified level of mental achievement.

  • 4
    Richard Overill wrote on 10th Aug 2011

    I think Don Knuth comes across much better here in this interview than at the live Turing event. It was also a privilege for me to present him with a copy of a paper of mine in which yet another algorithm bears his name - this time the Knuth-Eve algorithm.

  • 5
    Sea Turtles wrote on 15th Nov 2012

    Im 13 and know how to program a video game! W00T