When it comes to preparing us for a computer-driven society, IT education has a patchy record. Author and software developer Dr Karl Beecher explains how computational thinking proposes a better way to prepare us for the digital world and take advantage of the opportunities that lie ahead.
The story so far
Computers invaded schools several decades ago. Since then, they’ve become a fixture of education, particularly after the personal computer boom of the late 1970s and 1980s. This invasion reflected what was happening at the same time in the wider world: in science, business and industry, computers were taking over.
‘Computational thinking will be a fundamental skill used by everyone in the world by the middle of the 21st Century.’ - Jeannette Wing
Since the dawn of the computer age, education authorities have tried different ways of teaching students digital literacy. The 1980s saw procedural thinking taught using the Logo programming language to control the legendary Turtle.

The outcomes were disappointing, so in the 1990s authorities dropped coding and taught the use of Microsoft applications instead. This showed children how to become passive users of IT.
With the new century came the return of the active, code-driven approach in the form of the Teach Kids to Code (TKTC) movement. This advocates teaching coding skills using languages like Python - a relatively friendly programming language - and snazzy, colourful environments like Scratch, where learners can intuitively snap together blocks of code.
Despite being what some describe as a well-intentioned leap in the right direction, TKTC has received its fair share of criticism, with commentators claiming its material to be superficial and its focus (on code) to be misplaced.
What’s more, some IT industry leaders are unimpressed by the TKTC approach, claiming it to be good for little more than producing code monkeys who lack a deep understanding of computing and programming.
It’s hard to conclude that any of these approaches has been highly successful. Today, government and industry complain about a basic lack of digital skills among the population, which translates into quantifiable losses.
‘This gap in digital skills is costing the UK economy an estimated £63 billion a year in lost additional GDP, and 93 per cent of companies say that it is already affecting operations and recruitment.’ - Business, Skills and Innovation Committee, 2016
Computational thinking: an alternative
If you read the numerous critiques of IT education, past and present, you might notice some patterns emerging. One recurrent criticism is that no form of compulsory computing education has yet managed to teach digital skills as transferable skills that can be applied beyond the immediate vicinity of programming.
The techniques tried so far largely fail to get across that computing skills are centred not on tools or coding, but on problem-solving. In other words, we’ve been teaching the tool instead of how to use the tool well.
This criticism often accompanies a proposal: that teaching computing should be primarily about teaching the deep skills behind the tool. In other words: the techniques for solving complex problems and expressing solutions in a form that a computer can carry out. This idea has become known as computational thinking.
‘Computational thinking is more than programming, but only in the same way that language literacy is more than writing.’ - Mitch Resnick
You can think of computational thinking’s motivation in the same terms as language literacy or musical competence. Learning to read and write doesn’t automatically make you an effective or persuasive communicator. Learning musical notation alone doesn’t mean you can compose your own music or even learn how music really works. Those latter skills require deeper learning.
To teach computing literacy, computational thinking takes several topics important to computer science and presents them as aspects of a predictable, repeatable problem-solving technique. Those topics include:
- Logic and algorithms: The nuts and bolts of any computer-based solution. They tell us how computer brains work and thus what kinds of instructions we can give to a computer. Learning logic and algorithms teaches us the correct way to communicate with a computer, because we can’t use human-style communication;
- Problem decomposition: This shows students the importance of breaking a problem down into a set of smaller sub-problems. A large problem is hard to solve in one fell swoop, which can overwhelm and discourage. The key is to break it into smaller sub-problems; if some of those are still too big, they are broken down in turn into sub-sub-problems, and so on. Eventually you have a kind of hierarchy, where the simple problems at the lowest level indicate the set of tasks you need to carry out to solve the original large problem;
- Pattern recognition: Spotting similarities between problems and sub-problems, which simplifies the eventual solution and avoids the need to keep re-inventing the wheel;
- Generalisation and abstraction: By picking out the relevant patterns, you learn which details are irrelevant to your solution. Focusing on the relevant patterns allows you to build up a model of the objects in your problem (a.k.a. abstractions) which contains only the necessary detail. Unnecessary detail gets left out, making the problem easier to solve and the solution applicable to other, similar situations;
- Solution evaluation: Once you’ve built a solution, you have to be able to evaluate it and decide whether it’s acceptable. Does it actually solve the original problem? Does it perform acceptably? Is it secure? Is it easy to use? Evaluation skills give you the questions you need to ask and the means to answer them.
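The techniques above can be seen working together in a small sketch. The example below is purely illustrative - it is not drawn from any particular curriculum - and uses Python, the language the TKTC movement itself favours. It decomposes one problem (finding the most common word in a text) into sub-problems, abstracts away irrelevant detail, and finishes with a simple evaluation check:

```python
# Problem: find the most common word in a piece of text.
# Decomposition: split the problem into small, solvable sub-problems.

def normalise(text):
    # Sub-problem 1: strip punctuation and case. This is abstraction at
    # work: for counting purposes, "Word" and "word!" are the same thing,
    # so those details are left out of our model.
    return "".join(c.lower() if c.isalnum() else " " for c in text)

def count_words(text):
    # Sub-problem 2: a simple algorithm to tally each word.
    counts = {}
    for word in normalise(text).split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def most_common(text):
    # Sub-problem 3: combine the pieces to answer the original question.
    counts = count_words(text)
    return max(counts, key=counts.get)

# Solution evaluation: does it actually solve the original problem?
assert most_common("The cat sat on the mat.") == "the"
```

Notice that each sub-problem is small enough to solve (and test) on its own, and the tallying algorithm could be reused for any counting task - the pattern-recognition pay-off.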
Computational thinking is starting to make inroads into education, though only in certain places, and it is still early days.
Computational thinking advocates argue that teaching the subject will have benefits for both the IT industry and society as a whole.
In the case of young people, it will equip them with a problem-solving toolkit ready for a digitised world. The approach is predictable and repeatable, and enables people to:
- Learn how to think of problems in general, abstract terms;
- Be able to turn those ideas into instructions a computer can understand;
- Realise that a key part of handling problems is to break them down into solvable pieces rather than be overwhelmed by complexity.
People armed with these techniques will have a plug-and-play, problem-solving toolbox for the 21st century. Solutions created using such an approach will be ready to be plugged into computer technology and executed automatically.
The potential for the IT industry itself is equally encouraging. As IT continues to infiltrate so many professions, investing in computational thinking today prepares us for an exciting, if uncertain, future full of opportunities, one best summarised by this quote:
‘Pick any field, X, from archaeology to zoology. There either is now a ‘computational X’ or there soon will be.’ - Stephen Wolfram
Biology, climatology, medicine, social science, law, farming... the list goes on. We can only imagine the rich array of cross-collaborations between these fields waiting to emerge, and those collaborations will surely present a range of new opportunities for the IT industry. We should prepare for tomorrow.
Karl Beecher is an author and software developer. Before moving into academia, Karl worked as a software engineer and in 2009 he was awarded a PhD in computer science. He worked at the Free University of Berlin before returning to industry to co-found Endocode, an IT services firm. In 2014 Karl published his first book, Brown Dogs and Barbers. His most recent book, Computational Thinking: A beginner’s guide to problem-solving and programming, is available from the BCS bookshop.