As people get excited about the latest idea for "big data" and the "internet of things", computer people often shake their heads, saying "it won't scale." Pessimism isn't always justified: we have been able to scale up quite a number of tasks, from connectivity, through searching, to social media. However, other applications are recalcitrant, from energy management to medical records.

The conventional computer-science view is that the scaling of systems is all about computational complexity: whether the storage or communications required for a task grow more than linearly in the number of users. Over the past thirty years a good theory of this has been developed, but it is as yet nowhere near enough.
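
To see how communications can outgrow users, consider a system in which any user may need a link to any other: the number of potential links is n(n-1)/2, so doubling the number of users roughly quadruples the number of links. A minimal sketch in Python (illustrative only, not from the lecture):

    # Illustrative sketch: potential pairwise links among n users grow quadratically.
    def pairwise_links(n: int) -> int:
        return n * (n - 1) // 2

    for n in (10, 100, 1000):
        print(n, pairwise_links(n))  # prints 45, 4950, 499500 respectively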

Prof Anderson argues that the real limits to scale are usually elsewhere. Even where the data are manageable and the algorithms straightforward, things can fail because of the scaling properties of the social context, the economic model or the regulatory environment. This makes some automation projects much harder than they might seem.

Likewise, when it comes to safety and privacy, many of the attacks that are easy to do in the lab are rare in the wild, as they don't scale either. However, some do scale, and surprise us: no-one in the intelligence community anticipated a leak on the scale of the Snowden revelations.

In short, scaling is now a problem not of computer science but of systems engineering, economics, governance and much else. Information systems fail more often because of perverse incentives than because of poor design. Conceiving problems too narrowly makes failure likely, while good engineering will require ever more awareness of context. The implications for research, education and policy bear some thought.

Watch Professor Ross Anderson's 2016 Lovelace lecture
