Moore’s Law is part economics, part technology and possibly part psychology. Because people believe in it, investment in R&D and manufacturing has helped perpetuate the cycle for 50 years. The problem, for me, is that some people like to extend Moore’s Law beyond hardware into software, and particularly into AI. Much as I love PowerPoint, I’d find it hard to justify an improvement of several hundred thousand per cent since its introduction, for instance.
The move from the current 14nm scale to 10nm is throwing up challenges, as did previous step changes. IBM has recently announced a 7nm chip in the lab, so the technology path for the next decade is now in play.
Back in the 1990s, one of my colleagues at ICL showed me a simple model he’d built which posed an interesting puzzle: sometime in the 2020s, there would be a crunch point. The cost of fab plants was going up with each generation, their lifetime was stable or shortening, and price performance was improving at Moore’s Law pace. Reductio ad absurdum led to the conclusion that eventually only one fab plant in the world would be profitable over its lifetime, unless the number of chips sold itself grew at Moore’s Law pace. Even with the projections for the IoT, the growth in volumes, though still impressive, is far from that.
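My colleague’s model can be sketched in a few lines. The figures below are purely illustrative assumptions of mine (fab cost doubling per generation, the market growing far more slowly), not his original numbers, but they reproduce the same squeeze:

```python
# Illustrative sketch of the fab-economics squeeze described above.
# All numbers are hypothetical assumptions, not industry data.

def sustainable_fabs(generations=10,
                     fab_cost=1.0,          # cost of one fab, arbitrary units
                     fab_cost_growth=2.0,   # fab cost doubles each generation
                     market_revenue=20.0,   # chip revenue per generation
                     market_growth=1.3):    # market grows 30% per generation
    """For each generation, how many fabs can the market pay for?"""
    results = []
    for g in range(generations):
        cost = fab_cost * fab_cost_growth ** g
        revenue = market_revenue * market_growth ** g
        results.append(revenue / cost)  # fabs the market can amortise
    return results

fabs = sustainable_fabs()
```

Because costs compound faster than revenue, the ratio shrinks every generation: start with room for 20 fabs and, ten generations on, the market cannot pay for even one — unless demand itself grows at Moore’s Law pace.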
So, why has Moore’s Law worked for 50 years? Some argue that IT is unique because we build computers that help us build computers. Historically many successful technologies have followed an S-curve in price performance and the appearance of exponential improvement has lasted for a short while. I remember a lecture in the USA where someone showed that in the 19th century steam turbines in shipping followed that pattern for a few years. So I argue, a period of exponential change is not unique to IT, but the longevity certainly is.
Now, where next? Will a future chip-maker’s CEO announce that the doubling period has moved to, say, five years? If so, the familiar S-curve of other technologies will look more realistic than perpetual exponential growth.
At Cambridge, I was in a workshop in 2000 where someone with far greater knowledge than I suggested a ‘Moore’s Hiccough’ around 1nm and eventually a ‘Moore’s Wall’, where new materials, architectures and processes would be needed.
Some advocates of the technological singularity use Moore’s Law to describe the horror, or joy, of a world where rampant AI will surpass us mere carbon lifeforms, and a single machine will have more capability than all humans put together a few years after that.
The argument runs that we will experience more change in the next 20 years than we have in the last 200, in all walks of life. These utopian and dystopian claims have echoes in history, and life tends to be more complicated than either vision would suggest. Language is a powerful tool: ‘Moore’s Rule of Thumb’ would not have generated the passion that ‘Moore’s Law’ does in some of its devotees.
In the words of the song ‘It ain’t necessarily so’.
I hope you’ve had, or shortly will have, a good summer break.
About the author
Chris Yapp is a technology and policy futurologist. Chris has been in the IT industry since 1980. His roles have spanned Honeywell, ICL, HP, Microsoft and Capgemini. He is a Fellow of the BCS and a Fellow of the RSA.