The future’s... analogue?

‘A trend is a trend until it bends’ is a wonderful quote from Ged Davis of Shell.

One of the great difficulties with understanding possible futures is that the emergence of new concepts and discontinuous change can overwhelm established ways of thinking. Trends and relationships that have worked for years break down, and turbulence can reign, or appear to do so.

We live in a world where more and more things are becoming digital. Consumer appliances, cars, books and TVs have all become ‘digital’ artefacts. When we think about the future of computing we tend to start forecasting as if this trend to digitisation is ‘unstoppable’. It assumes the status of a physical law like gravity.

If history teaches us anything then surely nothing is forever. So, when and how will digitisation break down? What will usurp this trend?

In a lot of my work over the last 20 years I have compared the era in which we live to that of the European Renaissance period of the 15th and 16th centuries, rather than the industrial revolutions of the 18th and 19th centuries.

The invention of the Gutenberg printing press allowed for the dissemination of new ideas and the transmission of information that challenged the power structures of the time. The Renaissance saw not just an explosion in information within society, but the proliferation of new concepts that we still live with. Music notation, perspective in art, double-entry book-keeping, humanism, freedom, republicanism and many other aspects of modern social and economic life emerged in the Renaissance. It laid the basis for the Enlightenment. Above all, it challenged the power structures within society and led to a flourishing of human creativity.

Scotland played a central role in the development of the Enlightenment and there are many Scots thinkers who wish to see the 21st century as a ‘New Enlightenment’. The recent plenary of the International Futures Forum discussed this issue.

Today we are living in a world of explosive information growth and challenges to the world order.

Concepts of the ‘old world’ are breaking down. Let me illustrate with an economic example. It has been a given for many decades that economic growth is a good thing and leads to greater human happiness and so on. The evidence for this not being so has been growing for years; politicians around the world and across the spectrum no longer see GNP/GDP growth as the central measure of their aspirations. The impacts of climate change and global competition from the BRIC countries, to mention just two factors, leave planners and forecasters with a greater sense of uncertainty than at any time since the late 1940s.

It is hard to see across a conceptual chasm. Imagine trying to forecast the future of physics in the late 1890s, just before relativity and quantum theory. Imagine trying to predict the future of art five years before the discovery of perspective. These are problems we all live with.

Edward de Bono invented the concept of ‘Po’ to help people think about areas where new concepts might be needed or might emerge. It is one of the few tools that help in this area.

Let me add one oddity, which is our notion of progress. If you look at the beautifully illustrated 14th-century manuscripts in the British Library, for instance, you notice something odd. It was only in the latter part of the 20th century that we could create a book of the same quality by the printing process. Some will argue we still haven’t.

What might have been the life’s work of a monk could be replaced by a printing press that enabled great productive increase but lost much by way of art and craft.

Whereas the industrial revolutions were about analysis, simplification, the division of labour etc., the Renaissance is embodied in the notion of the Renaissance man.

So, now we find highly creative people with design, technical, musical and video skills working on computer games, augmented reality and other digital media.

What caused me to reflect on this was looking at a whole raft of material about the next generation of technologies under the umbrella of ‘the internet of things’ and sensor nets.

What interested me was that many of the ideas are essentially analogue, not digital. Even with location-based technologies the world seems analogue. For instance, ‘Find me a French restaurant nearby’ is not a digital problem. Sure, it can be modelled on a digital computer. There are ‘digital smell’ technologies, but smell is again, for me, an analogue property. Other examples centre on pattern recognition: ‘looks like’, ‘feels like spring’ and ‘sounds like’.
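To see what ‘modelled on a digital computer’ means in practice, here is a minimal sketch; the restaurant names, distances and scoring function are all invented for illustration. The analogue judgement ‘that’s nearby’ has no sharp boundary, so the digital model has to impose one, here by turning nearness into an explicit numeric score:

```python
import math

# Hypothetical data: (name, cuisine, distance in km). Invented for illustration.
restaurants = [
    ("Chez Marie", "french", 0.4),
    ("La Lune", "french", 2.5),
    ("Taj Palace", "indian", 0.2),
]

def nearness(distance_km, scale_km=1.0):
    """Smooth score in (0, 1]: 1.0 at zero distance, fading as distance grows.
    A digital stand-in for the analogue judgement 'that is nearby'."""
    return math.exp(-distance_km / scale_km)

# 'Find me a French restaurant nearby' becomes a filter plus a ranking.
french_nearby = sorted(
    ((name, nearness(d)) for name, cuisine, d in restaurants if cuisine == "french"),
    key=lambda pair: pair[1],
    reverse=True,
)
print(french_nearby[0][0])  # → Chez Marie
```

The point of the sketch is that every analogue shading (‘fairly near’, ‘a bit far’) has been collapsed into numbers and a cut-off the programmer chose; the computer answers a different, sharper question than the one the human asked.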

I am on the side of those who argue that the human brain is not (merely) an information processing system, in the way that a digital computer is.

Today, many would argue that digital won over analogue, and that battle is over. Perhaps not!

As with the book, where advance in one area was accompanied by loss elsewhere, the end of the digitisation trend will, for me, see a renewal of interest in analogue concepts.

My watch is a hybrid and has both digital and analogue components. Sometimes I want to know that it is 10.29; other times ‘nearly 12’ is fine.

I had a friend, a great technophile, who had perfect pitch. He refused to buy CDs and stayed with vinyl. Even on very high-quality digital music systems he found digital music grating. It took me a long time to realise that this was not an affectation. Even my untrained ear can tell the difference between a digital recording and being in a room with a grand piano being played.

So, my guess is that the future will turn out to be a hybrid of digital and analogue. It will be Renaissance men and women who open up the new concepts that lead us to a 21st-century enlightenment.


Comments (10)

  • 1
    jsc42 wrote on 10th Jun 2011

    If any proof of the limits of Digital were needed, try automating writing comments on this site. The 'Captcha' required to authenticate users exploits the fact that we live in an analogue world where we can use approximations and advanced pattern recognition in ways that are still inaccessible even to very powerful digital machines.

    P.S. [Slightly off topic] I like the tag line on the Captcha box "Stop spam. Read books."


  • 2
    Martin Brown wrote on 10th Jun 2011

    I do not think analogue is ever likely to make a comeback - although I might make an exception for FM broadcast radio, because they made such a monumental mess of the DAB standard that it is not fit for purpose: insufficient bitrate, random delays in the various decoder implementations and a tendency to crash when it rains.

    One of the best features of digital data storage is that it can be copied quickly and easily, and with terabyte-class disks now so cheap this allows immense amounts of archive material to be stored. The big problem with digital data is that unless you index it and also store the programs to access the archive, you end up in the bind that the BBC hit with the Domesday disc after only a few decades. By comparison, ink on vellum has demonstrated and proven longevity when properly stored and requires no special reader software. The storage density is no great shakes, though.

    I can only assume your friend with perfect pitch enjoys the wow and flutter of analogue turntables which, although small, was ever present in the mechanical belt drive. Digital music on CDs might just possibly lack very precise timing edges on transients, but such things when reproduced on vinyl had a bad tendency to rip the diamond stylus out of very expensive cartridges. There was one (not all that good) performance of the 1812 in the late 1970s that actually featured an electron micrograph of the relevant piece of cannon-firing groove on the cover. Play it at your own risk.

    The differences in playback of recorded material are almost always limited by the quality of the loudspeakers used. And in the old days of black vinyl by the cartridge and tone arm and incredibly tetchy low signal levels that would pick up mains hum and suffer acoustic feedback at the slightest provocation.

    A real piano in the room sounds different - yes. That is because it is a large physical object full of resonant strings. It is unreasonable to expect a small pair of speakers to completely replicate the whole musical experience but a decent hifi stereo does a pretty good job.

    Machine vision and pattern recognition is one of those "easy" sounding problems that is in fact incredibly difficult. The easiest automated way to beat Captcha is to screen grab and then present the test as a challenge to another human usually offering some small reward for success.


  • 3
    Daniel Bigg wrote on 12th Jun 2011

    'I am on the side of those who argue that the human brain is not (merely) an information processing system, in the way that a digital computer is'. Totally agree with this one: in the brain, both the information and the encoding (i.e. the way in which the information is represented) are changing constantly. That means two things:

    1. Sorry 'Ray Kurzweil', you won't be able to transfer encoded information between a brain and a computer.
    2. If we want to see real AI we are going to have to look at analogue hardware based on biological neural networks.

    Only time will tell if this is another 'the world only has a need for 5 computers' prediction!


  • 4
    Iain McKenna wrote on 17th Jun 2011

    @Martin - analogue never went away, therefore doesn't need to make a comeback. Some things are better suited to analogue, such as music, time etc where using any sort of sample rate required for digitization can only lead to a loss of quality. For other things there is digital. The only reason people feel that digital is better for music, as an example, is because the highest quality analogue reproduction equipment is expensive and beyond the limits of what most people are willing to spend.

    Fortunately for me, my brain is able to process both analogue and digital inputs (as I believe most people's can) - therefore I can enjoy the most appropriate storage and reproduction. Of course when no reproduction is good enough one just has to seek out the real thing :)


  • 5
    Prof Bill O'Riordan wrote on 17th Jun 2011

    Heartening to see that most people believe the world is analogue and that the tools we best use to analyse, manipulate and exploit this world are digital. I wonder at what point the digital sampling interval will be so small that there is complete convergence. You will remember, Chris (in the good old days), that when one of our research projects built a biological NOR gate with an almost immeasurably short half-life, this was a great debating point that involved many, many pints of liquid refreshment. We never really answered it!


  • 6
    John Castleford wrote on 19th Jun 2011

    The digital world has not reached its full potential, but there is a theoretical and actual limit to binary duality. Who can remember analogue computers and the enticing prospect of fuzzy logic? Analogue never did go away, as Daniel and Iain eloquently state: what price emotionally driven insights, especially as emotion, rather than intellectual logic, is the primary driver of so much human behaviour and thought?


  • 7
    Ollie Lowson wrote on 20th Jun 2011

    Thanks for the article. I remember some quite heated discussions about the relative merits and demerits of the CD over vinyl going on in our physics class.

    The thing that sells it for me is that the inputs to the system that is the human body are purely analogue, as such, any digital thing that wants to interface with the human body must at some point involve a digital to analogue conversion, and this conversion can never be perfect - all that we can do is make the losses more and more imperceptible, effectively converging on zero, but always there. A similar problem exists in converting real-world inputs, such as sound, which is a wave, into a digital format. There will always be approximation, all that you can do is increase bit-rates and sampling rates to get as close as possible to the real wave.

    Every new advance in computers, especially consumer computing, seems to be geared towards making interfacing with computers more analogue, making computers seem more analogue - maybe one day we'll go so far as to make them actually analogue, but for now we'll have to make do with 'multi-touch' screens and 'lossless' digital audio.


  • 8
    Paul Gray wrote on 21st Jun 2011

    I have watched the ongoing debate of digital versus analogue music capture and reproduction with interest over the past few decades.

    It was always the relative durability of CDs over vinyl that clinched it for me. The thought that each time I took my precious record out of its paper sleeve and played it I was actually degrading the recording itself was anathema to me. Cassette tape helped but that was ultimately another step down from the original recording.

    So this made it easier for me to accept CD encoding, with its added “dither” and “error correction” strategies.

    After all, owning a vinyl record collection is rather like being a werewolf in one of those old Lon Chaney Jr movies. You always end up destroying the ones you love most.


  • 9
    Martin Brown wrote on 22nd Jun 2011

    I think I have to make a stand for digital signal processing technology here. Yes it is true that there is always some digitisation error in converting from an analogue signal to a time sampled digital one - but the errors are now negligible in all properly designed studio gear and more than adequate for domestic use in sound reproduction.

    MP3 is a bit borderline but fit for purpose in the context of music on the move. DAB is a failure but that is due to political "never mind the quality - feel the width" motivations rather than any limitations of digital technology.

    The bandwidth and noise floor for digital sampled data storage can easily be made well below anything that analogue methods could possibly manage. It is not for nothing that all major scientific experiments digitise their data as soon as is practically possible once the signal has been acquired. There is no competing analogue recording system that even comes close.

    The limiting factors in sound reproduction are invariably the loudspeakers and their position in the room, although audiophools would have you believe that only copper wire hand-plaited by mermaids in an oxygen-free environment gives that "authentic" sound.

    Nobody who has ever used analogue storage media or even hairier analogue computers would ever want to go back to those dark days of patchboard equation solvers with horrific thermal drift and noise.

    I challenge those of you who think analogue black vinyl discs are truly superior to dig out your old turntable and try using it again! If you work really hard at it you might, on a very good day, match the quality of a CD, at least on the very first time that you play a brand new LP. Wear and tear inevitably takes its toll with a mechanical stylus. And if you are a super-rich Luddite then there is always the laser-read non-contact ELP from Japan, for a price (basic model currently from about $10,000).

    Error correction on digital media is a huge benefit too since if done properly any minor local damage can be fixed. The same damage on an LP causes a loud click, or worse still a skipped groove or loop. I don't miss that at all. YMMV


  • 10
    Allan Dyer wrote on 23rd Jun 2011

    I'd like to contest that the human brain and senses are analogue. Nerve cells "fire" with the breakdown of chemical imbalances across the cell membrane. When a cell fires, the breakdown propagates along the fibre, eventually stimulating the release of chemicals to affect the next cell. The intensity of the signal is represented by the frequency of firing, up to the point where the maximum firing rate of the cell is reached.

    The eye is a nice example: there are four distinct sensors (one for low light sensitivity, rods, and three types of cone for three colours), yet we perceive fine gradations in colour based on the relative rates of firing of each sensor. Our perception of the analogue world is a construct of our digital mind. We cannot even know how other minds are perceiving the world... maybe you are colour-blind and have only two type of cone, or a bird, with four (and an ability to see ultraviolet).

    Smell is based on the interaction between a scent molecule and a particular receptor molecule. Either the interaction occurs, and triggers the nerve, or it doesn't, a digital event. We have many types of receptor molecules, to detect different chemicals, dogs have far more.

    Everything is an approximation; our machines present an approximation to our senses, our senses present an approximation to our mind. When the sampling error of our digital machines is too large, we notice: we see the jaggies and the colour bands in a 640x480x256 bitmap. Analogue has an advantage over low resolution digital because we cannot point out the transitions, even if the overall accuracy is the same. When the digital sampling error is reduced enough, you won't be able to tell the difference, but your parrot might.



About the author
Chris is a technology and policy futurologist. Chris has been in the IT industry since 1980. His roles have spanned Honeywell, ICL, HP, Microsoft and Capgemini. He is a Fellow of the BCS and a Fellow of the RSA.

See all posts by Chris Yapp
