Standardisation is a central feature of the IT discipline, and its interaction with innovation is complex, both at the technological level and in terms of market acceptance.

First, some markets are facilitated by standardisation. The GSM standard enabled the building of the mobile phone market and created a vibrant, competitive industry. When standards compete, as HD DVD and Blu-ray did, the market is often sluggish until there is a winner: businesses and consumers are reluctant to invest in a dead-end solution. Very frequently, a de facto standard produces dominance or near-monopoly. A good example is the PDF format.

At the same time, standards can be a barrier to innovation. The ubiquitous QWERTY keyboard is probably the outstanding example. Invented for the age of mechanical typewriters, the rationale for its layout has long since gone, yet alternatives such as the Dvorak layout have made little impact.

The early visual display units used an 80×24 character screen. The 80-character line was a relic of 80-column punched cards. The first memory chips inside CRT/VDU terminals were 2K: enough, not coincidentally, to hold one such screen of 80 × 24 = 1,920 characters. The impact of legacy systems can be such that these restrictions last way beyond reason: my first project as a project manager had 2 Mb chips, yet we were still working to 80×24 screens.

Normally, when we think about the limits of computing in futures thinking, we look at physical, i.e. atomic, limits. What the QWERTY keyboard reminds us is that a dominant standard can be a barrier to innovation for a very long time.

This was brought home to me by a consideration of electronic patient records (EPRs). There are many interesting, and indeed controversial, aspects of EPRs, which I will not go into here.

However, what I think is overlooked is that patient records are for life. For a child born this year, we need to plan for some of that generation to reach the age of 120. That is twice the effective age of the modern computer industry. So, records we create now will have to be accessible in 120 years' time. Look back over the last 60 years and you will see how hard we have made it for that to be the case. Now imagine the size of the medical data sets for hundreds of millions of individuals.
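As a back-of-envelope sketch of that scale (every figure below is an assumption chosen for illustration, not a measurement):

```python
# Back-of-envelope scale of lifetime medical records. All figures are
# illustrative assumptions, not real measurements.
population = 300_000_000        # "hundreds of millions of individuals"
mb_per_patient_year = 50        # assumed: notes, test results, some imaging
lifespan_years = 120            # the planning horizon discussed above

total_bytes = population * mb_per_patient_year * 1_000_000 * lifespan_years
print(f"{total_bytes / 1e18:.1f} exabytes")   # 1.8 exabytes at these figures
```

Even with deliberately modest assumptions, the answer is on the exabyte scale, and all of it must stay readable for longer than any digital format has yet survived.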

This is one example of the challenges of long-term preservation of digital artefacts.

As the scale of data on the net grows ever larger, will data formats become an increasing barrier to innovation? It seems unlikely that we will be able to convert these huge data sets periodically, so the ability to read old data means that levels of compatibility will need to be maintained, at scale, over periods of which our discipline has no experience.
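One modest defensive tactic, sketched below purely for illustration (the field names, version numbers and converter are all hypothetical, not any real standard), is to make records self-describing: carry an explicit format version with every record, so that a reader decades from now can at least identify what it holds and dispatch to the right parser.

```python
import json

def write_record(payload: dict, version: int = 2) -> str:
    # Wrap the payload with an explicit, self-describing format version.
    return json.dumps({"format_version": version, "payload": payload})

def upgrade_v1(record: dict) -> dict:
    # Hypothetical converter: map an assumed old flat layout onto the
    # current structure. In practice there would be one per legacy version.
    return {"notes": record.get("notes", "")}

def read_record(raw: str) -> dict:
    record = json.loads(raw)
    version = record.get("format_version", 1)  # records predating the tag
    if version == 1:
        return upgrade_v1(record)
    if version == 2:
        return record["payload"]
    raise ValueError(f"unknown format version {version}")

print(read_record(write_record({"notes": "seen today"})))  # {'notes': 'seen today'}
```

A version tag does not solve preservation by itself, but it turns "what is this file?" from archaeology into dispatch, which may be the most a format designer can hand to a successor a century away.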

The expertise in long-term preservation lies with museums, galleries and archives.

We have already lost a whole swathe of TV and radio programmes that were never preserved. There has been no systematic preservation of the history of video gaming. And, of course, the preservation of software requires the preservation of the hardware on which it runs.

The Internet Archive does a valiant job of preservation, but it is far from complete. So what will future generations find as our legacy? For an industry that has become so pervasive so fast, IT actually risks leaving little evidence with which future scholars and future generations can explore how we built and established the discipline.

So, one way to preserve our legacy would be for data standards to become a barrier in the way that QWERTY has. Is that likely? Is that what we want? For those attending data-standards working groups in the next few years, what time horizon should they plan for, to ensure that we do not create barriers for ourselves that we might have foreseen with a little thought?

Or, if that is too hard a task, will we create a digital dark age, in which little of our lives is preserved for future generations?

Imagine going to a major museum 200 years into the future and discovering that the years 1980-2020 are seen as a lost era, with little record of how we lived and what we achieved.

Are you happy with that?