We all know there's a crisis in university computer science departments. Student numbers are dwindling - down 115 just last year.
At the same time, the unit of funding for computing has fallen. And the onset of tuition fees has made students think twice about joining a profession where the plethora of new jobs in the 1990s has slowed to a trickle, and it is only just beginning to look as if employment prospects may be on the upturn.
Falling numbers of A-level students, a view that IT is a job for geeks and social misfits, and a perception that there's nothing interesting in computer science don't help. Even the value of the research base is being questioned.
And the problem's global. In the US, the number of students choosing computer science dropped by 39 per cent between 2000 and 2005. In Australia, cuts in IT academic staff are the order of the day.
In such dire circumstances, it's tempting to hanker after the glory days when computer science ruled, departments were full, and students flocked to a leading edge discipline where the ideas were fresh.
It's easy to be nostalgic about the days when the income from computer science subsidised other departments and computing was the Prince of the university not the Cinderella.
We long for the days when assembler programming ruled, when programming was exciting and leading edge, when distributed computers were being created and there were uncharted vistas of applications to be written, and single applications such as ledgers and transaction systems transformed businesses. But that is the past. Today the ship is holed below the waterline.
As the ship sinks, we computer scientists fiddle on the deck, hoping to avoid the icy waters. We claim, as the President of the BCS recently did, that there is still a massive need for computing students in the UK today.
We look to games programming for our salvation, designing games programming courses and reducing a wide-ranging industrial discipline to a set of geeks programming computers to zap spacecraft and dismember aliens.
It's a sorry sight to see computing academics fighting for the last few lifeboats. But the heyday of massive liners, full of programmers, plying the commercial sea-lanes is over. There may be room for a few luxury liners, but most of us fly on budget airlines.
It's easy to think that the problem is that people (read potential students) just don't understand how exciting computing is and that this can be fixed by a bit of sharp marketing, slick videos and some school visits. But the students are not that gullible. The real nature of the problem lies at the roots of the discipline.
Something significant has changed. There is the smell of death in the air.
In the early days, computer science was populated by mathematicians and physicists excited at the prospect of vastly accelerated computation. New languages were developed: FORTRAN, Algol, COBOL and PL/I took root. The foundations of programming were laid.
There was excitement at making the computer do anything at all. Manipulating the code of information technology was the realm of experts: the complexities of hardware, the construction of compilers and the logic of programming were the basis of university degrees.
The power of hardware has increased, as IBM 370s in air-conditioned warehouses gave way to computers in the home and advanced robots became this year's Christmas toy.
However, the basics of programming have not changed. The elements of computing are the same as fifty years ago, however we dress them up as object-oriented computing or service-oriented architecture. What has changed is the need to know low-level programming, or any programming at all. Who needs C when there's Ruby on Rails?
Now vastly complex applications for businesses, for science and for leisure can be developed using sophisticated high-level tools and components.
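The shift in abstraction level can be made concrete with a toy sketch (in Python, purely for illustration; neither the functions nor the example come from the original article): the hand-rolled version spells out every step that the one-line library call hides from today's application builder.

```python
# Illustrative only: contrasting "low-level" hand-written code with the
# high-level library call that makes such code unnecessary for most users.

def hand_rolled_sort(items):
    """Bubble sort: the programmer manages every index, comparison and swap."""
    items = list(items)
    for i in range(len(items)):
        for j in range(len(items) - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

def high_level_sort(items):
    """The same result, with the mechanics hidden inside the library."""
    return sorted(items)

data = [5, 3, 8, 1]
print(hand_rolled_sort(data))  # [1, 3, 5, 8]
print(high_level_sort(data))   # [1, 3, 5, 8]
```

Both functions produce the same answer; the difference is who needs to understand the mechanics, which is precisely the expertise whose market value the article argues has collapsed.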
Virtual robots - Zooks - can be created by eight-year-olds without needing programming, logic or discrete mathematics skills. Web designers build complex business sites, graphic designers build animations, and accountants assemble business systems without needing to go object-oriented.
Computer science has lost its mystique. There is no longer a need for a vast army of computer scientists. The applications, games and databases that students once built laboriously in final year projects are bought at bookshops and newsagents.
As if the gap between public knowledge and the academic curriculum weren't large enough, the gap between academia and industry practice is a gaping hole. While academic departments concentrate on developing new computer systems in an ideal organisational environment, much of industry has moved away from in-house development to a focus on delivering a service.
As commercial software products have matured, it no longer makes sense for organisations to develop software from scratch.
Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available. IT departments now focus on contracts, tenders, service level agreements, training, system usage and incident management.
Interrupts, loops, algorithms, formal methods are not on the agenda. IT is about deploying resources to meet the information needs of its customers.
Implementation, facility management, systems integration, service management, organisational change, even environmental audit: these are the language of IT. They hardly feature on computer science courses.
The environment within which computing operates in the 21st century is dramatically different to that of the 60s, 70s, 80s and even early 90s. Computers are an accepted part of the furniture of life, ubiquitous and commoditised.
As with cars, a limited number of people are interested in their construction; more make a living supporting and maintaining them; most of us accept them as black boxes whose workings are of no interest, but which confer status, freedom and convenience.
Indeed, whereas building a new car needs mechanical know-how, building a new computer application can be done by the user who has no grounding in computer science.
Computing is also affected by globalisation. The loss of jobs in IT and declining computer science enrolments are global problems for developed countries. Since a software product can be transmitted almost instantaneously, why develop it in expensive facilities in the West?
Armies of highly trained computer scientists are available in India, Sri Lanka and China. The expertise is easily off-shored. In India, over 100,000 new IT graduates a year are ready to support an off-shored IT industry.
Companies like Microsoft, Hewlett-Packard and Siemens have well established software development operations in India. Why are we not co-operating more with the Indian IT industry?
So where does that leave computing departments in universities? Do we pull up the drawbridge of the castle of computational purity and adopt a siege mentality: a band of brothers fighting to the last man? Or do we recognise that the discipline is dying, if not actually dead, its breathing shallow?
The old man has run his race well. He has changed the nature of human existence. Do we let go gently, realising that the discipline has run its course? The real problem is one of perception among the discipline's proponents, not its potential students.
The old generation needs to look to a new generation, to new approaches. The focus is moving away from system construction. The jobs are in the application of technology. There is a need to be closer to the application, closer to the user, to replace a reductionist, convergent discipline with a complex, divergent discipline.
The complexity of embedded systems, of modern computing applications requires a different way of thinking. A reductionist, programming mindset does not adapt well to uncertainty, emergent behaviour, the unexpected and the study of the whole.
Relationships are important. The new computing discipline will really be an inter-discipline, connecting with other spheres, working with diverse scientific and artistic departments to create new ideas. Its strength and value will be in its relationships.
There is a need for innovation, for creativity, for divergent thinking which pulls in ideas from many sources and connects them in different ways.
The new computing department will be the department of interdisciplinary studies, drawing ideas from biology, design, history, medicine and contributing a rich computing foundation to those disciplines. It will be looking outwards rather than inwards, concerned to address the vast landscapes of computing application.
So how many computer science departments will exist in 30 years time? Perhaps a few will support the elite luxury liners. Most will have given way to interdisciplinary study departments, and computing service departments, producing innovative graduates who can corral and manage the IT resources organisations need.
Computer science curricula are old, stale and increasingly irrelevant. Curricula need to be vocational and divergent, widening the computing student's view of the world, not creating a sterile bubble closed off from the wider issues in the world, and from the networking, the integration and the global reach of computers.
There is a need for a drastic rethinking of what the discipline is about. There is a need for new curricula which represent a real paradigm shift, not just a move from keyboards to pen computing.
Here at De Montfort I run an ICT degree which does not assume that programming is an essential skill. The degree focuses on delivering IT services in organisations, taking a holistic view of computing within them.
Perhaps this represents an early move towards a new kind of computing discipline. As the roots rot and the tree falls a vast array of new saplings appear. Those saplings may be the start of a new inter-discipline: new computing for the 21st century.
Neil McBride is a principal lecturer in the School of Computing, De Montfort University.