Paper tigers take on real world

A few years back, paper examinations came under fire for dubious effectiveness in testing candidates' IT ability. Today, many certifications in practical subjects include simulated computer exercises, but is even that enough? Should candidates be taking examinations in real world conditions, using real servers and networks? Gary Flood investigates.

Remember the paper Microsoft Certified Systems Engineer row in the late 1990s? Despite the backlash then against what were held to be essentially meaningless qualifications, there are still real question marks over the value and significance of all the major qualifications out there.

'Five to ten years ago there were unquestionably too many paper qualified people flooding the job market,' confirms Rob Chapman, managing director of training outfit Firebrand Training (previously the Training Camp), which claims to be the single largest entity of its type in the country. 'That has abated, yes, but there are certainly some qualifications out there that could still be called into question.'

The issue is that what an exam tests is the ability to pass an exam, and not 'real world' abilities (see box). It's not inconceivable that, given the right set of books and online 'resources', a candidate with essentially no real IT ability could pass a couple of the more popular examinations and compete in the job market.

How far they'd get is a moot point. The problem is that if certification is an invalid process, why should punters bother undergoing it, training professionals endorse it and, ultimately, employers buy into it?

The defining characteristic, it seems, is that certification must test actual ability to perform a task, not just tick some boxes and get marks based on statistical chance. The charge is that not enough exams do this today, which IT companies adamantly deny.

'Any holder of a CCNA qualification [the major Cisco competency] has worked on real world issues and environments,' was the emphatic response of Jane Lewis, education manager, Cisco UK and Ireland, to any suggestion otherwise. Lewis explains that all candidates for the qualification spend 'at least 50 per cent' of their time in exam conditions working on problems on networks and routers linked to Cisco's 'real systems'. 'This is a truly hands-on examination process,' she insists.

Microsoft was unable to offer a spokesperson to debate these issues with IT Training, but there is objective evidence that the company is equally committed to a significant practical element in its education.

That may of course reflect the charges alluded to above - that for a couple of years after its introduction in the mid-1990s, the MCSE (one of the most sought-after Microsoft qualifications) was a bit too easy to put on your business card. In response, Microsoft has significantly toughened up the exams and introduced a healthy element of practical testing.

This testing revolves around simulation - making the candidate sit down in the exam room and confront a PC that they have to actually do something with, e.g. install a service or run a diagnostic. 'Microsoft deserves praise here,' notes Chapman. 'It brought in adaptive testing and scenario-based examinations using the virtual PC approach. In effect you have to both answer the paper questions but also show how you would do things in a simulation.'

But the key word - for some people at least - in that sentence is going to be 'simulation'. Is testing based on pre-set scenarios enough to really stretch the person trying to prove to you, me and his next boss that he can actually do the job?

This isn't a rhetorical question, as at least one major software organisation, Novell, says more is needed.

When the company absorbed Linux specialist SuSE a couple of years back, it also inherited that organisation's particular approach to granting qualifications. Welcome to the world of the 'practicum', as explained by Novell's head of education, Stephen King: 'We wanted something that wasn't the traditional multiple choice or fill-in-the-blank type test. We offer a real test, where you have to show what you can actually do.'

Thus anyone who lists either a 'CLP' or 'CLE' on their CV has passed just such a test. Specifically, to become a Certified Linux Practitioner or a Certified Linux Engineer the candidate has worked in exam conditions on a real instance of the operating system (i.e. SuSE Linux releases 9 and 10) on a real network, doing things like installing and configuring devices, granting user rights and so forth. (Indeed, holders of the older, now superseded CDE - Certified Directory Engineer, which ran from 2000 to 2004 - can claim the same thing: in this forerunner of the practicum they had to link to a real directory to do their stuff.)

How does all that differ from the Microsoft way of doing things? For King and others who back this philosophy of what a skills test should be, the point is that it is not a simulated environment. For up to three hours a prospective network manager faces a computer and is told he has that time to meet three objectives; he can only pass if he manages to do so.

'There has been a lot of simulation and neutral environment testing coming into exams, which is fine,' said King. 'But simulation is trying to recreate the environment; the practicum is the environment. Here you really are actually managing the servers.'

'A lot of firms use simulation-based approaches,' adds Andrew Mallett, principal technologist with QA-IQ, 'but are they just testing what you can do with a standard test engine? It seems to me the value of a practicum-style test is that it's not just hands-on; it tests not how you do it but whether you can do it at all.

'If all the answers to a test exist in a book somewhere, then if you are the sort who can page in information quickly you can pass that test. In contrast, doing tasks on a machine to completion means you really can do those tasks.'

To be fair, Novell is obviously promoting all this to increase the allure of its qualification portfolio. King said: 'We are getting great feedback from something not so easy to pass as other exams, and employers love this.'

There can be no denying that something of a gauntlet has been thrown down to the rest of the certification industry here - and not just to the ones who have brought in simulation but also to those bodies whose exams are still tick-box exercises. In that sense, QA-IQ's Mallett is right when he says the practicum idea is a 'challenge' to the whole training community.

Who will take up the challenge? If enough candidates and HR departments say they need such qualifications, we will have our answer.

Proves that I have the skill

One IT professional at least feels that the practicum approach to information technology skills acquisition has value. IT Training spoke to Andy Fox, an experienced consultant who provides technical advice to, as he puts it, 'small businesses that require IT solutions and support at a budget'. He told us:

'The practicum exam - unlike standard multiple guess exams - proves that I have the technical skills to configure SuSE Linux services in a real world environment.

'We have all heard the term "paper MCSE or CNE", where an individual has attended a one-to-two-week course aimed at getting them through the exams, and we understand that there is little to no value in those certifications as a result. Those of us who have been in the IT industry for some time also know about the numerous "brain dump" sites that exist, where people who have taken such exams share information about the questions with others, a process that devalues the certification process.

'The practicum proves that the individual can actually configure SuSE Linux and [so] has the skills required.'

Fox holds both the CLP and CLE qualifications, i.e. two certificates you can only get by passing a practical exam.

All qualifications need to be dovetailed into experience

When it comes to evaluating the real merit of a qualification a job candidate presents, it must be said that however many real-world elements the exam incorporates, it is still only one part of the picture regarding their ability.
 
This, at least, is the position of Geoff Chapman, EMEA head of communications at Prometric, a company that specialises in delivering many of the exams - covering skills around technologies such as Novell and Microsoft - that are at the heart of the debate over the right mix of academic and practical testing.
 
'An exam is only as good as the curriculum it is intended to validate,' he points out. 'And from an employer's perspective, certification is only ever a small part of the overall story. The reliability of certification will always have its place, and the testing of core competency will always be important. But it is also true that the market is moving towards a more holistic assessment of both knowledge and abilities: it is no longer sufficient just to test the individual's memory and recall.'
 
Chapman sees this as an 'encouraging' development, and praises Novell as being 'a real pioneer' in this regard - but it is not the only technology company getting it. 'The best IT companies, you can see, are always trying to develop their qualification portfolios and in some cases are setting the bar very high - I'd point to all the work Microsoft is putting into its new IT Architect qualification as being particularly interesting here.'
 
The message is that it all has to be about balance: 'Employers want to see solid evidence of experience, so any work done on certification, and qualifications work in general, has to be dovetailed into that. Organisations must ensure that their training courses and exams are an accurate reflection of the skills needed now and in future. IT in particular is a fast-moving industry, where different skill sets constantly need to be refreshed.

'In any case, we need more than technical skills. Employers are now looking for a greater variety of skills from workers, not just technology-specific qualifications. For example, skills such as project management are now recognised as being crucial, particularly where big IT projects risk going over budget or over schedule. The integration of practical elements into assessment gives a fair representation of skills and abilities, but dirty hands matter too.'

November 2007