I started my career in testing via a roundabout route. I began life as something developers hate even more than testers: an end-user with a basic technical knowledge. I would happily while away the hours tinkering with applications, finding "bugs" and gleefully reporting them.

After some time I was offered a secondment into something to do with IT; stupidly, I accepted before asking exactly what I would be doing. I wasn't too concerned, though: I thought a route into IT would lead on to a decent job, and whether that was in development or support, I didn't really mind.

Much to my surprise, I was asked to help with the user acceptance testing on a new project. At that time I knew that software needed to be tested, or at least broken, but I had no idea that testing was a career.

Fortunately, some experienced and highly paid consultant testers were on hand to show me the ropes. After working on both attempts to get this solution in place, I thought I had gained enough experience to join the testing team. How wrong I was!

The issue was that I was working in a business area I knew, and had worked in for the previous two years, so I was firmly in my comfort zone. I was enjoying what I was doing.

I was using this opportunity to put right what had once gone wrong, and although there were no functional specifications to speak of, I knew the business; I knew the user base; I thought I knew everything and that this testing lark was a doddle. It was only when I joined the testing team in the IT development pool that I realised what I had let myself in for.

At Thomas Cook we have a development pool made up of the main IT disciplines: Project Management, Analysis, Development and Testing. It was in the testing stream that I applied for, and got, a job.

This set-up differs greatly from the usual software house routine of building and maintaining a small number of systems. I thought that with my knowledge of the system I had used and then tested, I would be secure in what I was doing.

Fortunately I was sent on a training course within a few weeks of joining the pool, and soon learnt that there was a lot more to testing than I first thought. This was a little daunting to begin with, but I took the bull by the horns and started to test.

The main issue I encountered, and still encounter to this day, was that there were no functional specifications on the new project I started. This had been fine when I knew the business area for which the system was being built, but now I had problems.

So I started talking to the business users and trying to analyse their requirements. I now know that this is really a job for the analysts (if you have any), but I still see it as one of the essential skills of a tester; without it, you may deliver a system that is functionally sound but doesn't fit the business needs.

After working on a couple of further projects, each with its own unique obstacles, including an overseas implementation and ever-changing requirements, I was asked to help in the evaluation of a performance testing tool. This is where not only my naivety, but the whole team's, became apparent.

At this point the team was made up of a small number of testers, some taken from the business, myself, and one or two more experienced testers brought in from other companies. None of us had ever made a serious attempt at performance testing before, and it took a lot of hard work from one of my colleagues to get the sign-off for the spend, which was in the region of £50k.

The problem stemmed from our naïve approach: we tried to get the tool to do too much, combining functional tests with our performance tests, and, to our surprise, the tool was not capable of it.

We spent several months working with the tool vendor's consultants before coming to the conclusion that it wasn't working, and that we would have to go back to the old method of performance testing: pizza evenings.

I'm sure we are not the only company to have asked all of the users to come in on the promise of a free pizza, in return for the testers taking some manual timings after shouting "1-2-3, press enter now". I can tell you that I got a number of strange looks when buying the entire stock of stopwatches from our local Argos.

The next major step in my development was a visit to the EuroSTAR conference of 1998. I had only been in the testing team for six months, but was given the opportunity to attend. I know it sounds dull, a hotel with 800 testers in it, but it was an invigorating and eye-opening experience.

It was great to learn that everyone encountered the same sorts of issues that we had, and that there were people out there trying to make things better. This conference convinced me that testing is a career, and one that was only going to develop further.

As a team we persevered with testing tools, spurred on rather than deterred by our previous failure. We had heeded all of the warnings about testing tools becoming shelf-ware, but we wanted to move forward.

We decided to tighten up our evaluation methodology, instead of picking the first tool we saw that roughly suited our needs. We drew up a list of 25 requirements, prioritised them, and invited tool vendors into our office to demonstrate. We did set our sights a little lower this time and concentrated on test management tools, but we needed to evaluate our evaluation process.
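To give a flavour of the prioritisation, here is a minimal sketch of a weighted scoring scheme of the kind we used, written in Python for convenience. The requirements, weights and vendor marks below are invented for illustration; our real list ran to 25 requirements.

    # A minimal sketch of weighted requirement scoring for tool evaluation.
    # The requirements, weights and marks below are invented examples.

    # Each requirement carries a priority weight (higher = more important).
    requirements = {
        "test case management": 5,
        "defect tracking integration": 4,
        "reporting": 3,
        "ease of use": 4,
    }

    # Each vendor marks itself from 0 to 10 against every requirement.
    vendor_marks = {
        "Vendor A": {"test case management": 8, "defect tracking integration": 6,
                     "reporting": 9, "ease of use": 9},
        "Vendor B": {"test case management": 9, "defect tracking integration": 8,
                     "reporting": 7, "ease of use": 5},
    }

    # Weighted score = sum of (priority weight x vendor mark) per requirement.
    for vendor, marks in vendor_marks.items():
        score = sum(weight * marks[req] for req, weight in requirements.items())
        print(f"{vendor}: weighted score {score}")

Self-marked scores still need validating, of course, which is why we followed up with demonstrations.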

Around the same time, the whole team were invited to take the ISEB Foundation Certificate in Software Testing, and although the course only reconfirmed, and gave names to, the techniques we were already using, it was an excellent way of standardising the approach to testing within the team.

I'm not advocating that you turn all testers into robots; otherwise you will lose the individuality and ingenuity that I think a tester should bring to their job. It's just that we all need guidelines to work from. I could of course use this as an opportunity to discuss my own views of the company I work for.

Back to the tool evaluation. Part way through the evaluation of the test management tool, a whole string of new projects was announced, all internet-based, and if we had learnt nothing else from the conferences and seminars, we knew that performance was going to be key.

So we decided to bring in the major players, ask them to mark themselves against the evaluation criteria, and demonstrate their tools. We then brought back the main two and got them to prove their tools against one of our systems in a head-to-head test.

In the end the decision came down to a question of usability versus functionality. Knowing that the technical knowledge within the team was limited, we were heavily tempted by a user-friendly GUI; but remembering the lessons learnt before, we chose the more difficult option of learning the nuts and bolts of the more functional tool.

This is where we had the opportunity to enhance our technical and development skills: recording and customising performance scripts is something akin to programming.
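For anyone who hasn't seen one, here is a minimal sketch in Python of what a customised performance script boils down to. The URL, user names and think time are hypothetical, and a commercial tool records the raw requests for you, but the editing work, parameterising data, adding timings and running virtual users concurrently, looks much like this.

    # A minimal, hypothetical sketch of a parameterised performance script.
    # The URL and test data are invented; only the shape of the work is real.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    BASE_URL = "http://test-env.example.com/booking"  # hypothetical test system
    USERS = ["alice", "bob", "carol"]                 # parameterised test data
    THINK_TIME = 2.0                                  # pause between actions, in seconds

    def virtual_user(user: str) -> float:
        """Replay one user's journey and return the response time."""
        start = time.perf_counter()
        # A raw recording would hard-code a single user; customising the
        # script means substituting parameters so each virtual user differs.
        with urllib.request.urlopen(f"{BASE_URL}?user={user}", timeout=30) as resp:
            resp.read()
        elapsed = time.perf_counter() - start
        time.sleep(THINK_TIME)  # simulate the user pausing, as real users do
        return elapsed

    # Run the virtual users concurrently and report an average timing:
    # a repeatable replacement for stopwatches and pizza evenings.
    with ThreadPoolExecutor(max_workers=len(USERS)) as pool:
        timings = list(pool.map(virtual_user, USERS))
    print(f"mean response time: {sum(timings) / len(timings):.3f}s")

Scale the user list up and the scripting problems multiply, which is exactly where those nuts and bolts come in.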

In addition to the normal day-to-day role of a tester, which I will still claim is very varied given the various stages of the development life-cycle and the different projects and systems to work upon, there are other ways of becoming a more rounded tester.

It is difficult, because we all have rapidly approaching deadlines, but we must find time to read publications such as this one (and there are many others), to visit conferences and seminars, and to share our knowledge and experience. There are also testing discussion groups and forums on the web that can offer advice and help. The supply of information is endless, but we must learn to use it wisely.

As a tester, I find it very reassuring to know that there are other people out there, experiencing the same frustrations and emotions as I am. We must maintain the sense of community that the testing industry has built, and build further upon this to emphasise the importance of testing as we move into the rapidly changing world of Internet development.

So, in conclusion, what have we learnt (apart from an abridged version of my C.V.)? Well, although I agree that anyone can indeed execute a test, it takes a lot more thought, tact, guile and diplomacy to become a successful and professional tester.

Not only that, but you need to borrow from the skill-sets of the analyst and the developer if you wish to be a fully rounded tester. Add to that the slightly devious and destructive streak that we won't admit to having, but all recognise when we get that warm feeling after finding a major system fault. Or is that just me?

Owen Christy, Thomas Cook