We encourage our clients to take a holistic view of security. To our minds, the security of an organisation or a system is bound up in every aspect of that organisation: the management, the physical and the logical environments all combine to influence the security culture that exists.

In taking such an overall approach, testing inevitably forms part of that security, since without it vulnerabilities go undiscovered and remain on the live system.

The approach of many people to security is similar to their approach to testing. The first reaction is that it is something faintly disreputable: while they may agree that something needs to be done, the less they have to do with it the happier they are.

The second reaction is that it costs far too much; they would really like to have it all but they can only afford a little bit. They then look to the professionals to recommend which bit they should take.

Another reaction is to try to identify what they will get out of it. They never feel that they will get enough to justify the amount of time and effort it will cost. The only time people are really interested is when a failure occurs; at that point they want the whole thing done properly, on a shoestring and in time for the board meeting on Monday.

As I write this at the end of 2000 and look back over the past year, the overriding desire in business has been to put an "e" before everything. In doing this, the pace of IT development has been accelerated to gain a competitive advantage.

Unfortunately, in our haste to meet the deadlines we have dropped some of the best practices that have been painfully learnt over the years. This has resulted in many headline cases of security breaches and doubt in consumers' minds about the security of all internet environments.

In this article I will continue the emphasis on all things "e", but would ask you, the reader, to take the lessons we draw and apply them to your own projects, be they e-commerce or more traditional.

I would ask you, the testing community, to cast a critical eye over the systems you are testing. All too often the flaws in the security have been built in at the requirements and design stages and so prove very difficult to eradicate in the days or hours before implementation.

A typical design flaw leaves sensitive data on the web server – there are very few reasons for keeping this information there. If a web server has had to gather sensitive information then that information should be moved to a more secure system, preferably one behind a firewall.

If it is needed again, for instance for order tracking, then the information can be passed back; but only the information that is actually needed. Do not be afraid to challenge the assumptions that have been made in development. It is this flaw more than any other that hits the headlines because frequently it is credit card details and personal addresses that are revealed.
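
To make the point concrete, here is a minimal sketch in Python of what "pass back only what is needed" can look like; the internal service address and the field names are invented for illustration. The web tier asks an order-tracking service behind the firewall for a record and returns only the handful of fields the status page needs, so card details never sit on the public server.

    # Sketch only: the internal host, path and field names below are hypothetical.
    import json
    import urllib.request

    INTERNAL_ORDER_SERVICE = "http://orders.internal.example:8080"  # behind the firewall

    # The only fields the public order-tracking page needs; card numbers and
    # full addresses stay on the internal system and are never requested.
    PUBLIC_FIELDS = ("order_id", "status", "dispatch_date")

    def order_status_for_web(order_id):
        """Fetch an order from the internal service and return only the public fields."""
        with urllib.request.urlopen(f"{INTERNAL_ORDER_SERVICE}/orders/{order_id}") as resp:
            record = json.load(resp)
        # Strip everything the web tier does not need before it leaves this function.
        return {field: record[field] for field in PUBLIC_FIELDS if field in record}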

This type of breach is being widely reported in the national press at the moment and even as this is being written at Christmas 2000 there are reports of up to three million credit card numbers having been stolen from various servers.

Each of these numbers has a monetary value in the criminal world but frequently the designer does not take this into account and looks at the simpler problem of completing the transaction for the system owner.

Other information that is frequently revealed includes complete or partial customer lists. These are seen as having little monetary value, yet businesses regularly buy such lists to carry out marketing activities - this is your junk mail.

The basic solution to protecting this information is simple: it should not be kept on the web server, or anywhere else the public has access to. When this type of information is required it should be called for, and stored only for as long as the purpose requires. A thief then has to be able to make repeated calls from the web server, or while spoofing it, in order to gain access to the information.

Classify the data and keep sensitive data away from the web server

This makes the task harder and, as anyone who has secured his or her home knows, the real aim is to be a harder target than someone else. There will of course be occasions when data has to be kept on the web server, and these occasions should be kept to a minimum. Even then, separate classes of information should be kept on different servers.
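
One way to make such a classification testable is to write it down and check it automatically. The Python sketch below uses invented class names, server zones and data items; it simply flags any item whose classification does not permit the zone it is stored in.

    # Sketch only: a toy data-classification check; the classes, zones and items are invented.
    ALLOWED_ZONES = {
        "public":    {"web", "application", "database"},
        "internal":  {"application", "database"},
        "sensitive": {"database"},            # never on the web server
    }

    # Hypothetical inventory of where each item of data currently lives.
    DATA_INVENTORY = {
        "product_catalogue": ("public", "web"),
        "order_history":     ("internal", "application"),
        "card_numbers":      ("sensitive", "web"),    # this one should be flagged
    }

    def misplaced_items(inventory):
        """Return the items stored in a zone their classification does not allow."""
        return [name for name, (cls, zone) in inventory.items()
                if zone not in ALLOWED_ZONES[cls]]

    for item in misplaced_items(DATA_INVENTORY):
        print("WARNING:", item, "is stored outside its permitted zone")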

As an example of this, Bull Information Systems maintains an intranet web site and allows selected customers access to subsets of the information held there. In August 2000 an unfortunate "human error" allowed external web users access to all of the information; the breach was discovered by Kitetoa, a site that actively looks for lapses of this kind.

Many breaches are blamed on recent upgrades to "the system", and by their very nature e-systems will require frequent upgrades and changes to maintain their place in the market and to keep customers revisiting them.

If we cannot freeze these systems we must ensure that the upgrades and changes we deploy are robust, and you will not be surprised to hear that this can only be done through rigorous testing.

A classic example of the need for this is the breach that Egg suffered late in 1999, when a customer was able to access another customer's details. From what Egg has said, the breach occurred through human error during a routine update to the system.

The human aspect is a common factor in security breaches, which will come as no surprise to anyone involved in IT systems. For this reason it is important to test the full system, including its processes, to ensure that they provide a coherent method of managing the system and that they cannot be breached without it being noticed.

Carry out regression testing on the full system as part of each upgrade

Another frequent type of security breach is that caused by unexpected volumes of users. This year there have been several incidents of well publicised launches that have resulted in embarrassment when the service has proved unable to cope with demand and failed in the first few hours.

High-profile marketing usually preceded the launch, and the business wanted to be seen to attract large numbers of new customers. This was done very effectively - so effectively that the site went down and was not stable for two weeks after the launch.

From this we learn the old lesson that we need to know the capacity of our system, and how it will react if stressed. Tools are now available that let sites be tested in this way, and such testing should be an essential part of any system test. The follow-on from this is to look at the processes that deal with an overload and test them for robustness.
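
As a rough illustration of what such a test might look like, the Python sketch below fires a number of concurrent simulated users at a site and reports successes, failures and the slowest run; the URL and the volumes are invented, and a real capacity test would of course use a proper load-testing tool and realistic user journeys.

    # Sketch only: a crude concurrent load probe; the URL and volumes are invented.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TARGET = "https://shop.example.com/"   # hypothetical site under test
    CONCURRENT_USERS = 50
    REQUESTS_PER_USER = 10

    def one_user(_):
        """Simulate one user issuing a series of requests; return (ok, failed, elapsed)."""
        ok = failed = 0
        start = time.time()
        for _ in range(REQUESTS_PER_USER):
            try:
                with urllib.request.urlopen(TARGET, timeout=10) as resp:
                    if resp.status == 200:
                        ok += 1
                    else:
                        failed += 1
            except Exception:
                failed += 1
        return ok, failed, time.time() - start

    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = list(pool.map(one_user, range(CONCURRENT_USERS)))
    print("succeeded:", sum(r[0] for r in results), "failed:", sum(r[1] for r in results))
    print("slowest simulated user took %.1fs" % max(r[2] for r in results))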

Systems should be stress tested before deployment and during significant upgrades. Processes should be in place to cope with unexpected demand.

However, despite the breaches described above, I would venture to say that most security breaches on the Internet exploit old and well-known holes in the operating system or the server software. Despite the great effort many organisations put into initially securing the operating system, these holes continue to reappear.

This normally occurs during routine upgrades to elements of the software that are not checked to ensure the secure environment has been maintained. The regression testing and roll-out plan must verify, either manually or using automated tools, that the environment has not been compromised.
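
One simple automated check of this kind is to confirm after every roll-out that only the ports you intend to expose are actually listening. The Python sketch below does this with an invented host name and expected-port list; a real deployment would add checks for patch levels, default accounts and known-vulnerable service versions.

    # Sketch only: verify after an upgrade that no unexpected ports are exposed.
    # The host name, probe range and expected set are invented for illustration.
    import socket

    HOST = "www.example.com"
    EXPECTED_OPEN = {80, 443}          # the ports the service is meant to expose
    PORTS_TO_PROBE = range(1, 1025)    # the well-known ports

    def open_ports(host, ports):
        """Return the subset of ports that accept a TCP connection."""
        found = set()
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(0.5)
                if s.connect_ex((host, port)) == 0:
                    found.add(port)
        return found

    unexpected = open_ports(HOST, PORTS_TO_PROBE) - EXPECTED_OPEN
    if unexpected:
        print("WARNING: unexpected open ports after upgrade:", sorted(unexpected))
    else:
        print("Only the expected ports are open.")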

Check for known vulnerabilities

There is also an assumption that software that is widely used and tested must be secure. A mortgage broker found this not to be true when an external auditor found a security hole in the third-party software they used, a hole which revealed loan applications that had been filed.

The auditor carried out a check and found 27 other financial institutions using this software. This undoubtedly proves the worth of external testers and auditors, and reinforces the British Standard recommendation to use them (BS7799-1:1999, section 4.1.7). It turned out that a malicious employee at the software developer had introduced the vulnerability.

The final area I would like to emphasise is the human one. During this year there have been numerous incidents in which employees have revealed sensitive information to the wrong people. These have varied from a help desk accidentally e-mailing credit card lists to customers, to individuals reading information over the phone to callers whose identity had not been confirmed.

You may point out, quite rightly, that there is little the tester can do to stop this, but during testing you can assess screen layouts and warnings to see whether they are appropriate, and gain an impression of how sensitive information is treated. These may be subjective tests, but they can be as valuable as objective ones.

In this short paper it has not been possible to cover all aspects of security testing; for further advice, please contact us. We hope that this has given you an insight into some areas that need greater depth of testing.

In the testing world you are constantly being squeezed on time and budget to pass modules and systems. Your ideal of enough time may never be achievable, but by pointing out the impact on business reputation and the credibility of the organisation, you can justify testing on the grounds of security.

I would recommend making friends with the security department and developing a common strategy because it may be your account that is exposed.

Points of view

  • Designed for security?
  • Keep sensitive information away from the server?
  • Encrypt before storage
  • Automate checking and audit
  • Look for failure modes
  • Think simple
  • Regression test
  • Check sample and generated code

Steven Cox, Stoneyfore Consulting