Testing the Changes

Software systems are continuously growing in functional complexity, and are increasingly becoming key to the core activities of most organisations.

The pressure to develop new or improve existing software systems is greater than ever, with software failures having an ever more serious potential impact on the profitability or even viability of organisations.

When focusing on the ongoing maintenance of software systems alone, identifying how to re-test an application once changes have been made is not an exact science. Some pundits say that the entire application should be re-tested even for a one-line change, whereas others are convinced that only the "affected functionality" should be tested.

The first step in improving the regression testing process is the ability to manage the changes made to the software.

Change Management typically encompasses logging faults, managing software versions and documenting enhancements, but a major additional issue is determining the nature and location of the actual changes made to the code.

Most Change Management systems will show the location of changes by highlighting the affected lines, but they do not allow a wider architectural view of the implications.

Analysis of the source code before and after modification will provide a valuable insight into the nature and impact of modifications. Structural diagrams that visualise the location of changes, combined with sensible determination of the complexity (and therefore risk) of modified code allow project managers and maintenance engineers to assess the nature of the change.
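The complexity measure most commonly used for this kind of risk assessment is McCabe's cyclomatic complexity, V(G) = E - N + 2P, computed from a routine's control-flow graph. A minimal sketch, in which the graph representation and the example routine are purely illustrative:

```python
# Illustrative sketch: cyclomatic complexity V(G) = E - N + 2P,
# where E = edges, N = nodes and P = connected components (1 per routine).

def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity for a single routine's flow graph."""
    return len(edges) - len(nodes) + 2 * components

# A simple if/else routine: entry -> test -> (then | else) -> exit
nodes = {"entry", "test", "then", "else", "exit"}
edges = [("entry", "test"), ("test", "then"), ("test", "else"),
         ("then", "exit"), ("else", "exit")]

print(cyclomatic_complexity(edges, nodes))  # 5 - 5 + 2 = 2
```

A rise in this figure for a modified routine between releases is one signal that the change carries elevated testing risk.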

Metrics trending is also an important technique that can provide considerable benefits when managing ongoing development projects. The ability to pinpoint the nature of changes, evidenced by a change in software measurements, is a powerful tool when assessing the impact and extent of the modifications.
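Trending amounts to comparing per-module measurement snapshots between releases and flagging where they differ. A sketch of the idea, with hypothetical module names and metric values:

```python
# Hypothetical complexity snapshots per module for two releases.
release_1 = {"billing": 12, "auth": 5, "report": 9}
release_2 = {"billing": 12, "auth": 11, "report": 9, "export": 7}

def metric_deltas(old, new):
    """Report modules whose metric changed, appeared or disappeared."""
    modules = set(old) | set(new)
    return {m: (old.get(m), new.get(m))
            for m in modules if old.get(m) != new.get(m)}

print(metric_deltas(release_1, release_2))
# {'auth': (5, 11), 'export': (None, 7)} — 'auth' grew, 'export' is new
```

Modules whose metrics are unchanged can usually be deprioritised; those that appear or jump in complexity are the natural focus of the regression test.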

Once the nature and location of the change have been identified, the next important consideration is the "testedness" of the modification. A minimum requirement of a regression test is that it has exercised the changed code. Only after the modifications have been tested should additional tests be performed for unmodified code possibly impacted by the change.

Given that the majority of test schedules finish "early", it is doubly important to ensure that the modified code is tested first.

Test schedules finish "early" for many reasons, not least increasing commercial pressure to deliver sooner, tester fatigue and the constant growth in the functional complexity of software systems. These pressures force the time scheduled for testing to be reduced in the majority of projects.

Another important factor in testing modified code is the ability to sensibly determine that a software system has been tested to a common standard between releases.

Given that the functional requirements of different releases of the same system will differ, as will the amount of code, the only sensible technique for maintaining a common standard is testing coverage.

Percentage figures detailing the total code covered during testing, and showing the differences between releases of a software system, are the only concrete evidence of a common testing standard.
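The comparison itself is simple arithmetic; the value lies in tracking it release over release. A sketch with invented line counts:

```python
def coverage_pct(covered_lines, total_lines):
    """Total code coverage as a percentage."""
    return 100.0 * covered_lines / total_lines

# Hypothetical figures: the system grew but testing did not keep pace.
print(round(coverage_pct(8200, 10000), 1))  # release 1: 82.0
print(round(coverage_pct(8500, 11500), 1))  # release 2: 73.9
```

Even though more lines were covered in absolute terms in release 2, the falling percentage shows the second release was tested to a lower standard.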

Finally, I would like to return to the initial problem posed in this article: should a modified application be re-tested entirely, or only partially?

If partial re-testing is to be viable, then a technique for consolidating the coverage results from a previous version which did undergo full testing, and overlaying that coverage onto the code elements which were not changed is crucial.

When "added" to the coverage achieved during the regression test, the combined coverage should always be higher than that achieved during the previous full test. A reduced total coverage figure using this technique indicates that there is new functionality yet to be tested.
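The consolidation step can be sketched as set arithmetic over code elements: coverage carried forward from the previous full test is only valid for unchanged elements, and the regression results are added on top. The element names below are hypothetical:

```python
# Illustrative consolidation of coverage between releases.
# prev_coverage: elements covered in the previous full test;
# changed: elements modified in this release (old results no longer valid);
# regression_coverage: elements covered by the regression test.

def combined_coverage(prev_coverage, changed, regression_coverage):
    """Carry forward coverage of unchanged elements, add regression results."""
    carried = prev_coverage - changed
    return carried | regression_coverage

prev = {"a", "b", "c", "d"}        # full test covered everything
changed = {"c", "e"}               # "c" was modified, "e" is new code
regression = {"c"}                 # regression test exercised "c" only
print(sorted(combined_coverage(prev, changed, regression)))
# ['a', 'b', 'c', 'd'] — "e" remains untested, so total coverage falls
```

Here the combined figure covers 4 of 5 elements, down from the previous full test's 100%, which is exactly the signal that new functionality has yet to be tested.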

Many of these techniques have been created and pioneered by McCabe Associates, and are available in the newly released version 6.2 of the McCabe Visual Toolsets.

Matthew Brady, McCabe & Associates