

UNIT TESTING

Testing occurs at many steps in the software development lifecycle. The purpose of testing is to identify program flaws. Software is typically broken down into numerous modules, functions, or subroutines, and the process of testing these individual portions of a program is termed unit testing. Other forms of testing occur throughout the development process: in addition to testing individual software modules, testing also takes place during module integration, at the box and system level, and during field trials. This chapter addresses the production of test data for testing at the software unit or module level. Performing the tests confirms the module’s adherence to its requirements and helps uncover flaws.

Test plans written for individual software modules are called unit test plans and consist of multiple calls passing input data to the module under test (MUT). For each set of input data, the test plan indicates what the module’s response should be. Unit test plans are typically generated manually by the developer when designing a new module.

Some of the more popular unit testing techniques are equivalence partitioning, structural testing, constraint-based testing, and functional testing. Module testing without detailed knowledge of the source code is called black box testing; the tester is provided with no more than a description of the module under test and its allowed parameter values. White box (or structural) testing depends on the tester having the source code available to help identify test data likely to draw out design flaws. Functional testing and equivalence partitioning are black box techniques; structural and constraint-based testing are white box techniques. The generation of test data is a problem fundamental to all testing techniques.

In general, any form of software testing tries to limit the number of test cases run. It is simply not economical (or even feasible) to run through all possible permutations of program control and data flow, because the number of possible paths or threads of execution through a module can quickly become unmanageable. Structural testing instead attempts to execute a set of test cases that exercises every line of code at least once, comparing the actual results with those expected.

The goal of the current effort is twofold: (1) to investigate the applicability of genetic algorithms to automate unit test data generation, and (2) to identify factors which improve the performance of the genetic algorithm in generating test data. This is the first step in designing a utility to automatically generate unit test data.

Much work remains in the development of this utility. A C language parser is needed to analyze and instrument the source code and then set up the genetic algorithm. The utility would be invoked by adding a new .ut rule to a development project’s Makefile. The Unix make utility would instruct the parser to analyze and instrument a copy of the indicated source code file. The parser would then tailor the genetic algorithm’s source code to the module under test. The instrumented copy of the module under test and the genetic algorithm’s source code would then be compiled and executed to produce a list of test data.

If the MUT included a description of its software requirements, delimited by a special token the parser is designed to detect, the requirements could be extracted and saved by the parser. A suitable test plan could then be constructed by concatenating the saved requirements with driver code that calls the module under test using the newly produced test data. The tester remains responsible for verifying that actual performance matches the listed requirements. As described, such a utility would produce unit test data on demand and provide the following benefits:

  Eliminate the need for the storage and maintenance of unit test plans
  Reduce cycle time by automating the production of test data
  Eliminate traceability problems when multiple versions of a source module exist



Copyright © CRC Press LLC