200511316 김형석
The eight basic test documents: a) Test plan b) Test design specification c) Test case specification d) Test procedure specification e) Test item transmittal report f) Test log g) Test incident report h) Test summary report
Purpose - The purpose of the test plan is to prescribe the scope, approach, resources, and schedule of the testing activities, and to identify the items being tested, the features to be tested, the testing tasks to be performed, the personnel responsible for each task, and the risks associated with this plan.
Test plan identifier - Specify the unique identifier assigned to this test plan. Introduction - Summarize the software items and software features to be tested.
Introduction - References to the following documents are required in the highest level test plan: a) Project authorization b) Project plan c) Quality assurance plan d) Configuration management plan e) Relevant policies f) Relevant standards
Test items - Specify characteristics of their transmittal media that impact hardware requirements or indicate the need for logical or physical transformations before testing can begin. ex) programs must be transferred from tape to disk.
Test items - Supply references to the following test item documentation, if it exists: a) Requirements specification b) Design specification c) Users guide d) Operations guide e) Installation guide
Features to be tested - Identify all software features and combinations of software features to be tested. Features not to be tested - Identify all features and significant combinations of features that will not be tested and the reasons.
Approach - Describe the overall approach to testing. - Specify any additional completion criteria. ex) error frequency Item pass/fail criteria - Specify the criteria to be used to determine whether each test item has passed or failed testing. Suspension criteria & resumption requirements - Specify the criteria used to suspend all or a portion of the testing activity on the test items associated with this plan.
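As a sketch of how such criteria might be made concrete, the hypothetical rules below treat an item as passed only when every test case succeeds, and suspend testing when the observed error frequency exceeds a limit (the function names and the threshold are illustrative, not part of the standard):

```python
# Illustrative pass/fail and suspension criteria; names and threshold are hypothetical.

def item_passed(results):
    """An item passes only if every one of its test cases passed."""
    return all(results.values())

def should_suspend(errors_found, cases_run, max_error_rate=0.25):
    """Suspend testing when the error frequency exceeds the limit."""
    if cases_run == 0:
        return False
    return errors_found / cases_run > max_error_rate

results = {"TC-01": True, "TC-02": True, "TC-03": False}
print(item_passed(results))    # prints False: TC-03 failed
print(should_suspend(1, 3))    # prints True: 1/3 exceeds 0.25
```

Resumption requirements would then specify what must happen (e.g., a corrected test item is retransmitted) before `should_suspend` can return to False and testing restarts.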
Test deliverables - Identify the deliverable documents. - The following documents should be included. a) Test plan b) Test design specifications c) Test case specifications d) Test procedure specifications e) Test item transmittal reports f) Test logs g) Test incident reports h) Test summary reports
Testing tasks - Identify the set of tasks necessary to prepare for and perform testing. Environmental needs - Specify both the necessary and desired properties of the test environment.
Environmental needs - Identify special test tools needed. - Identify any other testing needs (e.g., publications or office space).
Responsibilities - Identify the groups responsible for managing, designing, preparing, executing, witnessing, checking, and resolving. Staffing and training needs - Specify test staffing needs by skill level. Identify training options for providing necessary skills.
Schedule - Define any additional test milestones needed. Estimate the time required to do each testing task. Risks and contingencies - Identify the high-risk assumptions of the test plan. Specify contingency plans for each (e.g., delayed delivery of test items might require increased night shift scheduling to meet the delivery date). Approvals - Specify the names and titles of all persons who must approve this plan. Provide space for the signatures and dates.
Purpose - To specify refinements of the test approach and to identify the features to be tested by this design and its associated tests.
Test design specification identifier - Specify the unique identifier assigned to this test design specification. - Supply a reference to the associated test plan. Features to be tested - Identify the test items and describe the features and combinations of features that are the object of this design specification.
Approach refinements - Specify refinements to the approach described in the test plan. - Include specific test techniques to be used. - The method of analyzing test results should be identified. - Specify the results of any analysis that provides a rationale for test case selection. ex) specify conditions that permit a determination of error tolerance
Approach refinements - Summarize the common attributes of any test cases. Test identification - List the identifier and a brief description of each test case associated with this design. - List the identifier and a brief description of each procedure associated with this test design specification.
Feature pass/fail criteria - Specify the criteria to be used to determine whether each feature or combination of features has passed or failed.
Purpose - To define a test case identified by a test design specification. Test case specification identifier - Specify the unique identifier assigned to this test case specification.
Test items - Identify and briefly describe the items and features to be exercised by this test case. - For each item, consider supplying references to the following test item documentation: a) Requirements specification b) Design specification c) Users guide d) Operations guide e) Installation guide
Input specifications - Specify each input required to execute the test case and all required relationships between inputs. Output specifications - Specify all of the outputs and features required of the test items.
Environmental needs a) Hardware b) Software - System SW such as operating systems, compilers, simulators, and test tools. Special procedural requirements - Describe any special constraints on the test procedures that execute this test case. Intercase dependencies - List the identifiers of test cases that must be executed prior to this test case.
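A test case specification with these sections can be captured in a structured form. The sketch below records the fields above for one hypothetical test case; the identifier, item names, and values are invented for illustration:

```python
# A hypothetical test case specification following the sections above.
# TC-LOGIN-001 and all of its contents are invented for illustration.
test_case = {
    "identifier": "TC-LOGIN-001",
    "test_items": ["login module v1.2"],   # with references to its documentation
    "input_specification": {"user": "alice", "password": "secret"},
    "output_specification": {"status": "logged_in"},
    "environmental_needs": {"hardware": "any", "software": ["OS", "test driver"]},
    "special_procedural_requirements": None,
    "intercase_dependencies": ["TC-STARTUP-001"],  # must run before this case
}

# Intercase dependencies let a tool order execution: prerequisites first.
print(test_case["intercase_dependencies"])
```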
Purpose - To specify the steps used to analyze a SW item in order to evaluate a set of features. Test procedure specification identifier - Specify the unique identifier assigned to this test procedure specification. Special requirements - Identify any special requirements that are necessary for the execution of this procedure.
Procedure steps a) Log b) Set up c) Start d) Proceed e) Measure f) Shut down g) Restart h) Stop i) Wrap up j) Contingencies
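The ten procedure steps map naturally onto a driver script. The sketch below is a hypothetical skeleton covering a few of them (Log, Set up, Start, Measure, Shut down); the function bodies are placeholders rather than a real harness:

```python
# Hypothetical skeleton of a test procedure; each function is one step above.
import datetime

log_entries = []

def log(event):
    """Log: record each event with date, time, and author identity."""
    log_entries.append((datetime.datetime.now().isoformat(), "tester", event))

def set_up():
    """Set up: prepare the environment needed to execute the procedure."""
    log("environment prepared")

def start():
    """Start: begin execution of the procedure."""
    log("procedure started")

def measure():
    """Measure: describe how test measurements are made."""
    log("response time measured")

def shut_down():
    """Shut down: suspend the procedure in an orderly way."""
    log("procedure shut down")

set_up(); start(); measure(); shut_down()
print(len(log_entries))  # prints 4: one chronological entry per step taken
```

The remaining steps (Proceed, Restart, Stop, Wrap up, Contingencies) would follow the same pattern, each appending its events to the log.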
Purpose - To identify the test items being transmitted for testing. It includes the person responsible for each item, its physical location, and its status. Transmittal report identifier - Specify the unique identifier assigned to this test item transmittal report.
Transmitted items - Identify the test items being transmitted, including their version/revision level. Location - Identify the location of the transmitted items. Status - Describe the status of the test items being transmitted. Approvals - Specify the names and titles of all persons who must approve this transmittal.
Purpose - To provide a chronological record of relevant details about the execution of tests. Test log identifier - Specify the unique identifier assigned to this test log.
Description - The following information should be considered. a) For each test item, supply a reference to its transmittal report, if it exists. b) Identify the attributes of the environments in which the testing is conducted. Activity and event entries - For each event, including the beginning and end of activities, record the occurrence date and time along with the identity of the author.
Activity and event entries a) Execution description b) Procedure results c) Environmental information d) Anomalous events e) Incident report identifiers
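A chronological log with these entry attributes might be represented as follows; this is a sketch, and the field names and values are hypothetical:

```python
# Hypothetical test log: one entry per activity or event, in time order.
from dataclasses import dataclass, field

@dataclass
class LogEntry:
    timestamp: str                 # occurrence date and time
    author: str                    # identity of the author
    execution_description: str
    procedure_result: str          # observable results
    environment: str
    anomalous_events: str = ""
    incident_report_ids: list = field(default_factory=list)

log = [
    LogEntry("2024-05-01T09:00", "tester A", "ran TP-07 against item v1.2",
             "output matched expected", "test lab"),
    LogEntry("2024-05-01T09:30", "tester A", "ran TP-08",
             "unexpected error message", "test lab",
             anomalous_events="crash on step d)",
             incident_report_ids=["IR-12"]),
]
print([e.incident_report_ids for e in log])  # prints [[], ['IR-12']]
```

Cross-referencing incident report identifiers from the log is what ties a logged anomaly to its investigation.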
Purpose - To document any event that occurs during the testing process that requires investigation. Test incident report identifier - Specify the unique identifier assigned to this test incident report. Summary - Summarize the incident and identify the test items involved indicating their version/revision level.
Incident description - This description should include the following items: a) Inputs b) Expected results c) Actual results d) Anomalies e) Date and time f) Procedure step g) Environment h) Attempts to repeat i) Testers j) Observers
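Those ten items can be collected into a single record. The sketch below fills them in for one hypothetical incident; every identifier and value is invented for illustration:

```python
# Hypothetical test incident report following items a)-j) above.
incident = {
    "identifier": "IR-12",
    "summary": "login module v1.2 rejects a valid password",
    "inputs": {"user": "alice", "password": "secret"},
    "expected_results": "status logged_in",
    "actual_results": "status access_denied",
    "anomalies": "error dialog shown twice",
    "date_time": "2024-05-01T09:30",
    "procedure_step": "TP-08, step d) Proceed",
    "environment": "test lab",
    "attempts_to_repeat": 2,
    "testers": ["tester A"],
    "observers": ["tester B"],
}
# The expected/actual mismatch is what makes this event worth investigating.
print(incident["expected_results"] != incident["actual_results"])  # prints True
```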
Purpose - To summarize the results of the designated testing activities and to provide evaluations based on these results. Test summary report identifier - Specify the unique identifier assigned to this test summary report. Summary - Summarize the evaluation of the test items.
Variances - Report any variances of the test items from their design specifications. Comprehensiveness assessment - Evaluate the comprehensiveness of the testing process against the comprehensiveness criteria specified in the test plan, if the plan exists. Summary of results - Summarize the results of testing. Evaluation - Provide an overall evaluation of each test item including its limitations. This shall be based on the test results and the item level pass/fail criteria.
Summary of activities - Summarize the major testing activities and events. Approvals - Specify the names and titles of all persons who must approve this report.
Consider documenting sets of modules at the module test level. While different test cases would be required, a common test procedure specification might be appropriate.
Documents by type of test (columns: Test plan, Test design specification, Test case specification, Test procedure specification, Test item transmittal report, Test log, Test incident report, Test summary report; X = document used at that level):
Acceptance    X X X X X X X
Field         X X X X X
Installation  X X X X X X
System        X X X X X X X X
Subsystem     X X X X X X X
Program       X X X
Module        X X X