Tuesday, October 22, 2013

Test Plan - Software/Firmware Testing Building Blocks

We, as Test Engineers, are already familiar with the formal building blocks of testing (the terminology) – the Test Plan, Test Case documents, and so on. There are many existing ideas and opinions about these documents. Big companies like Microsoft, IBM, and Google provide their own guidelines, and even IEEE has a standard that covers the Test Plan document. All of them are good, no doubt; they try to bring some industry standards into software/firmware testing. So I'm not going to give any new idea or standard here. Instead, I will summarize and consolidate them into one document that covers both the software and firmware areas.

In the following section, I describe a Test Plan (TP) template that can be used as a generic template. It is based on the guidelines provided by ANSI/IEEE Std 829-1983/829-1998, the IEEE Standard for Software Test Documentation. I have modified it in different places and reorganized it a little. All credit goes to those who wrote these steps, as mentioned in the sources at the end of the blog.

What is a TEST PLAN?

As defined in IEEE 829-1998, "Test plan is a management planning document describing the scope, approach, resources, and schedule of intended testing activities. Test plan identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning."

GENERIC TEST PLAN OUTLINE / TEMPLATE - AT A GLANCE


It may vary based on the nature of the product, business, and budget.

1. Test Plan Identifier
2. Introduction
3. Reference Documents
4. Objective
             a. Business Objectives
             b. Test Objectives
             c. Quality Objectives
5. Scope
6. Test Items
7. Features to Be Tested
8. Features Not to Be Tested
9. Product’s Testing Strategy / Approach
               a. Unit Testing
               b. Component Testing
               c. Black Box Testing 
               d. Integration Testing
               e. Conversion Testing
               f. System Testing
               g. User Interface Testing
               h. Security Testing
               i. Recovery Testing
               j. Globalization Testing
               k. Performance Testing
               l. Regression Testing
               m. Load Testing
               n. User Scenario Testing
               o. User Acceptance Testing
               p. Beta Testing
10. Pass/ Fail Criteria
11. Testing Process 
             a. Test Deliverables
             b. Testing Tasks
12. Test Management
             a. Individual roles and responsibilities
             b. Schedules 
             c. Staffing and Training 
d. Risks, Assumptions, and Constraints
13. Environmental Requirements
14. Control Procedures
15. Approvals


GENERIC TEST PLAN OUTLINE / TEMPLATE - DESCRIPTION

Detailed descriptions of each section follow:

1. Test Plan Identifier – A unique identifier that can be based on the product name, or any number that identifies the document uniquely.

2. Introduction – This section provides an overview/history of the project and briefly describes the items and features to be tested.

3. Reference Documents – Provide references to related documents, such as the Project Authorization, Project Plan, QA Plan, Configuration Management Plan, etc.

4. Objective – 
Business Objectives 
Specify the business objectives for the release. Identify which aspects/features are most important from a business perspective; they will be given the highest priority and the most extensive testing effort.

Test Objectives  
Identify the success criteria for the project that can be measured and reported. You can define the goals for the planned testing effort. For example, an objective might be to track successes, failures, defect status, and issues in order to provide feedback to development before software is delivered to customers. [3]

Quality Objectives 
List, in table format, the overall quality goals for the release, as well as the required entry and exit criteria for testing. Quality objectives are defined at the project level and implemented in individual test plans, where you can track whether each objective has been met. Typically, quality objectives provide various measurements of quality for the overall release, for example, the number or percentage of high-severity defects that are allowed or the number of failed execution records that are permitted. [3]

5. Scope – Specify the scope of the Test Plan. Describe specifically what the testing should accomplish: what to test and what not to test. For example, testing can be limited to the three major operating systems, without worrying about other OSes.

6. Test Items – Specify the items that are to be tested within the scope of the test plan – the different functions of the software. Also provide references to the required documents: Requirements doc, Design doc, Architecture doc, etc.

7. Features to Be Tested – Mention all the features and combinations of features/functions that are to be tested.

8. Features Not to Be Tested – Mention all features and specific combinations of features that will not be tested, along with the reasons.

9. Product’s Testing Strategy / Approach – The following testing methods will vary from company to company. I usually run these tests in different phases, so that I do not spend too much time trying to make the software/firmware perfect, which can create a big risk for the company of losing the whole product in a competitive market.

PHASE - I

Unit Testing 
Unit testing is testing directly at the most granular level. Take, for example, a method that accepts two values and returns a result. Does the method fail (crash, throw an exception, etc.) if either of the values is null or invalid? Does it return valid results given a specific set of values? Does it fail gracefully if given an incorrect set of values?
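
To make this concrete, here is a minimal sketch in Python (the divide() function and its expected behavior are made up for illustration, they are not part of the standard):

# A minimal, hypothetical example of unit testing one method in isolation.
# The function under test and its expected behavior are assumptions.
import unittest

def divide(numerator, denominator):
    """Return numerator / denominator; reject missing input."""
    if numerator is None or denominator is None:
        raise ValueError("arguments must not be None")
    return numerator / denominator

class DivideUnitTests(unittest.TestCase):
    def test_valid_values_return_expected_result(self):
        self.assertEqual(divide(10, 2), 5)

    def test_none_value_raises_value_error(self):
        with self.assertRaises(ValueError):
            divide(None, 2)

    def test_invalid_denominator_fails(self):
        with self.assertRaises(ZeroDivisionError):
            divide(10, 0)

if __name__ == "__main__":
    unittest.main()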

Component Testing 
Similar to unit testing, but with a higher level of integration. The big difference here is that the testing is done in the context of the application instead of just directly testing the method in question. The purpose of component testing is to ensure that the program logic is complete and correct, and that the component works as designed. [2]
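
As a hedged illustration of the difference (the ShoppingCart component below is hypothetical), the assertion is now made through the component's public interface, together with the logic it really uses, instead of against one isolated method:

# Hypothetical component test: the cart is exercised through its public API,
# together with the tax rule it actually uses, not one method in isolation.
import unittest

class ShoppingCart:
    TAX_RATE = 0.10  # the component's internal rule (an assumption here)

    def __init__(self):
        self._items = []

    def add_item(self, name, price):
        if price < 0:
            raise ValueError("price must be non-negative")
        self._items.append((name, price))

    def total_with_tax(self):
        subtotal = sum(price for _, price in self._items)
        return round(subtotal * (1 + self.TAX_RATE), 2)

class ShoppingCartComponentTest(unittest.TestCase):
    def test_total_includes_tax_for_all_items(self):
        cart = ShoppingCart()
        cart.add_item("pen", 2.00)
        cart.add_item("notebook", 8.00)
        # 10.00 subtotal plus 10% tax, computed by the component as a whole
        self.assertEqual(cart.total_with_tax(), 11.00)

if __name__ == "__main__":
    unittest.main()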

Black Box Testing 
Black box testing assumes the code to be a black box that responds to input stimuli. The testing focuses on the output to various types of stimuli in the targeted deployment environments. It focuses on validation tests, boundary conditions, destructive testing, reproducible tests, performance tests, globalization, and security-related testing.
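
For example, a boundary-condition check looks only at inputs and observable outputs, with no knowledge of the implementation (the age validator and its 0-120 range are assumptions for illustration):

# Hypothetical black-box boundary test: only inputs and outputs matter.
import unittest

def is_valid_age(age):
    """Accept integer ages from 0 to 120 inclusive (assumed spec)."""
    return isinstance(age, int) and 0 <= age <= 120

class AgeBoundaryTests(unittest.TestCase):
    def test_values_on_and_around_the_boundaries(self):
        # (input, expected) pairs chosen at and just outside each boundary
        cases = [(-1, False), (0, True), (1, True),
                 (119, True), (120, True), (121, False)]
        for age, expected in cases:
            self.assertEqual(is_valid_age(age), expected, msg=f"age={age}")

if __name__ == "__main__":
    unittest.main()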

Integration Testing 
Testing conducted in which software elements, hardware elements, or both are combined and tested, until the entire system has been integrated. The purpose of integration testing is to ensure that design objectives are met and that the software, as a complete entity, complies with operational requirements. In some references, full integration testing is also called system testing, although system testing is described separately below. [4]

Conversion Testing 
Testing performed to make sure that, if an old/legacy system exists, the data converted from the old system to the new one is migrated correctly and does not break the integrity of the data on the new system.
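
A small hedged sketch of what such a check can look like (the record layout and the verify_conversion() helper are assumptions, not a prescribed tool):

# Hypothetical conversion check: compare record counts and per-record checksums
# between a legacy export and the migrated data (both structures are made up).
import hashlib

def record_checksum(record):
    """Stable checksum of a record's fields, to detect silent corruption."""
    joined = "|".join(str(record[key]) for key in sorted(record))
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def verify_conversion(legacy_rows, migrated_rows, key="id"):
    """Return a list of readable integrity problems (empty list means OK)."""
    problems = []
    if len(legacy_rows) != len(migrated_rows):
        problems.append(f"row count changed: {len(legacy_rows)} -> {len(migrated_rows)}")
    migrated_by_key = {row[key]: row for row in migrated_rows}
    for row in legacy_rows:
        match = migrated_by_key.get(row[key])
        if match is None:
            problems.append(f"record {row[key]} missing after conversion")
        elif record_checksum(row) != record_checksum(match):
            problems.append(f"record {row[key]} was altered during conversion")
    return problems

# Usage sketch with made-up data:
legacy = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
migrated = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "B0b"}]
print(verify_conversion(legacy, migrated))  # -> ["record 2 was altered during conversion"]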

System Testing 
Testing conducted on a complete, integrated system (software and/or hardware) to evaluate the system's compliance with its specified requirements, and to ensure that the application operates in the production environment.

User Interface Testing 
Testing done to ensure that the application operates efficiently and effectively outside the application boundary with all interface systems. [4]

Security Testing 
Testing done to ensure that the application's system controls and auditability features are functional. [4]

Recovery Testing 
Testing done to ensure that application restart, backup, and recovery facilities operate as designed. [4]

Globalization Testing 
Execute test cases to ensure that the application block can be integrated with applications targeted toward locales other than the default locale used for development. [2]
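
As a rough sketch of the idea (the format_date() function and its locale table are assumptions), a globalization test verifies that locale-sensitive output is correct for a locale other than the development default:

# Hypothetical globalization test: the same value must render correctly for a
# locale other than the development default. The format map is an assumption.
import datetime
import unittest

DATE_FORMATS = {
    "en_US": "%m/%d/%Y",   # 10/22/2013
    "de_DE": "%d.%m.%Y",   # 22.10.2013
}

def format_date(value, locale_code):
    """Render a date using the convention of the requested locale."""
    return value.strftime(DATE_FORMATS[locale_code])

class GlobalizationTests(unittest.TestCase):
    def test_non_default_locale_renders_correctly(self):
        release_date = datetime.date(2013, 10, 22)
        self.assertEqual(format_date(release_date, "en_US"), "10/22/2013")
        self.assertEqual(format_date(release_date, "de_DE"), "22.10.2013")

if __name__ == "__main__":
    unittest.main()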


PHASE - II

Performance Testing 
Testing done to ensure that the application performs to customer expectations for response time, availability, portability, and scalability. [4]
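
Here is a minimal sketch of a response-time check (the operation, the number of runs, and the 0.5 second budget are all assumptions to be replaced by the plan's real requirements):

# Hypothetical performance check: time a critical operation over several runs
# and compare the 95th-percentile latency against an assumed budget.
import statistics
import time

RESPONSE_TIME_BUDGET_SECONDS = 0.5  # assumed requirement from the test plan

def critical_operation():
    """Stand-in for the real operation under test (e.g. a query or API call)."""
    time.sleep(0.01)

def measure(runs=50):
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        critical_operation()
        samples.append(time.perf_counter() - start)
    return statistics.quantiles(samples, n=20)[18]  # ~95th percentile

if __name__ == "__main__":
    p95 = measure()
    verdict = "PASS" if p95 <= RESPONSE_TIME_BUDGET_SECONDS else "FAIL"
    print(f"95th percentile response time: {p95:.3f}s -> {verdict}")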

Regression Testing 
Testing done to ensure that changes applied to the application have not adversely affected previously tested functionality. [4]
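
One common way to keep this repeatable (a sketch with a made-up defect ID and function) is to turn every fixed defect into a permanent test that is re-run on each build:

# Hypothetical regression test: a previously reported defect (fictitious ID)
# is pinned by a test so the fix cannot silently regress in later builds.
import unittest

def parse_quantity(text):
    """Bug 'FW-1042' (made up): leading/trailing spaces used to break parsing."""
    return int(text.strip())

class RegressionSuite(unittest.TestCase):
    def test_fw_1042_whitespace_quantity_is_accepted(self):
        # Reproduces the exact input from the original defect report.
        self.assertEqual(parse_quantity("  7 "), 7)

if __name__ == "__main__":
    unittest.main()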

Load Testing 
Load test the application block to analyze the behavior at various load levels. This ensures that it meets all performance objectives that are stated as requirements. [2]
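
The sketch below only illustrates the idea with threads and a stand-in operation; real load tests are normally driven by dedicated tools, and every number here is an assumption:

# Hypothetical load sketch: drive the operation with increasing numbers of
# concurrent workers and report throughput at each level.
import concurrent.futures
import time

def operation_under_load():
    time.sleep(0.01)  # stand-in for the real request or transaction

def run_level(workers, requests_per_worker=20):
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(operation_under_load)
                   for _ in range(workers * requests_per_worker)]
        concurrent.futures.wait(futures)
    elapsed = time.perf_counter() - start
    return (workers * requests_per_worker) / elapsed  # requests per second

if __name__ == "__main__":
    for level in (1, 5, 10):
        print(f"{level:>2} workers: {run_level(level):.1f} req/s")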


PHASE - III

User Scenario Testing 
Testing done to cover all the possible scenarios that users can perform. Think of out-of-the-box scenarios; think as a user to generate the scenarios a user might actually go through. It can be positive and/or negative testing. Walk through all the mouse clicks and keyboard presses a user may perform to get an action done (including logical and illogical steps). Aim for the “1% of people will do it“ scenarios. (I wrote a blog about this last month. You may find it interesting here).

User Acceptance Testing
Testing conducted to determine whether or not a system satisfies the acceptance criteria, and to enable the customer to determine whether or not to accept the system. Acceptance testing ensures that the customer's requirements and objectives are met and that all components are correctly included in a customer package. [2]

Beta Testing
Testing, done by the customer, using a pre-release version of the product to verify and validate that the system meets business functional requirements. The purpose of beta testing is to detect application faults, failures, and defects. [4]

10. Pass/Fail Criteria – Specify the criteria to be used to determine whether each item has passed or failed testing. [4]

Suspension Criteria
Specify the criteria used to suspend all or a portion of the testing activity on test items associated with the plan.

Resumption Criteria
Specify the conditions that need to be met to resume testing activities after suspension. Specify the test items that must be repeated when testing is resumed.

Approval Criteria
Specify the conditions that need to be met to approve test results. Define the formal testing approval process.

11. Testing Process – Identify the methods and criteria used in performing test activities. Define the specific methods and procedures for each type of test. Define the detailed criteria for evaluating test results.

Test Deliverables
Identify the deliverable documents from the test process. Test input and output data should be identified as deliverables. Test logs, test incident reports, test summary reports, and metrics reports must also be considered testing deliverables. [4]

Testing Tasks
Identify the set of tasks necessary to prepare for and perform testing activities. Identify all inter-task dependencies and any specific skills required. [4]

12. Test Management – 

Individual roles and responsibilities
Identify the groups responsible for managing, designing, preparing, executing, witnessing, checking, and resolving test activities. These groups may include developers, testers, operations staff, technical support staff, data administration staff, and user staff. [4]

Schedule
Identify the high-level schedule for each testing task. Establish specific milestones for initiating and completing each type of test activity, for the development of a comprehensive plan, for the receipt of each test input, and for the delivery of test output. Estimate the time required to do each test activity. When planning and scheduling testing activities, it must be recognized that the testing process is iterative, based on the testing task dependencies. [4]

Staffing and Training
Identify the resources allocated for the performance of testing tasks. Identify the organizational elements or individuals responsible for performing testing activities. Assign specific responsibilities. Specify resources by category. If automated tools are to be used in testing, specify the source of the tools, availability, and the usage requirements. [4]

Risks and Assumptions
Risk analysis should be done to estimate the amount and the level of testing that needs to be done. Risk analysis gives the necessary criteria about when to stop the testing process. Risk analysis prioritizes the test cases. It takes into account the impact of the errors and the probability of occurrence of the errors. [2]

13. Environmental Requirements – Specify both the necessary and desired properties of the test environment.

Hardware
Identify the computer accessories, physical device(s), related hardware, and network requirements needed to complete test activities. [4]

Software
Identify the software requirements needed to complete testing activities. [4]

Security
Identify the testing environment security and asset protection requirements. [4]

Tools
Identify the special software tools, techniques, and methodologies employed in the testing effort. Describe the purpose and use of each tool, along with plans for the acquisition, training, support, and qualification for each tool or technique. These could be different automation tools, or tools for performance and load testing. [4]

14. Control Procedures – [5]

Problem Reporting
Document the procedures to follow when an incident is encountered during the testing process. If a standard bug reporting process already exists, mention the Product/Project name under which all the bugs will be reported.

Change Requests
Document the process for modifications to the software. Identify who will sign off on the changes and what the criteria are for including the changes in the current product.

Dependencies
For any change request that affects existing programs, the affected modules need to be identified first.

15. Approvals – Identify the plan approvers. List the name, signature and date of plan approval.


Wow! It became a long template. I know it's a lot of things to record, but we only do it once or infrequently. It's part of the painful documentation process for a Test Engineer, but it is a vital part of the testing process.


Sources/ Credits:
5. Test plan sample: Software Testing and Quality Assurance Templates
6. Medical Device Software – Verification, Validation and Compliance, by David A. Vogel.

