Tuesday, October 22, 2013

Test Plan - Software/firmware testing building blocks

We, the Test Engineers, are already familiar with the formal building blocks of testing (the terminology) - the Test Plan, the Test Case documents, etc. There are already many different ideas/thoughts about the Test Plan and Test Cases. Many big companies like Microsoft, IBM, and Google try to provide guidelines for these documents. Even IEEE has a standard that gives guidelines for the Test Plan document. All of them are good, no doubt; they have tried to bring some industry standard to software/firmware testing. So, I'm not going to give any new idea or standard here, but would rather summarize/consolidate all of them into one document that covers both the software and firmware areas.

In the following section, I will be writing about a Test Plan (TP) template that can be used as a generic template. It is based on the guidelines provided by ANSI/IEEE Standard 829-1983/829-1998, the IEEE Standard for Software Test Documentation. I have modified it in different places and re-organized it in a slightly different way. All credit goes to those who wrote these steps, as mentioned in the sources at the end of the blog.

WHAT IS A TEST PLAN?

As defined in IEEE 829-1998, "Test plan is a management planning document describing the scope, approach, resources, and schedule of intended testing activities. Test plan identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning."

GENERIC TEST PLAN OUTLINE / TEMPLATE - AT A GLANCE


It may vary based on the nature of the product, business, and budget.

1. Test Plan Identifier
2. Introduction
3. Reference Documents
4. Objective
             a. Business Objectives
             b. Test Objectives
             c. Quality Objectives
5. Scope
6. Test Items
7. Features to Be Tested
8. Features Not to Be Tested
9. Product’s Testing Strategy / Approach
               a. Unit Testing
               b. Component Testing
               c. Black Box Testing 
               d. Integration Testing
               e. Conversion Testing
               f. System Testing
               g. User Interface Testing
               h. Security Testing
               i. Recovery Testing
               j. Globalization Testing
               k. Performance Testing
               l. Regression Testing
               m. Load Testing
               n. User Scenario Testing
               o. User Acceptance Testing
               p. Beta Testing
10. Pass/Fail Criteria
11. Testing Process 
             a. Test Deliverables
             b. Testing Tasks
12. Test Management
             a. Individual roles and responsibilities
             b. Schedules 
             c. Staffing and Training 
             d. Risks, Assumptions, and Constraints
13. Environmental Requirements
14. Control Procedures
15. Approvals


GENERIC TEST PLAN OUTLINE / TEMPLATE - DESCRIPTION

Detailed descriptions of each section follow:

1. Test Plan Identifier – A unique identifier, which can be based on the product or can be any number that identifies the document uniquely.

2. Introduction – This section provides an overview/history of the project and briefly describes the items and features to be tested.

3. Reference Documents – Provide references to the different related documents, such as the Project Authorization, Project Plan, QA Plan, Configuration Management Plan, etc.

4. Objective – 
Business Objectives 
Specify the business objectives for the release: which aspects/features are more important business-wise. These will be given the highest priority and the most extensive testing effort.

Test Objectives  
Identify the success criteria for the project that can be measured and reported. You can define the goals for the planned testing effort. For example, an objective might be to track successes, failures, defect status, and issues in order to provide feedback to development before software is delivered to customers.3

Quality Objectives 
List, in table format, the overall quality goals for a release, as well as the required entry and exit criteria for testing. Quality objectives are defined at the project level and implemented in individual test plans, where you can track whether each objective has been met. Typically, quality objectives provide various measurements of quality for the overall release, for example, the number or percentage of high-severity defects that are allowed or the number of failed execution records that are permitted.3

5. Scope – Specify the scope of the Test Plan. Describe specifically what the testing should accomplish, what to test and what not to test. For example, it can be limited to testing on three major operating systems, without worrying about other OSes.

6. Test Items – Specify the things that are to be tested within the scope of the test plan - the different functions of the software. Also provide references to the required documents - the Requirements doc, Design doc, Architecture doc, etc.

7. Features to Be Tested – Mention all the features and combinations of features/functions that are to be tested.

8. Features Not to Be Tested – Mention all features and specific combinations of features that will not be tested, along with the reasons.

9. Product’s Testing Strategy / Approach – The following testing methods will vary from company to company. I usually do the following testing in different phases so that I do not take too much time trying to make the software/firmware perfect, which can create a big risk for the company of losing the whole product in a competitive market.

PHASE - I

Unit Testing 
Unit testing is testing directly at the most granular level. Given a method that takes two values and returns a result: does the method fail (crash, throw an exception, etc.) if either of the values is null or invalid? Does it return valid results given a specific set of values? Does it fail if given an incorrect set of values?
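
As a quick illustration, here is a minimal unit-test sketch in Python. The safe_divide function and its behavior are hypothetical, made up only to show the three questions above:

```python
import unittest

def safe_divide(a, b):
    """Hypothetical unit under test: divides a by b."""
    if a is None or b is None:
        raise ValueError("arguments must not be None")
    return a / b

class SafeDivideTests(unittest.TestCase):
    def test_valid_inputs_return_expected_result(self):
        self.assertEqual(safe_divide(10, 2), 5)

    def test_null_input_fails(self):
        # Does the method fail cleanly if a value is null?
        with self.assertRaises(ValueError):
            safe_divide(None, 2)

    def test_invalid_input_fails(self):
        # Does it fail if given an incorrect set of values?
        with self.assertRaises(ZeroDivisionError):
            safe_divide(1, 0)

if __name__ == "__main__":
    unittest.main()
```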

Component Testing 
Similar to unit testing, but with a higher level of integration. The big difference here is that the testing is done in the context of the application instead of just directly testing the method in question. The purpose of component testing is to ensure that the program logic is complete and correct and that the component works as designed.2

Black Box Testing 
Black box testing assumes the code to be a black box that responds to input stimuli. The testing focuses on the output to various types of stimuli in the targeted deployment environments. It focuses on validation tests, boundary conditions, destructive testing, reproducible tests, performance tests, globalization, and security-related testing.
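
For instance, a boundary-condition test written purely against the interface, with no knowledge of the internals. The clamp_percentage function and its 0-100 range are hypothetical:

```python
import unittest

def clamp_percentage(value):
    """Hypothetical system under test: clamps input to the range 0-100."""
    return max(0, min(100, value))

class ClampBoundaryTests(unittest.TestCase):
    """Black-box tests: only inputs and outputs, never the internals."""

    def test_boundary_values(self):
        # Exercise values at, just below, and just above each boundary.
        cases = [(-1, 0), (0, 0), (1, 1), (99, 99), (100, 100), (101, 100)]
        for given, expected in cases:
            self.assertEqual(clamp_percentage(given), expected)

if __name__ == "__main__":
    unittest.main()
```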

Integration Testing 
Testing conducted in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated. The purpose of integration testing is to ensure that design objectives are met and that the software, as a complete entity, complies with operational requirements. Integration testing is also sometimes called system integration testing.4
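
A tiny sketch of the idea: two hypothetical components (a record parser and a report function) are exercised together through their combined path, rather than each in isolation:

```python
import unittest

def parse_record(line):
    """Component A: parses a 'name,score' text line into a dict."""
    name, score = line.split(",")
    return {"name": name.strip(), "score": int(score)}

def top_scorer(lines):
    """Component B: uses parse_record to find the highest-scoring name."""
    records = [parse_record(line) for line in lines]
    return max(records, key=lambda r: r["score"])["name"]

class IntegrationTests(unittest.TestCase):
    """Drives both components together, end to end."""

    def test_top_scorer_through_both_components(self):
        self.assertEqual(top_scorer(["alice, 7", "bob, 9"]), "bob")

if __name__ == "__main__":
    unittest.main()
```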

Conversion Testing 
Testing performed to make sure that, if an old/legacy system exists, the data converted from the old system to the new one is converted properly and does not break the integrity of the data on the new system.
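
A minimal sketch of such a check, assuming SQLite databases and made-up table/column names. A real conversion test would also compare field values and checksums, not just keys:

```python
import sqlite3

def verify_migration(old_db, new_db, table="records", key="record_id"):
    """Compares row counts and primary-key sets between the legacy
    database and the new one. Table and column names are hypothetical."""
    with sqlite3.connect(old_db) as old, sqlite3.connect(new_db) as new:
        old_count = old.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        new_count = new.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        assert old_count == new_count, "row count changed during conversion"
        old_keys = {r[0] for r in old.execute(f"SELECT {key} FROM {table}")}
        new_keys = {r[0] for r in new.execute(f"SELECT {key} FROM {table}")}
        assert old_keys == new_keys, f"keys lost: {sorted(old_keys - new_keys)}"
```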

System Testing 
Testing conducted on a complete, integrated system (software and/or hardware) to evaluate the system's compliance with its specified requirements. This testing ensures that the application operates in the production environment.

User Interface Testing 
Testing done to ensure that the application operates efficiently and effectively outside the application boundary with all interface systems.4

Security Testing 
Testing done to ensure that the application's system controls and auditability features are functional.4

Recovery Testing 
Testing done to ensure that the application's restart, backup, and recovery facilities operate as designed.4

Globalization Testing 
Execute test cases to ensure that the application block can be integrated with applications targeted toward locales other than the default locale used for development.2
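
For example, running the same case against several locales instead of only the default development locale. The format_date function and its locale table are hypothetical:

```python
import unittest
from datetime import date

def format_date(d, locale_code):
    """Hypothetical formatter under test: renders a date per locale."""
    patterns = {"en_US": "{m}/{day}/{y}", "de_DE": "{day}.{m}.{y}"}
    return patterns[locale_code].format(m=d.month, day=d.day, y=d.year)

class GlobalizationTests(unittest.TestCase):
    """Runs the same test case against multiple target locales."""

    def test_date_formatting_per_locale(self):
        d = date(2013, 10, 22)
        self.assertEqual(format_date(d, "en_US"), "10/22/2013")
        self.assertEqual(format_date(d, "de_DE"), "22.10.2013")

if __name__ == "__main__":
    unittest.main()
```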


PHASE - II

Performance Testing 
Testing done to ensure that the application performs to customer expectations for response time, availability, portability, and scalability.4

Regression Testing 
Testing done to ensure that the applied changes to the application have not adversely affected previously tested functionality.4
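
A classic form of this is pinning down a previously fixed defect with a test, so that a later change cannot silently re-introduce it. Everything in this sketch (the function and the old bug) is hypothetical:

```python
import unittest

def parse_quantity(text):
    """Hypothetical function with a previously fixed bug: it used to
    crash on inputs that had surrounding whitespace."""
    return int(text.strip())

class RegressionTests(unittest.TestCase):
    def test_whitespace_bug_stays_fixed(self):
        # Regression test for the (hypothetical) old defect: " 42 " crashed.
        self.assertEqual(parse_quantity(" 42 "), 42)

if __name__ == "__main__":
    unittest.main()
```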

Load Testing 
Load test the application block to analyze the behavior at various load levels. This ensures that it meets all performance objectives that are stated as requirements.2
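
A rough sketch of stepping up the load and watching the latency distribution. The call_endpoint stub and the load levels are placeholders for the real system and its stated requirements:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def call_endpoint():
    """Placeholder for one request to the system under test."""
    start = time.perf_counter()
    time.sleep(0.01)  # stand-in for real work, e.g. an HTTP request
    return time.perf_counter() - start

def run_load_test(concurrent_users, requests_per_user):
    """Drives the system at one load level and reports latencies."""
    total = concurrent_users * requests_per_user
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        latencies = sorted(pool.map(lambda _: call_endpoint(), range(total)))
    print(f"users={concurrent_users} "
          f"avg={sum(latencies) / total:.4f}s "
          f"p95={latencies[int(0.95 * total)]:.4f}s")

# Step the load up and check each level against the performance objectives.
for users in (1, 10, 50):
    run_load_test(users, requests_per_user=20)
```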


PHASE - III

User Scenario Testing 
Testing done to cover all the possible scenarios that can be performed by users. Think of out-of-the-box scenarios. Think like a user to generate scenarios that a user could go through. It could be positive and/or negative testing. Go through all the mouse clicks and keyboard presses that the user may go through to get an action done (including logical and illogical steps). Aim for the "1% of people will do it" scenarios. (I wrote a blog about it last month. You may find it interesting here.)

User Acceptance Testing
Testing conducted to determine whether or not a system satisfies the acceptance criteria and to enable the customer to determine whether or not to accept the system. Acceptance testing ensures that customer requirements' objectives are met and that all components are correctly included in a customer package.2

Beta Testing
Testing, done by the customer, using a pre-release version of the product to verify and validate that the system meets business functional requirements. The purpose of beta testing is to detect application faults, failures, and defects.4

10. Pass/Fail Criteria – Specify the criteria to be used to determine whether each item has passed or failed testing.4

Suspension Criteria
Specify the criteria used to suspend all or a portion of the testing activity on test items associated with the plan.

Resumption Criteria
Specify the conditions that need to be met to resume testing activities after suspension. Specify the test items that must be repeated when testing is resumed.

Approval Criteria
Specify the conditions that need to be met to approve test results. Define the formal testing approval process.

11. Testing Process – Identify the methods and criteria used in performing test activities. Define the specific methods and procedures for each type of test. Define the detailed criteria for evaluating test results.

Test Deliverables
Identify the deliverable documents from the test process. Test input and output data should be identified as deliverables. Test report logs, test incident reports, test summary reports, and metrics reports must also be considered testing deliverables.4

Testing Tasks
Identify the set of tasks necessary to prepare for and perform testing activities. Identify all inter-task dependencies and any specific skills required.4

12. Test Management – 

Individual roles and responsibilities
Identify the groups responsible for managing, designing, preparing, executing, witnessing, checking, and resolving test activities. These groups may include the developers, testers, operations staff, technical support staff, data administration staff, and the user staff.4

Schedule
Identify the high-level schedule for each testing task. Establish specific milestones for initiating and completing each type of test activity, for the development of a comprehensive plan, for the receipt of each test input, and for the delivery of each test output. Estimate the time required for each test activity. When planning and scheduling testing activities, it must be recognized that the testing process is iterative, based on the testing task dependencies.4

Staffing and Training
Identify the resources allocated for the performance of testing tasks. Identify the organizational elements or individuals responsible for performing testing activities. Assign specific responsibilities. Specify resources by category. If automated tools are to be used in testing, specify the source of the tools, availability, and the usage requirements.4

Risks and Assumptions
Risk analysis should be done to estimate the amount and the level of testing that needs to be done. Risk analysis gives the necessary criteria about when to stop the testing process. Risk analysis prioritizes the test cases. It takes into account the impact of the errors and the probability of occurrence of the errors.2

13. Environmental Requirements – Specify both the necessary and desired properties of the test environment.

Hardware
Identify the computer accessories/physical device(s)/related hardware and network requirements needed to complete test activities.4

Software
Identify the software requirements needed to complete testing activities.4

Security
Identify the testing environment security and asset protection requirements.4

Tools
Identify the special software tools, techniques, and methodologies employed in the testing efforts. The purpose and use of each tool shall be described, along with plans for the acquisition, training, support, and qualification of each tool or technique. These could be different automation tools, or tools for performance and load testing.4

14. Control Procedures 5

Problem Reporting
Document the procedures to follow when an incident is encountered during the testing process. If a standard bug-reporting process already exists, mention the product/project name under which all the bugs will be reported.

Change Requests
Document the process for making modifications to the software. Identify who will sign off on the changes and what the criteria would be for including the changes in the current product.

Dependencies
For any change request that affects existing programs, the affected modules need to be identified first.

15. Approvals – Identify the plan approvers. List the name, signature and date of plan approval.


Wow! It became a long template. I know, it is a lot of things to record, but we only do it once, or infrequently. It's part of the painful documentation process for a Test Engineer, but it is a vital part of the Testing Process.


Sources/Credits:
5. Test plan sample: Software Testing and Quality Assurance Templates
6. Medical Device Software - Verification, Validation and Compliance, by David A. Vogel.

Tuesday, October 15, 2013

Story - Identifying a mystery bug

QA life experience ... 

Software testing - yeah, I know, such a painful, tedious job: testing the same thing again and again. I was once given a task to identify a mystery bug that was driving the arm of a device into an auto-rotating mode. Obviously, I cannot write anything detailed about it. It was one of my most interesting bug-hunting incidents, one that I'll remember for a long time. This issue was given to me with the high hope that I would be able to find it, and I was able to find it (without knowing any details of the code/logic).


In my simple words, a bug is when any function does not work the way it should, or does something it should not do.

Let's start by describing the problem first. The Customer Service department was receiving calls about an issue where the arm of a device was rotating on its own. It was happening out there in the field, randomly, with different heavy metal attachments on the arm. It was very confusing, at least from what was recorded in text. There were several attachments that could be attached to the arm. Each attachment would then be set to a certain Range of Motion (ROM). The ROM can be described as a circular path; it can be up to a 360-degree rotation. Once the ROM is set, the user has to verify it by moving the arm within the specified ROM. After that, when the test starts from a starting point of the ROM, the attachment/arm moves within the specified ROM. It was supposed to move only when someone moved the arm; in other words, when some force was applied to the arm, it would start to move. In short, this was the way this device operated.

When I was told to find the issue, I obviously looked at the text of the complaints from the customers. There were a few of them, recorded in different ways. My first attempt was to try things as per the descriptions. To understand more about the customer complaints, I actually talked with the customer service department. They gave me a few details that I felt were important but were not recorded in the report. (Always dig for more information. You never know what/when/who will give you the CLUE.)

For the next few days, I tried to perform tests according to the descriptions/input from the customers and the service department and tried to duplicate the issue. I could not figure out what was going on. When something like this happens, I usually take a break from testing and go back to the drawing board of the test plan. This helps me calm down and rethink my testing approach. I felt that I did not have a full grip on the issue; I didn't understand how the system should work for each attachment. So I went to talk with some people who knew more about these products, and they explained in more detail how things should work.

Now, once I knew more about the issue and had more detailed information about the product, I started to lay out my testing approach differently. When the test starts, it is supposed to start from the starting position, which is either of the ROM end limits. This was the most important part I had missed during my previous tries, which made me a little bit frustrated as well: I had been following only what was reported or what I was told by the people knowledgeable about the product. As a tester, this is why I always try to think outside the box, outside the conventional method. Anyway, when I tested the thing from either end, it was okay. When I performed it from the center of the ROM limits, it even worked!!! I was going crazy. Then I thought about applying force on it at the time of start, and from the center position of the ROM. Yahoooo! I got it, I did it. I was able to duplicate the issue where the arm starts moving automatically.

It did it when the arm was in the center of the ROM and some force was applied to the arm that pushed it past the middle point. Once it missed the middle point of the ROM, it kept trying to find the middle point and moved on indefinitely.

So, why? Why did I apply force? Just think about it logically, or I should say practically. As I described earlier, attachments are attached to the arm, and then during testing the arm moves. When the attachments are being attached, the arm is supposed to stand still - not to move one inch. Then a command is sent to the controller to release the arm, and the arm moves. The attachments have their own weight. Let's say I positioned the arm right before the center of the ROM. Then, when I put those metal attachments on, the arm moved just a little, passing the center point of the ROM, and then it continuously moved on. Some logic was apparently being used to identify the middle of the ROM after the arm started to move, for some reason.
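
I never saw the actual code, but here is a plausible reconstruction in Python of the kind of midpoint-seeking logic that produces exactly this behavior. Every name and number here is my own invention, not the device's firmware:

```python
class SimulatedArm:
    """Toy stand-in for the device arm so the sketch runs on its own."""
    def __init__(self, position):
        self._pos = position

    def position(self):
        return self._pos

    def drive(self, direction):
        self._pos += direction * 1.5  # each nudge overshoots slightly

def seek_midpoint(arm, rom_start, rom_end, max_steps=10):
    """Hypothetical defect: if the arm is already past the midpoint, the
    sign of the correction flips on every overshoot, so the arm 'hunts'
    back and forth instead of settling."""
    midpoint = (rom_start + rom_end) / 2.0
    for step in range(max_steps):  # capped here; the device had no cap
        if arm.position() == midpoint:
            return  # exact match: almost never happens in practice
        arm.drive(+1 if arm.position() < midpoint else -1)
        print(f"step {step}: position={arm.position():.1f}")

# The attachment's weight pushed the arm just past the center
# (ROM 0-360 degrees, midpoint 180), and the hunting begins.
seek_midpoint(SimulatedArm(position=180.5), 0, 360)
```

Because each correction overshoots, the simulated arm oscillates around the midpoint forever, which matches the real arm rotating for an indefinite time in the field.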

This was a great relief for the company. We were able to find the issue and fix it. The fix took about 30 minutes to an hour at most. Without knowing anything about the code logic or any specific details, just by thinking about the issue logically, I was able to find it.

I have learned from this that it's pretty important to have in-depth knowledge and a good grip on the product and its practical use.

Have a wonderful day. 
