Regression in Testing: Effective Test Case Strategies


When it comes to regression testing, selecting test cases for automation is a critical activity, and following certain guidelines helps ensure their effectiveness and efficiency. In this article, we will explore the key points and expand on each one with detailed examples. Let’s dive in:





Clearly Define Expected Results:

The expected results in the design steps of a test case must be clearly defined. They should be specific, not vague, and explain exactly what needs to be validated on the current display in the application. This precision makes it possible to write automation scripts that assert the right thing. For example:

Step 1: Enter a valid username and password.
Expected Result: The user should be successfully logged in and redirected to the home page.
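A precisely worded expected result maps directly onto a single assertion in an automation script. As a minimal sketch (the `login` helper, credentials, and page names here are hypothetical, not part of any real framework):

```python
def login(username: str, password: str) -> str:
    """Hypothetical stand-in for the application's login flow.

    Returns the name of the page the user lands on.
    """
    if username == "alice" and password == "s3cret":
        return "home"
    return "login"


# The expected result from the test step becomes one concrete check:
landing_page = login("alice", "s3cret")
assert landing_page == "home", "User should be redirected to the home page"
```

A vague expected result such as “login works” gives the script author nothing concrete to assert; “redirected to the home page” does.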


Include Expected Results for Each Step:

Each step in the test case should have an expected result, so that every action or input is associated with a specific outcome. Clearly stating the expected result makes it easier to validate the application’s state after the inputs are executed. For instance:

Step 2: Click on the “Add to Cart” button.
Expected Result: The selected item should be added to the shopping cart.


Provide at Least One Action or Input per Step:

Each step in the test case should include at least one action or input. While it’s acceptable to have multiple actions within one step, there should be only one expected result after all the inputs are completed. This keeps the test case focused and avoids confusion. For example:

Step 3: Select the desired category from the dropdown menu, enter the search keyword, and click the “Search” button.
Expected Result: The search results page should display relevant products based on the entered keyword.
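Grouping several inputs under one expected result also keeps the automation code simple: many actions collapse into one call, followed by one assertion. A sketch, assuming a purely illustrative in-memory catalog and `search` helper:

```python
# Hypothetical product catalog used only for this illustration.
CATALOG = [
    {"name": "wireless mouse", "category": "electronics"},
    {"name": "usb keyboard", "category": "electronics"},
    {"name": "garden hose", "category": "outdoors"},
]


def search(category: str, keyword: str) -> list[dict]:
    """Several inputs (category, keyword), one observable outcome."""
    return [p for p in CATALOG
            if p["category"] == category and keyword in p["name"]]


# Three actions (select category, enter keyword, click 'Search')
# map to one call; the single expected result is one assertion:
results = search("electronics", "mouse")
assert results and all("mouse" in p["name"] for p in results)
```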


Use Concise Button Descriptions:

When mentioning buttons, it’s recommended to write the button name in single quotes, such as ‘Update’, instead of spelling out “Press the update button.” This avoids unnecessary wordiness while still conveying the action clearly. For instance:

Step 4: Enter the new comment text in the ‘Comments’ field and click ‘Save’.
Expected Result: The updated comments should be displayed on the screen.


Avoid Repetition in Validating Fields:

To optimize test cases, avoid validating every field in each step; list only the fields that are important to validate in that specific step. Consider whether it is really necessary to revalidate a field that was already checked in a previous step when the application returns to the same window. This reduces redundancy and streamlines the test case. For example:

Step 5: Verify that the ‘Total Price’ field reflects the correct amount after applying the discount code.
Expected Result: The ‘Total Price’ field should display the discounted amount.


Emphasize What Happens Instead of What Doesn’t:

When stating expected results, focus on what happens rather than on what does not happen. It is far easier for an automation script to validate something that is displayed or occurs than to prove that something is absent. This clarity helps in creating effective automation scripts. For example:

Step 6: Click on the ‘Submit’ button without entering any mandatory fields.
Expected Result: An error message should be displayed indicating the required fields.
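In script form, the positive phrasing means asserting on the error text that appears, not on the absence of a success page. A sketch with a hypothetical `submit_registration` helper that returns the message shown on screen:

```python
def submit_registration(form: dict) -> str:
    """Hypothetical submit handler: returns the on-screen message."""
    required = ("username", "email")
    missing = [f for f in required if not form.get(f)]
    if missing:
        return "Required fields: " + ", ".join(missing)
    return "Registration successful"


# Assert on what IS displayed -- the error message -- rather than
# trying to prove that a successful registration did NOT occur:
message = submit_registration({})
assert message.startswith("Required fields"), message
```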


Ensure Well-Written Test Case Descriptions:

The description of each test case should be well-written and include detailed information about the testing objectives and any necessary maintenance options or test data. This helps testers understand the purpose of the test case and ensures its accuracy. For instance:

Test Case: RA_TestName

Description: Verify the functionality of the user registration process by entering valid user details and completing the registration form. Ensure that the user receives a confirmation email and can successfully log in with the created credentials.

Keep Test Cases Short with Limited Objectives:

To maintain clarity and focus, keep test cases short, with a limited number of testing objectives. Ideally, a test case should have fewer than 25 steps; cases in the 25–40 step range can still be manageable, but they grow complex and harder to maintain, so split them into multiple test cases where needed.


Having more than 40 steps in a test case can be overwhelming for automation. If you find yourself exceeding this limit, reassess the steps and consider dividing them into multiple test cases. For example:


Test Case: RA_TestName_Part1

Objective: Validate user registration with valid data.

  1. Set up the application with default configuration.
  2. Enter valid user details in the registration form.
  3. Click on the ‘Register’ button.
  4. Verify successful user registration.

Test Case: RA_TestName_Part2

Objective: Validate user registration with invalid data.

  1. Set up the application with default configuration.
  2. Enter invalid user details in the registration form.
  3. Click on the ‘Register’ button.
  4. Verify appropriate error message is displayed.
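The two split cases above can be sketched as plain test functions, each with its own single setup step and a single objective (all names and the in-memory “application” here are hypothetical):

```python
def setup_application() -> dict:
    """The one setup step, shared in form but run fresh per test case."""
    return {"users": {}}


def register(app: dict, username: str, email: str) -> str:
    """Hypothetical registration endpoint."""
    if not username or "@" not in email:
        return "error: invalid user details"
    app["users"][username] = email
    return "registered"


def test_registration_part1_valid_data():
    app = setup_application()
    assert register(app, "alice", "alice@example.com") == "registered"


def test_registration_part2_invalid_data():
    app = setup_application()
    assert register(app, "", "not-an-email").startswith("error")


test_registration_part1_valid_data()
test_registration_part2_invalid_data()
```

Because each function sets up its own state and proves one objective, either can fail, be rerun, or be maintained without touching the other.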


Ensure Single Setup Step in Test Cases:

Ideally, a test case should have only one setup step. If there are multiple setup steps required, it is recommended to split the test case into two or more separate test cases. This helps in maintaining a clear test case structure and avoids complicating the test flow. Each test case should focus on a specific scenario or functionality. For example:

Test Case: RA_TestName

Objective: Validate login functionality.

  1. Set up the application with default configuration.
  2. Enter valid username and password.
  3. Click on the ‘Login’ button.
  4. Verify successful login and redirection to the dashboard.


Avoid Making Changes in Settings or Maintenance Dictionaries:


When designing test cases, ensure that they do not make changes to application settings or maintenance dictionaries. Modifying such settings during the execution of tests can lead to a corrupted test environment. If there is a need to test specific configurations, consider alternative approaches that don’t affect the overall test environment.


Map Testing Objectives to Design Steps:

To establish a clear connection between testing objectives and design steps, it’s crucial to map them together. Include the test objective number in the steps that prove the objective. This provides traceability and helps in understanding the purpose of each step. For instance:


  • Objective 1: Validate user registration process.
  • Step 2: Enter valid user details in the registration form.
  • Expected Result: The user should be successfully registered and can log in with the created credentials.
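One way to make this mapping mechanical is to record, for each design step, which objective numbers it proves; coverage gaps then become detectable by a script. A sketch under purely illustrative data:

```python
# Hypothetical traceability table: each design step records the
# objective number(s) it proves.
OBJECTIVES = {1: "Validate user registration process"}

STEPS = [
    {"step": 1, "action": "Set up the application", "proves": []},
    {"step": 2, "action": "Enter valid user details", "proves": [1]},
    {"step": 3, "action": "Click the 'Register' button", "proves": [1]},
]


def uncovered_objectives() -> set[int]:
    """Objectives that no design step claims to prove."""
    covered = {obj for step in STEPS for obj in step["proves"]}
    return set(OBJECTIVES) - covered


# Every objective should be proved by at least one step:
assert uncovered_objectives() == set()
```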


Separate Test Cases for Report Testing:

Reports often require manual review and validation, making them unsuitable for full automation. It is recommended to have separate test cases specifically designed for testing reports. Print out master copies of the reports and perform a visual comparison with the current regression run. This ensures accurate verification of report outputs.



Timely Execution of Verification Steps:

Verification steps in a test case should be executed within a reasonable timeframe. If a verification step requires waiting for more than a day, it is advisable to move it to another test case with a defined objective and execute it manually. Automation scripts should focus on immediate verifications.


Repeatability and Same-Day Execution:

All test cases written for automation must start and complete on the same day, unless there is an alternate way to set up specific test conditions programmatically and verify them immediately. This ensures consistent and repeatable test results.
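The “set up the condition programmatically” alternative can be sketched as follows; the subscription domain and helper names are purely illustrative. Instead of waiting a day for a subscription to lapse, the test seeds the expired state directly and verifies it immediately:

```python
import datetime


def seed_expired_subscription(app: dict, user: str) -> None:
    """Hypothetical: create the 'subscription expired' condition
    programmatically instead of waiting days for it to occur."""
    yesterday = datetime.date.today() - datetime.timedelta(days=1)
    app.setdefault("subs", {})[user] = yesterday


def is_subscription_active(app: dict, user: str) -> bool:
    expiry = app.get("subs", {}).get(user, datetime.date.min)
    return expiry >= datetime.date.today()


app = {}
seed_expired_subscription(app, "alice")        # set up the condition now...
assert not is_subscription_active(app, "alice")  # ...and verify immediately
```

The whole test starts and completes in one run, so it stays repeatable regardless of when it is executed.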




By following these guidelines, you can significantly improve the quality and efficiency of regression test cases for automation. Define your objectives clearly, keep your test cases concise yet complete, and state your expected results precisely. These practices will maximize the effectiveness of your regression testing efforts and support an efficient software development process.