Test Automation at a Healthcare Company: My Journey

 


Hey there, fellow software testing professionals and enthusiasts! It’s a pleasure to share my experiences and the incredible test automation journey I had with a remarkable healthcare company during the mid-2000s. So, let’s embark on this conversational ride together, as I take you through the key learnings and triumphs that shaped my understanding of test automation in the healthcare domain.

 

Pioneering Test Automation Success: Laying the Foundation

The year was 2005, and test automation was already a hot topic in the software testing world. Our healthcare company was ahead of the curve, having established a dedicated automation group early on. Armed with tools like WinRunner (anyone remember that?), QTP (now known as UFT), Test Director (which we now know as ALM), and a bunch of custom-built tools, we set out on our quest for test automation excellence.

 

At the outset, we targeted client-server applications, but as technology progressed, we deftly transitioned to .NET applications. One standout achievement was the laboratory application, for which we developed a staggering 4,800 automation scripts—quite the feat for that era!

 

Version Control of Scripts: The Pillar of Collaboration

To ensure smooth sailing, we implemented an effective version control system. Team members diligently checked in their automation scripts daily, ensuring continuous collaboration and seamless workflow. A designated Tool Administrator was responsible for performing a monthly “Get Latest” version of all baseline scripts and pushing them to the test management tool.

And let’s not forget about documentation! We made sure that any changes to the code were meticulously recorded in the script header and modification histories. This practice proved invaluable, providing a clear trail of changes and facilitating better tracking over time.
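To make the convention concrete, here is an illustrative sketch of such a header and its modification history. Our real scripts were written in the tools' own scripting languages, so the field names and layout shown here are hypothetical:

```python
# Illustrative sketch of a script header with a modification history.
# The fields and format are invented examples, not the actual template.

SCRIPT_HEADER = """\
Script Name : LAB_OrderEntry_Regression
Author      : J. Doe
Created     : 2005-03-14
Purpose     : Regression coverage for lab order entry
Modification History:
  2005-06-01 | J. Doe   | Updated login flow for release 4.2
  2005-09-15 | A. Smith | Added verification of specimen label
"""

def modification_entries(header: str) -> list[str]:
    """Return the modification-history lines from a script header,
    giving a quick audit trail of who changed what, and when."""
    lines = header.splitlines()
    start = lines.index("Modification History:") + 1
    return [line.strip() for line in lines[start:] if line.strip()]
```

A simple parser like this also let tooling report which scripts had gone untouched for too long.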

 

High Script Stability and Low False Failure Rate: Trusting the Automation

You know, one of the main reasons for our test automation success was the stability of our scripts. We invested effort in their meticulous maintenance and constant updates, resulting in an impressively low false failure rate—never exceeding one percent!

You might wonder how we achieved this. Well, we believed in keeping our automation scripts busy! We had junior members of the team regularly execute the scripts, keeping them up-to-date and relevant. Not only did this boost script stability, but it also gave our juniors valuable hands-on experience. Win-win!

 

Review and Analysis of Test Results: Digging Deeper

After every formal regression run, our team delved deep into the results, seeking valuable insights. We crafted detailed test evidence documents with step-by-step information and accompanying screenshots, making analysis a breeze.

Failures were carefully categorized into two groups: false failures and genuine product defects. The former were added to the maintenance backlog for timely resolution. And guess what? We even had secondary reviews by Functional Subject Matter Experts (SMEs) to ensure meticulous scrutiny of the outcomes.
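The triage logic behind that categorization can be sketched roughly like this. It is a minimal illustration, assuming a per-run flag that marks whether a failure was traced to the script or environment rather than the product; the names are invented:

```python
from dataclasses import dataclass

@dataclass
class Result:
    script: str
    passed: bool
    # True when the failure was traced to the script or environment,
    # not the product (a "false failure")
    script_issue: bool = False

def triage(results):
    """Split failed runs into false failures (maintenance backlog)
    and genuine product defects, and report the false-failure rate
    over all executed scripts."""
    backlog = [r.script for r in results if not r.passed and r.script_issue]
    defects = [r.script for r in results if not r.passed and not r.script_issue]
    rate = len(backlog) / len(results) if results else 0.0
    return backlog, defects, rate

runs = [Result("lab_login", True),
        Result("order_entry", False, script_issue=True),  # maintenance backlog
        Result("blood_bank_xmatch", False)]               # raised as a product defect
backlog, defects, rate = triage(runs)
```

Tracking the rate run over run is what made the "never exceeding one percent" claim measurable rather than anecdotal.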

 

The Power of Domain-Driven Automation: A Collaborative Approach

Ah, let’s talk about the heart of healthcare automation—domain knowledge! Our automation group had a unique blend of software test engineers and product subject matter experts (SMEs). Together, we worked hand in hand with doctors, pathologists, and blood bank specialists, gaining comprehensive insights into the healthcare domain.

The results were impressive. By involving domain SMEs directly in the automation process, our scripts not only showcased technical robustness but also aligned perfectly with the unique intricacies of healthcare.

 

Test Automation Scripts Reviews and Coding Standards: The Path to Perfection

We never took regression test case selection lightly! The company had a stringent process in place, backed by a comprehensive guideline/checklist to handpick the best candidates for automation. Our testers had access to ready-to-use standards and guidelines, tailored to the scripting language used in the project.

Two types of reviews ensured the highest quality. The technical review focused on identifying gaps with established standards, programming logic, and overall code quality. The second review brought in domain SMEs to verify test accuracy, assess documentation comprehensiveness, and ensure the scripts were production-ready and bug-detecting champs.

 

Reusable Code to Speed up Development: Efficiency Unleashed

Let’s face it, automation can be time-consuming. But we had an ace up our sleeve—a vast library of reusable functions. These little gems expedited the development of new scripts, slashing the time required for script creation.

And we didn’t stop there. We provided a comprehensive reusable function guide, with numerous examples to illustrate proper usage. Efficiency soared as testers effortlessly incorporated these functions into their designs.
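As a flavor of what such a library contains, here is one classic reusable helper: polling for a condition instead of sprinkling hard-coded waits through every script. This is a generic sketch in Python, not our actual library (which lived in the automation tools' scripting languages):

```python
import time

def wait_until(condition, timeout=10.0, interval=0.5):
    """Reusable helper: poll `condition` until it returns True or the
    timeout expires. Scripts call this instead of fixed sleeps, which
    is one of the easiest ways to cut false failures."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False
```

One well-tested `wait_until` shared by hundreds of scripts beats hundreds of slightly different hand-rolled waits.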

 

Gold Test Environment and Maintenance: The Key to Reliability

For seamless execution, we designed our test scripts to be independent of specific environments. They could thrive anywhere! During testing, we ran the scripts in an environment mimicking the most common customer configuration settings. Efficiency at its finest!

To ensure reliability, we had a standard set of maintenance data and dictionaries at our disposal. The Automation Team Tool Administrator took charge of maintaining and updating configurations based on releases and customer changes, safeguarding our “Gold Copy” environment from any unwarranted data anomalies.

 

Test Data Independence and Scripts: Agile and Adaptable

Ah, test data—a crucial aspect in automation. We thoughtfully organized and stored the required test data in dedicated tables within a database. JDBC connections seamlessly integrated this data with our automation framework.

The utilities managing the data were a dream. They could create multiple data sets for each test script, and if the data within the tables got depleted, they autonomously generated additional data. This test data independence was a game-changer, ensuring smooth execution in any new environment and boosting the overall efficiency and reliability of our testing.
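The self-replenishing behavior of those utilities can be sketched as follows. Here an in-memory SQLite table stands in for the real database we reached over JDBC, and the table and column names are hypothetical:

```python
import sqlite3

def next_patient(conn):
    """Fetch an unused patient record for a test script; if the pool is
    depleted, autonomously generate a fresh batch (self-replenishing
    test data, as our data utilities did)."""
    row = conn.execute(
        "SELECT id, name FROM patients WHERE used = 0 LIMIT 1").fetchone()
    if row is None:
        # Pool depleted: generate ten more rows on the fly.
        count = conn.execute("SELECT COUNT(*) FROM patients").fetchone()[0]
        conn.executemany(
            "INSERT INTO patients (name, used) VALUES (?, 0)",
            [(f"Patient-{count + i}",) for i in range(1, 11)])
        row = conn.execute(
            "SELECT id, name FROM patients WHERE used = 0 LIMIT 1").fetchone()
    conn.execute("UPDATE patients SET used = 1 WHERE id = ?", (row[0],))
    return row[1]
```

Because scripts ask the utility for data instead of hard-coding it, the same suite runs unchanged in any freshly built environment.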

 

Adoption of Ghosting and Virtualization Technologies: The Parallel Revolution

We faced a daunting challenge: running a significant number of scripts efficiently. We knew we needed parallel execution to tackle this beast. While Dockerization and cloud computing were yet to make their debut, we explored alternatives.

We embarked on “ghosting,” creating clean OS images using Norton Ghost software. These images formed the basis of machines in our test lab, with each machine serving as a dedicated testing environment. We had an impressive array of 30 to 40 machines, all running automation scripts simultaneously.

Technology marched on, and we adopted virtualization. Physical machines made way for virtual ones, revolutionizing our parallel execution capabilities. Flexibility and resource utilization soared, optimizing our testing environment and trimming hardware costs.

 

Conquering Complexity: The Rolling Regression Approach

One major challenge surfaced during the initial stages of our test automation journey with the healthcare company. Picture this: multiple enhancements and bug fixes happening simultaneously, all tied to different projects slated for the integration testing phase. Talk about complexity overload!

 

Our team came up with a brilliant solution—the “rolling regression” approach. Here’s how it worked: whenever code from a project got merged with the release branch, we initiated a cycle of regression testing for that project. And guess what? We kept repeating this process for each project until we had tested them all. This approach had some serious perks!

Firstly, it helped us stay ahead of the game, keeping our regression test scripts updated with all the latest changes. No more outdated scripts causing confusion. Secondly, by continuously testing each project’s code as it merged into the release branch, we caught regression issues early on. Talk about nipping problems in the bud!
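The loop itself is simple to state: every merge into the release branch triggers a regression cycle for that project, repeats included, until every project is covered. A minimal sketch, with invented project names and a pluggable cycle runner:

```python
def rolling_regression(merge_events, run_cycle):
    """Trigger one regression cycle per merge into the release branch
    (a project merging twice gets tested twice) and return the set of
    projects covered so far."""
    covered = set()
    for project in merge_events:
        run_cycle(project)   # regression cycle for this merged project
        covered.add(project)
    return covered

cycles = []
covered = rolling_regression(
    ["LabOrders", "BloodBank", "LabOrders", "Pathology"],
    cycles.append)
```

The repeat for "LabOrders" is the point: re-testing on every merge is what kept the scripts current and caught regressions while the offending change was still fresh.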

 

The “rolling regression” approach proved to be a game-changer. It allowed us to tackle the challenges of concurrent project developments head-on, boosting overall efficiency and ensuring a smoother integration testing phase. Cheers to proactive problem-solving!

 

The Lives of Functional Testers Made Easy

As we ventured deeper into automation, we realized the value of user-friendly interfaces and tools. Customized dashboards and reports made automation test results accessible and comprehensible to functional testers and stakeholders alike. Quicker decision-making and issue resolution became the norm.

We also knew that staying updated was critical. Regular training sessions and workshops kept our team abreast of the latest automation tools and techniques. We were always hungry for knowledge!

 

Conclusion: Striking the Perfect Balance

My journey with this healthcare company has been a remarkable one. Test automation proved to be a game-changer, cutting down months of manual testing to just a few days. But it wasn’t just about the tools and technology; it was about striking the perfect balance.

The collaboration between automation experts and domain SMEs was a key ingredient. We combined technical proficiency with domain knowledge to create an automation strategy that optimized testing efforts and produced high-quality and reliable results.

As the software testing world evolves, I carry these invaluable lessons with me. Striving to leverage the best of both worlds, I continue to learn about impactful and successful test automation initiatives from my colleagues and friends. And I encourage you all to do the same—let’s shape the future of test automation together!
