

Improving Time-to-Value for GxP Computer Systems

Image: Cartoon of a tablet displaying scientific data. Credit: mcmurryjulie, Pixabay

In today's fast-moving world of vaccines, gene therapies and biotherapeutics, computerized systems play a critical role in the discovery, manufacture and delivery of potentially life-saving drugs. Delays in implementing these systems, however, are not uncommon. Time-to-value is therefore a critical metric when implementing a new computer system in the biopharma industry, as delays can hold up the delivery of those drugs to the patients who need them.

Validating computerized systems is required by regulatory agencies, such as the U.S. Food and Drug Administration (FDA) and the European Medicines Agency (EMA), for systems used as part of GxP-regulated activities. Computer system validation can be labor-intensive and time-consuming due to many factors, including resource availability, the intended use of the system (its scope), and a company's validation processes and procedures. Validating a new laboratory information management system (LIMS), for example, to better manage samples and analyze test result data, could take several months, and possibly a year, to complete. That's months before your company's software investment begins providing value, during which employees across the organization focus on validation and implementation activities rather than doing the jobs they were hired to do.

By getting value from new computer systems earlier and freeing up resources, biopharma companies can speed up the development, manufacture, testing and delivery of life-saving drugs. Although this time-to-value lag cannot be eliminated entirely, evaluating and continuously improving your validation process can shorten system validation and implementation considerably.

A risk-based approach

The FDA's final guidance on General Principles of Software Validation, issued in 2002, states that validation activities should be commensurate with the design complexity and the risk associated with using the software for a specific intended use. ISPE GAMP®5: A Risk-Based Approach to Compliant GxP Computerized Systems gives further guidance on this topic, and a risk-based approach has been widely adopted throughout the industry. Even though this has become the industry standard over the past two decades, the FDA's latest draft guidance signals its view that the approach is not always applied correctly or fully. A related concern is that companies are not leveraging it to focus software quality assurance efforts on the truly critical components and functions.

As a first step, companies should ensure that their validation process is truly risk-based, focusing testing and mitigation efforts on the functionality with the greatest potential to negatively impact patient health.
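Risk-based prioritization of this kind is often run as a simple scoring exercise. The sketch below is a hypothetical illustration, assuming GAMP 5-style factors (severity, probability and detectability); the function names and scores are invented for the example and not drawn from any real system.

```python
# Hypothetical GAMP 5-style risk scoring for system functions.
# Names and scores below are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class FunctionRisk:
    name: str
    severity: int      # potential patient/product impact (1 = low, 3 = high)
    probability: int   # likelihood of failure (1 = low, 3 = high)
    detectability: int # 3 = hard to detect, 1 = easy to detect

    @property
    def priority(self) -> int:
        # Higher score -> more validation effort (testing, mitigation)
        return self.severity * self.probability * self.detectability


functions = [
    FunctionRisk("Sample result calculation", severity=3, probability=2, detectability=3),
    FunctionRisk("Report cover-page layout", severity=1, probability=2, detectability=1),
    FunctionRisk("Audit trail capture", severity=3, probability=1, detectability=3),
]

# Rank functions so the highest-risk functionality gets tested first
for f in sorted(functions, key=lambda f: f.priority, reverse=True):
    print(f"{f.name}: priority {f.priority}")
```

The exact factors and scales vary by company; the point is that the ranking, not the absolute numbers, drives where testing and mitigation effort is spent.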

Best practices

Regardless of how streamlined your process is, software validation still requires effort and careful planning. Below are some potential opportunities to improve your process.

Leverage your vendors

Saving time in the validation process is all about balance: companies need to ensure the process is complete and comprehensive without reinventing the wheel. Many vendors provide documentation and services to support their customers' validation processes. With the growth of software as a service, vendors are well positioned to take a greater role in assisting customers with their validation efforts. When selecting a new software vendor, consider the quality of their validation documentation as part of the service you purchase. Instead of starting from scratch, you can then build on what already exists and focus on how the software functions in your company's unique environment and on any new functionality.

Try automation

Any change within a regulated environment takes time to implement, due to the need to understand the impact of the change and to execute testing that confirms the system still functions as expected. A more automated approach can reduce this change management effort. Leveraging test automation to execute test scripts can dramatically reduce timelines during implementation and downtime during changes, while maintaining at least the same level of quality as traditional manual testing. Many vendors offer automated operational qualification (OQ) testing at implementation.

Some may also provide services to support customer-specific performance qualification (PQ) testing, helping customers verify their own use cases. The PQ often requires a significant investment of time and valuable resources to develop and run test scripts, so PQ test scripts may be good candidates for automation to support the system through its entire lifecycle.

The goal isn’t to automate everything—some aspects of validation will continue to require a hands-on approach. But you might be able to save time and money by automating tests that are often repeated and unlikely to change.

Focus on incremental change

Moving to a new software vendor that offers quarterly updates can be daunting. With such frequent updates, it might feel like there's no way to maintain the controls and confidence that your system is performing as expected and remains in a validated state. But regular updates can also force your documentation and change management processes to become leaner and focused on what truly matters. This approach also creates opportunities for continuous improvement that make your company more agile: rather than spending a year's worth of resources on a major validation project to upgrade a system, you can take advantage of new software features as quickly as possible.


Preparing for the future

Your organization’s software validation process doesn’t have to be perfect to provide value. Regularly asking whether the process meets your needs or requires improvement can help you keep your GxP systems up to date and running smoothly. By evaluating and continuously improving how your company approaches validation, you should be able to reduce the overall burden and eliminate the pain points of your current process while still fulfilling regulatory requirements. Ultimately, a strong validation strategy will help speed up time to market for life-saving drugs and benefit the patients who need them.


About the author:


Jim Brooks is a GxP compliance professional with IDBS. He is accountable for defining processes and procedures, as well as coordinating and executing the activities required to deliver qualified GxP systems and solutions to IDBS customers. During his many years in biopharma, he has been responsible for validating, deploying and managing laboratory information systems in GxP-regulated environments.