The Route to Compliance: How To Ensure Regulatory Compliance in GxP Laboratories


Ensuring data integrity and regulatory compliance is imperative when working in any GxP-regulated environment. Regulatory inspections now focus specifically on the integrity of data, and regulators have issued guidance on the subject1,2. A 2019 update to the FDA Compliance Program Guide (CPG) 7346.832 for Pre-Approval Inspections (PAI) listed specific areas and activities in regulated laboratories that an inspector could focus on as possible indicators of data falsification3. The emphasis in the regulatory guidance documents and in the new GAMP Good Practice Guide on Data Integrity by Design4 is on using technical controls in software applications to ensure regulatory compliance and data integrity.

In this article, we will review how assessing your lab's workflows, identifying weaknesses and implementing digital solutions can streamline processes and make regulatory compliance an easier goal. 

Déjà vu all over again


Earlier this year I published an article on how a laboratory information management system (LIMS) can help ensure data integrity5, outlining some of the technical controls that should be in place to ensure data integrity and compliance with the ALCOA+ principles. It set out three general automation principles (a minimal sketch illustrating them follows the list):
  1. Data acquisition at the point of origin
    Always interface instruments to acquire analytical data
  2. Never transcribe data
    All data must be transferred electronically between systems to avoid transcription errors
  3. Know where data are stored
    This may involve location and file naming conventions so that data can be retrieved easily
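To make the three principles concrete, here is a minimal sketch in Python. The instrument read is simulated and all names (BalanceReading, acquire_weight, the archive path) are illustrative rather than taken from any particular product: the value is captured at the point of origin together with its metadata, passed on electronically without retyping, and written to a deterministic, documented location.

```python
# Illustrative sketch only: hypothetical names, simulated instrument read.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from pathlib import Path
import json

@dataclass
class BalanceReading:
    sample_id: str
    weight_mg: float
    instrument_id: str
    analyst: str
    acquired_at: str   # timestamp captured at acquisition, never typed in later

def acquire_weight(sample_id: str, analyst: str, instrument_id: str) -> BalanceReading:
    """Principle 1: capture the value directly from the instrument interface.
    The read is simulated here; a real system would use the balance's driver or API."""
    raw_value = 101.37  # placeholder for the value returned by the instrument
    return BalanceReading(
        sample_id=sample_id,
        weight_mg=raw_value,
        instrument_id=instrument_id,
        analyst=analyst,
        acquired_at=datetime.now(timezone.utc).isoformat(),
    )

def store_reading(reading: BalanceReading, archive_root: Path) -> Path:
    """Principles 2 and 3: pass the record on electronically (no transcription) and
    store it under a deterministic, documented location and naming convention."""
    target = archive_root / reading.instrument_id / f"{reading.sample_id}.json"
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(json.dumps(asdict(reading), indent=2))
    return target

if __name__ == "__main__":
    reading = acquire_weight("S-2020-0042", analyst="jsmith", instrument_id="BAL-01")
    print(store_reading(reading, Path("secure_share")))
```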

Expanding these principles, I discussed the following areas for a LIMS implementation (a sketch of two of these controls follows the list): 

  • Abolish paper: If retained, paper files create an unwanted hybrid system
  • Interface instruments: This is to ensure data are acquired electronically
  • Automate calibration checks: This ensures that an instrument is fit for use on the day of analysis
  • Slaughter spreadsheets: Spreadsheets typically require printing and manual data entry plus transcription error checks. Cutting them out simplifies the process.
  • Work electronically: The use of electronic signatures helps here
  • Unique users: All users must have their own user identities and have access privileges that do not create conflicts of interest e.g. analysts with administrator access
  • Use audit trails: Adequate audit trail(s) help monitor changes to data and, if allowed, data deletions
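Two of these controls, unique user identities without conflicting privileges and an audit trail of changes, can be illustrated with a short, hypothetical sketch; the class and function names are mine and do not represent any specific LIMS.

```python
# Illustrative sketch only: hypothetical names, not any particular application.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class User:
    user_id: str        # unique identity; shared accounts are not allowed
    roles: frozenset    # e.g. {"analyst"}; combining analyst and administrator is a conflict

@dataclass(frozen=True)
class AuditEntry:
    timestamp: str
    user_id: str
    record_id: str
    field_name: str
    old_value: str
    new_value: str
    reason: str         # reason for change, as regulators expect

def check_segregation_of_duties(user: User) -> None:
    """An analyst must not also hold administrator privileges in the same application."""
    if {"analyst", "administrator"} <= user.roles:
        raise PermissionError(f"{user.user_id}: analyst and administrator roles conflict")

def record_change(trail: list, user: User, record_id: str,
                  field_name: str, old: str, new: str, reason: str) -> None:
    """The audit trail is append-only: entries are added but never edited or deleted."""
    trail.append(AuditEntry(datetime.now(timezone.utc).isoformat(), user.user_id,
                            record_id, field_name, old, new, reason))

trail: list = []
analyst = User("jsmith", frozenset({"analyst"}))
check_segregation_of_duties(analyst)
record_change(trail, analyst, "S-2020-0042", "dilution_volume_ml", "100", "50",
              reason="volume corrected following review comment")
print(trail[0])
```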

Can these principles be applied to other informatics solutions in a regulated laboratory?  Yes, they can, and we’ll look at three anonymized case studies showing this in practice. 


Assessing and eliminating data vulnerabilities


Before discussing case studies of laboratory informatics, we should note that data process mapping is the best way of ensuring data integrity, whether implementing a new system or assessing an operational one. Here, the data flows in a process (i.e. inputs, transformations and outputs) are mapped and the vulnerabilities of the records are assessed. For example, if an application stores data in operating system directories on a local hard drive, the vulnerabilities could include hard drive failure, the ability to access data behind the application and data deletion. This leads to an assessment of the risks and the implementation of technical controls to eliminate them, e.g. data acquisition to secure network storage with user access only via the informatics application. Data process mapping was used in two of the case studies presented here to eliminate data vulnerabilities and improve the business process.

Data process mapping

  • Shared user identities in a computerized system: individual user accounts are required so that work can be attributed to a specific person
  • Files, including spreadsheets, stored in directories on a standalone workstation: these need better protection from alteration, potential loss and time travelling (resetting the system clock to falsify timestamps). This requires networking the workstation to provide secure time synchronization, as well as storing data on resilient and secure network drives.

Data process mapping is particularly important as the FDA now requires the identification of hardware and software vulnerabilities, as shown in two recent warning letters issued in July 202010,11.
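As an illustration, a data process map can be recorded as simply as an ordered list of steps, each with its inputs, transformation, outputs, the record vulnerabilities found and the technical controls proposed. The sketch below is a hypothetical structure using the local hard drive example above; it is not taken from any particular tool.

```python
# Hypothetical structure for recording a data process map and its risk assessment.
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    name: str
    inputs: list
    transformation: str
    outputs: list
    vulnerabilities: list = field(default_factory=list)
    technical_controls: list = field(default_factory=list)

acquisition = ProcessStep(
    name="Chromatographic data acquisition",
    inputs=["injection of prepared sample"],
    transformation="detector signal digitised by the CDS",
    outputs=["raw data file"],
    vulnerabilities=[
        "data stored on local hard drive (drive failure)",
        "files accessible behind the application via the operating system",
        "files can be deleted outside the application",
    ],
    technical_controls=[
        "acquire directly to secure, resilient network storage",
        "restrict access so data are reached only via the informatics application",
    ],
)

# The completed map is the ordered list of steps; reviewing the vulnerabilities
# step by step drives the risk assessment and the controls to implement.
data_process_map = [acquisition]
```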


Case studies introduction


We will focus on the use of technical controls in applications to redesign a process to eliminate paper and use electronic signatures. Short-term remediation of existing implementations using procedural controls will not be considered. To ensure success, a laboratory must evaluate any informatics application to demonstrate that the software can support its proposed process.




Figure 1: Comparison of CDS Hybrid and Electronic Workflows

Case study 1: Chromatography data system (CDS)

In the hybrid workflow shown in Figure 1:
  • There are three sets of electronic records (the CDS and two spreadsheet files)
  • There are three hybrid systems where the e-records must match the corresponding paper printouts
  • Spreadsheets do not have regulatory audit trail functions
  • Multiple manual data inputs are required, and hence transcription error checks

The evolution of this process in this laboratory is surprising, given that calculation of system suitability test (SST) parameters and the use of sample weights and dilutions for final result calculation are standard functions of any CDS application. Data process mapping identified major data integrity vulnerabilities as well as process improvements.

Further benefit in data integrity and business efficiency can be obtained by interfacing the CDS to a LIMS to download sample identities and sample weights for the analysis12.   
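As a simplified illustration of the calculations a CDS performs natively, and which therefore do not need to be exported to spreadsheets, the sketch below computes the %RSD of replicate standard injections as an SST parameter and an external-standard assay result using the sample weight and dilution. The numbers, formula and variable names are illustrative only.

```python
# Simplified, illustrative calculations of the kind a CDS performs natively.
from statistics import mean, stdev

def percent_rsd(areas: list) -> float:
    """System suitability: relative standard deviation of replicate injections."""
    return 100.0 * stdev(areas) / mean(areas)

def assay_percent(sample_area: float, standard_area: float,
                  standard_conc_mg_ml: float, sample_weight_mg: float,
                  dilution_volume_ml: float) -> float:
    """External-standard calculation: concentration from the response ratio, then
    % of the amount weighed, using the electronically transferred sample weight."""
    sample_conc = standard_conc_mg_ml * sample_area / standard_area
    amount_found_mg = sample_conc * dilution_volume_ml
    return 100.0 * amount_found_mg / sample_weight_mg

standard_areas = [10521.0, 10498.0, 10510.0, 10533.0, 10505.0]
print(f"SST %RSD = {percent_rsd(standard_areas):.2f}")
print(f"Assay = {assay_percent(10480.0, mean(standard_areas), 0.5, 50.3, 100.0):.1f} %")
```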


Case study 2: Automating traditional wet chemistry

Our second case study involves a wet chemistry laboratory. Traditional wet chemistry analyses can be automated using smart instruments such as balances, automatic titrators, pH meters and melting point apparatus linked to an instrument data system. This approach to automation uses each instrument's screen as a terminal to the data system.

The instrument data system, from the supplier of the analytical instruments, is a platform providing instrument interfacing and a database; it was installed and qualified first. Then, individual automated workflows were specified, developed and validated according to the laboratory's needs. If a workflow required that only a specific instrument be used for an analytical procedure, a performance check could be performed; if the check was out of limits, the instrument was blocked until serviced. Workflows could be implemented that integrated multiple instruments, for example the balance weighing of a sample was integrated into a Karl Fischer titration. Electronic signatures ensured that paper printouts were no longer required. The instrument data system can also be interfaced with a LIMS.
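The point-of-use check described above can be pictured with a short sketch. The names, limits and values are hypothetical and do not represent the actual instrument data system: the workflow is released only if today's performance check exists and passed, otherwise the instrument stays blocked.

```python
# Hypothetical sketch of a point-of-use performance check gating a workflow.
from dataclasses import dataclass
from datetime import date

@dataclass
class PerformanceCheck:
    instrument_id: str
    check_date: date
    result: float
    lower_limit: float
    upper_limit: float

    @property
    def passed(self) -> bool:
        return self.lower_limit <= self.result <= self.upper_limit

def instrument_available(instrument_id: str, checks: list) -> bool:
    """Release the instrument only if today's check exists and passed;
    otherwise it remains blocked until serviced and re-checked."""
    todays = [c for c in checks
              if c.instrument_id == instrument_id and c.check_date == date.today()]
    return bool(todays) and all(c.passed for c in todays)

checks = [PerformanceCheck("KF-02", date.today(), result=100.2,
                           lower_limit=99.0, upper_limit=101.0)]
if instrument_available("KF-02", checks):
    print("KF-02 released: Karl Fischer workflow may start")
else:
    print("KF-02 blocked until serviced")
```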

Case study 3: Automating sample preparation


Our final case study was part of a major project in a multinational pharmaceutical company to automate all QC laboratories globally. One of the areas where automation proved difficult was sample preparation.  There are many options available (dilution, homogenization, extraction, etc.) depending on the physical state of the sample and how the extract needs to be introduced to the analytical instrument. 

To automate sample preparation a laboratory execution system (LES) with mobile tablets was used. An LES takes the stages of an analytical procedure and automates the sample preparation step by step.  Sometimes an analytical procedure was not sufficiently detailed and additional steps had to be added, or the written instruction contained additional detail that needed to be broken down into more actions in the LES. The benefits, however, were clear. The data integrity and business advantages of using the LES were:
  • Elimination of paper sample records
  • No need to control blank paper forms2,13
  • Attribution of actions to individuals
  • Contemporaneous recording of data

If some sample preparation involved analytical instruments such as balances or pH meters, these were interfaced to the LES.  The system can check if the instrument is qualified and can also require a calibration or point of use check before samples are weighed.  To ensure that data are collated to the appropriate sample records easily, the LES was linked to a LIMS database and then to an Enterprise Resource Planning (ERP) system.
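The following sketch shows, with hypothetical names, how an LES might execute one sample preparation step: the instruction is presented to the analyst, any measured value is taken from the interfaced instrument rather than typed in, and the result is recorded contemporaneously against the LIMS sample identifier with the analyst's identity and a timestamp.

```python
# Hypothetical sketch of LES step execution; the instrument read is simulated.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LesStep:
    step_number: int
    instruction: str
    requires_instrument: str | None = None   # e.g. "BAL-01" when a weighing is needed

@dataclass
class ExecutedStep:
    lims_sample_id: str
    step_number: int
    value: float | None
    analyst: str
    executed_at: str

def execute_step(step: LesStep, lims_sample_id: str, analyst: str) -> ExecutedStep:
    value = None
    if step.requires_instrument:
        # A real LES reads the value over the instrument interface after confirming
        # the qualification and point-of-use check; here the read is simulated.
        value = 250.4
    return ExecutedStep(lims_sample_id, step.step_number, value, analyst,
                        datetime.now(timezone.utc).isoformat())

procedure = [
    LesStep(1, "Weigh approximately 250 mg of sample", requires_instrument="BAL-01"),
    LesStep(2, "Dissolve in 100 mL of diluent and mix for 5 minutes"),
]
for step in procedure:
    print(execute_step(step, lims_sample_id="S-2020-0042", analyst="jsmith"))
```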


Summary


Whilst laboratory informatics systems can eliminate data integrity vulnerabilities, it is also important to understand that, if applications are used correctly and the analytical process is improved by eliminating hybrid systems and paper records, significant business benefits can be obtained through process efficiencies.

References

  1. WHO Technical Report Series No. 996, Annex 5: Guidance on Good Data and Records Management Practices. Geneva: World Health Organization; 2016. Available from: https://www.who.int/medicines/publications/pharmprep/WHO_TRS_996_annex05.pdf?ua=1.
  2. PIC/S PI-041-3: Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments (Draft). Geneva: Pharmaceutical Inspection Convention/Pharmaceutical Inspection Cooperation Scheme; 2018. Available from: https://picscheme.org/users_uploads/news_news_documents/PI_041_1_Draft_3_Guidance_on_Data_Integrity.pdf.
  3. FDA Compliance Program Guide CPG 7346.832: Pre-Approval Inspections. Silver Spring, MD: Food and Drug Administration; 2019. Available from: https://www.fda.gov/media/71498/download.
  4. ISPE GAMP Community of Practice. GAMP Good Practice Guide: Data Integrity by Design. Tampa, FL: International Society for Pharmaceutical Engineering; 2020.
  5. R.D. McDowall. How To Use a LIMS To Improve Compliance. Sudbury, UK: Technology Networks; 2020. Available from: https://www.technologynetworks.com/informatics/how-to-guides/how-to-use-a-lims-to-improve-compliance-335941.
  6. R.D. McDowall. Data Integrity and Data Governance: Practical Implementation in Regulated Laboratories. Cambridge: Royal Society of Chemistry; 2019.
  7. GAMP Guide: Records and Data Integrity. Tampa, FL: International Society for Pharmaceutical Engineering; 2017.
  8. GAMP Good Practice Guide: Data Integrity - Key Concepts. Tampa, FL: International Society for Pharmaceutical Engineering; 2018.
  9. R.D. McDowall. Data Integrity Focus II: Using Data Process Mapping to Identify Integrity Gaps. LCGC North America 37(2): 118-123. Available from: https://www.chromatographyonline.com/view/data-integrity-focus-part-ii-using-data-process-mapping-identify-integrity-gaps.
  10. FDA Warning Letter: Tender Corporation. U.S. Food and Drug Administration. Published July 23, 2020. Available from: https://www.fda.gov/inspections-compliance-enforcement-and-criminal-investigations/warning-letters/tender-corporation-599789-07232020.
  11. FDA Warning Letter: Stason Pharmaceuticals, Inc. U.S. Food and Drug Administration. Published July 8, 2020. Available from: https://www.fda.gov/inspections-compliance-enforcement-and-criminal-investigations/warning-letters/stason-pharmaceuticals-inc-604889-07082020.
  12. R.D. McDowall. How Can LIMS Help Ensure Data Integrity? LCGC Europe 29(6): 310-316. Available from: https://www.chromatographyonline.com/view/how-can-lims-help-ensure-data-integrity.
  13. FDA Guidance for Industry: Data Integrity and Compliance With Drug CGMP - Questions and Answers. Silver Spring, MD: Food and Drug Administration; 2018.

 About the author

Bob McDowall is director of R D McDowall Limited.