Data Reporting, Integrity and Compliance


In this article, we examine how analytical scientists can enhance their data sharing and reporting practices, leading to improvements in data integrity and an easier route to compliance within a regulated good manufacturing practice (GMP) environment.


To ensure the integrity and compliance of reported data, each GMP-regulated laboratory must fully control every process from sampling to reporting. The best way to achieve this is to use the technical controls available in laboratory informatics software.


To illustrate this, we will explore how two GMP-regulated departments, analytical development and quality control, should work and, where appropriate, collaborate. While these two departments may appear to have similar analytical roles, they have different objectives because of their respective functions in development and production. Both departments are responsible for analysis of raw materials, intermediates and formulations. Analytical development is also responsible for generating regulatory submission data for product approval, while quality control must generate annual product reviews (APRs) under 21 CFR 211.180(e)1 or product quality reviews (PQRs) under EU GMP Chapter 1.10.2 The difference between the two reviews is that the former assesses representative batches whilst the latter must review all batches. The two departments must also collaborate on technology transfer of analytical procedures and any follow-up troubleshooting, and therefore must share data and reports with each other. The focus of this article is the development, validation and application of analytical procedures, not confirmation of new molecular entity (NME) chemical structure.


To help in the preparation of this article, Dr. Christine Mladek from Boehringer Ingelheim Pharma GmbH & Co. KG (BI) and Dr. Markus Dathe from F. Hoffmann-La Roche AG (Roche) were interviewed for their views and experiences in these areas.

Functions of analytical development and quality control

All outputs from both departments are totally dependent on the trustworthiness, reliability, integrity and compliance of the underlying records and data in both paper and electronic formats used to generate them. Furthermore, speed is also a key factor – analytical development can accelerate time to register and quality control can accelerate time to release product.

Figure 1: Overview of the functions of analytical development and quality control within pharmaceutical research, development and production.

Table 1: Comparison of analytical development and quality control department functions.


Problems with report data integrity and compliance

Some of the barriers to achieving the goals listed in Table 1 are:

  • Paper processes: Blank forms are not acceptable in a regulated environment; to meet standards, they must be uniquely numbered, issued and reconciled.3,4 This imposes a high administrative overhead, and data on paper are difficult to share.
  • Hybrid systems: These consist of electronic records linked to signed paper printouts; again, the data on a paper printout are difficult to share. The controls required to ensure data integrity cost more than for a fully electronic system because two record media must be managed.4
  • Spreadsheets: These are widely available, easily used and easily abused; spreadsheets are ubiquitous in most laboratories, but they are hybrid systems. Typically, data are entered manually, which creates the need for transcription error checking.
  • Transcription error checks: Data manually entered into a computerized system or transcribed from one printout to another must be checked carefully, creating bottlenecks, especially when errors are found and must be corrected and rechecked.


All of these barriers combine to make any business process in either department slow and inefficient.

Principles of digitalization/automation

To ensure that a report meets integrity, quality, speed and compliance criteria, it is essential to automate or digitalize the process(es) that acquire, process, calculate and report the results. When designing electronic workflows, three principles of laboratory digitalization must be followed:

  1. Data acquisition at the point of origin
    • Eliminate paper records
    • Always interface instruments to acquire and process analytical data
  2. Never transcribe data
    • All data must be transferred electronically between systems using validated processes to avoid transcription errors
    • Eliminate spreadsheets by using calculations in informatics applications
  3. Know where the data are stored
    • This may involve location and file naming conventions so that data can be retrieved easily for audit or inspection (see the sketch after this list)
    • This is essential not only for reporting results of analyses but also for technology transfer reports and for generating analytical regulatory submission packages and PQRs
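
As an illustration of the third principle, the sketch below shows how a deterministic folder and file naming convention could be derived from controlled run metadata so that raw data can be located quickly during an audit or inspection. It is a minimal, hypothetical example; the field names and folder structure are assumptions, not a description of any particular informatics product.

```python
from dataclasses import dataclass
from datetime import date
from pathlib import PurePosixPath


@dataclass(frozen=True)
class AnalyticalRun:
    """Minimal metadata needed to find a run's raw data again later."""
    site: str           # e.g. "AD" (analytical development) or "QC" (quality control)
    product: str        # product or project code
    procedure_id: str   # analytical procedure identifier
    instrument_id: str  # asset number of the instrument used
    run_date: date
    sequence_no: int    # run sequence number on that day


def storage_path(run: AnalyticalRun, root: str = "/gmp-data") -> PurePosixPath:
    """Build a deterministic, human-readable location for the run's data.

    Because the path is derived only from controlled metadata, the same
    record can always be located again for review, audit or inspection.
    """
    return PurePosixPath(
        root,
        run.site,
        run.product,
        run.procedure_id,
        run.run_date.isoformat(),
        f"{run.instrument_id}_seq{run.sequence_no:03d}",
    )


if __name__ == "__main__":
    run = AnalyticalRun("QC", "PROD123", "AP-0457", "HPLC-012", date(2022, 3, 14), 2)
    print(storage_path(run))  # /gmp-data/QC/PROD123/AP-0457/2022-03-14/HPLC-012_seq002
```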


Data process mapping should be used to identify data vulnerabilities in the current process as well as process inefficiencies. By using the three automation principles above, the vulnerabilities can be eliminated and data integrity and regulatory compliance achieved. This results in a more efficient and effective process to aid the compliance, integrity, quality and speed of reporting.


Figure 2 illustrates this process by showing a chromatographic analysis before and after redesign, where calculations have been incorporated into the chromatography data system (CDS) and electronic signatures have been implemented. There is an option to print the final report if required.

Figure 2: Redesign of a chromatography analysis to eliminate spreadsheet calculations and paper printouts.
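
To make the redesigned workflow in Figure 2 more concrete, the sketch below shows the kind of calculation that is typically moved out of a spreadsheet and into the data system, here a simple single-point external-standard assay. The function name, parameters and example values are illustrative assumptions, not the calculation of any particular CDS.

```python
def assay_result_percent(
    sample_peak_area: float,
    standard_peak_area: float,
    standard_conc_mg_per_ml: float,
    dilution_volume_ml: float,
    nominal_amount_mg: float,
) -> float:
    """Single-point external-standard assay, reported as % of nominal.

    amount found = (A_sample / A_standard) * C_standard * V_dilution
    result (%)   = 100 * amount found / nominal amount
    """
    amount_found_mg = (
        sample_peak_area / standard_peak_area
    ) * standard_conc_mg_per_ml * dilution_volume_ml
    return 100.0 * amount_found_mg / nominal_amount_mg


if __name__ == "__main__":
    # Example: sample area 15230 vs standard area 15105, 0.50 mg/mL standard,
    # 100 mL dilution volume, 50 mg nominal content -> approx. 100.8 %
    print(round(assay_result_percent(15230, 15105, 0.50, 100.0, 50.0), 1))
```

Once a calculation like this is configured and validated inside the data system, peak areas feed it directly and there is nothing left to transcribe or to cross-check by hand.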

Sharing informatics applications?

Although the analytical processes in the two departments appear similar, one approach is to share the same informatics applications, such as a laboratory information management system (LIMS), laboratory execution system (LES) or electronic lab notebook (ELN). There are some obvious advantages: a single system validation covering the two departments, easier information sharing and greater leverage when negotiating license costs. Even so, there are differences between the departments that need to be considered.


Analytical development sits at the start of the development process and, as analytical procedures are still being developed, requires flexible working, meaning that any competent analyst can work on an analysis. In contrast, quality control uses registered analytical procedures and may want to limit an analyst to the specific analytical procedures for which they have been certified.


These are very practical considerations that should be used to select the right application for the job. Can two departments with different working practices be accommodated? This must be made clear in the selection process and evaluated with shortlisted systems. One possibility would be to establish two groups with different user roles and access privileges in the selected application. The issue then becomes how to share data between the two groups, for example during analytical method transfer.
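
One way to picture the two-group option is a simple role definition like the hypothetical sketch below. The role names, permissions and certification check are assumptions made for illustration; they do not describe the configuration model of any particular LIMS, LES or ELN.

```python
# Hypothetical role definitions for a shared informatics application.
# Analytical development needs flexible access while procedures are still
# being developed; quality control analysts are restricted to registered
# procedures they are certified to run.
ROLES = {
    "ad_analyst": {
        "department": "analytical_development",
        "procedures": "any",              # draft or registered procedures
        "can_edit_methods": True,
        "can_sign_results": True,
    },
    "qc_analyst": {
        "department": "quality_control",
        "procedures": "certified_only",   # checked against training records
        "can_edit_methods": False,
        "can_sign_results": True,
    },
}


def may_run_procedure(role: str, certified_procedures: set, procedure_id: str) -> bool:
    """Return True if a user holding this role may execute the procedure."""
    rule = ROLES[role]["procedures"]
    if rule == "any":
        return True
    return procedure_id in certified_procedures


# Example: a QC analyst certified only for AP-0457 cannot run AP-0999.
print(may_run_procedure("qc_analyst", {"AP-0457"}, "AP-0999"))  # False
```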

Remote working triggers system design changes

The COVID-19 pandemic has taught us that remote working requires electronic data and communication. It is not practical or realistic to send paper records between a laboratory and an analyst’s house, as the metadata, including audit trail entries, must accompany the records. Moreover, standalone systems do not facilitate remote working because the data are difficult to share. Therefore, as working practices evolve, system design must evolve from standalone to networked operation. The ability to review all data and metadata remotely and to sign reports electronically is a critical driver for laboratories to enable effective remote working, data sharing and collaboration.


Suppliers are market-driven, and if customers don’t ask for features like these, they won’t be delivered.

Technology transfer and data sharing

One area where it is essential for the two departments to collaborate is technology transfer of analytical procedures. In an ideal world, technology transfer would be facilitated by using the same analytical instruments and software so that the instrument parameters can be transferred electronically alongside example data. This would give the receiving laboratory more detail than is usually found in a written document. The process can easily become a car crash if the originating department simply sends the receiving laboratory the analytical report and leaves them to it. What better approaches to collaboration between the two departments can BI and Roche suggest?

  • Dr. Markus Dathe stated that the ideal approach is to share the overall validation between the two departments. The main work is conducted by analytical development but intermediate precision involves quality control staff working with their instruments in their laboratories. Both departments are co-signatories of the final validation report.
  • Alternatively, Dr. Christine Mladek suggested that a member of the receiving laboratory could work in the originating laboratory to learn and understand the procedure, speeding its establishment in their own laboratory and simplifying the transfer protocol.
  • Both interviewees agreed that having the same instrument data system means that data from development and validation experiments can be easily shared between the two departments.
  • Ideally, the same make and model of instrument should be present in both departments to aid transfers. Mladek noted that difficulties were encountered when trying to transfer gas chromatography (GC) methods where the originating and receiving laboratories used different suppliers’ instruments.
  • Roche’s analytical development group supports its quality control colleagues for five years after transfer of a procedure. BI quality control staff need access to development records in case quality control identifies an impurity in production and needs to validate it with development. Therefore, ongoing data sharing and collaboration are essential and critical to success for both organizations.


Analytical development: regulatory submissions

Development of analytical procedures requires exploratory work that may lead to further experimentation or a dead end. Development of the design space, including identification of the parameters that need to be controlled and of critical quality attributes (CQAs), is important for the validation.


Data and reports from the development and validation of analytical procedures for NMEs as well as summary results from analysis of the validation batches from initial production are used in a regulatory submission for new products. It’s important from both companies’ perspectives that these data be accessible by quality control.


If a marketing authorization is required for the United States market, the FDA will perform a pre-approval inspection following compliance program guide (CPG) 7346.832.5 Here, the three objectives (readiness for commercial manufacturing, conformance to the application and a data integrity audit) all involve assessment of the integrity of the data submitted to the FDA; where relevant, any referenced data are also in scope. Rapid retrieval of electronic data is required for the inspector to review, and therefore it is essential to know where the electronic data are located.

Quality control: product quality reviews

Under EU and FDA GMP regulations companies are required to perform APRs (FDA GMP) or PQRs (EU GMP). There are differences between the two types of review:

  • 21 CFR 211.180(e)(1): “A review of a representative number of batches, whether approved or rejected, and, where applicable, records associated with the batch.”1
  • EU GMP Chapter 1 clause 1.10: “Regular periodic or rolling quality reviews of all authorized medicinal products, … objective of verifying the consistency of the existing process, the appropriateness of current specifications for both starting materials and finished product, to highlight any trends and to identify product and process improvements. Such reviews should normally be conducted and documented annually.”2


The EU requirement for the PQR is more comprehensive and requires substantial data retrieval and trending. PQR requirements must be designed into laboratory informatics applications so that data can be collated and trended directly, rather than through a multitude of spreadsheets.
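
A minimal sketch of what designing trending in could look like, assuming batch results can be queried from the LIMS as simple records; the field names, example values and 3-sigma limits are illustrative assumptions only.

```python
from statistics import mean, stdev

# Illustrative batch results as they might be returned by a LIMS query
# for the review period (all batches, as EU GMP Chapter 1 requires).
batch_results = [
    {"batch": "A001", "assay_pct": 99.6},
    {"batch": "A002", "assay_pct": 100.3},
    {"batch": "A003", "assay_pct": 98.9},
    {"batch": "A004", "assay_pct": 99.8},
]

values = [r["assay_pct"] for r in batch_results]
avg, sd = mean(values), stdev(values)

# Simple 3-sigma trend limits; any batch outside them is flagged for the PQR.
lower, upper = avg - 3 * sd, avg + 3 * sd
flagged = [r["batch"] for r in batch_results if not lower <= r["assay_pct"] <= upper]

print(f"n={len(values)}, mean={avg:.2f}%, sd={sd:.2f}%, limits=({lower:.2f}, {upper:.2f})")
print("Batches outside trend limits:", flagged or "none")
```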


Mladek and Dathe mentioned that both their companies are involved in the development of additional databases or data lakes for handling PQR data. Data are extracted from source systems such as enterprise resource planning (ERP) and LIMS and then transferred to the data repositories. It is important to understand that data integrity must be maintained during the transfer so that any conclusions drawn are based on firm data. Furthermore, both interviewees mentioned that interpretation of analytical data needs review by analytical scientists who understand the meaning of the data rather than by statisticians who do not.
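
One general technique for demonstrating that integrity is preserved during such transfers is to compute a checksum of the extracted record set and verify it again after loading, as in the hypothetical sketch below; the hashing approach shown is an assumption for illustration, not a description of either company’s pipeline.

```python
import hashlib
import json


def fingerprint(records: list) -> str:
    """Create a reproducible SHA-256 fingerprint of an extracted data set.

    Sorting keys and using a canonical JSON encoding means the same data
    always produce the same hash, regardless of dictionary ordering.
    """
    canonical = json.dumps(records, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


# Extract from the source system (e.g. LIMS) ...
extracted = [{"batch": "A001", "assay_pct": 99.6}, {"batch": "A002", "assay_pct": 100.3}]
source_hash = fingerprint(extracted)

# ... load into the PQR repository, then read it back and compare.
loaded_back = [{"batch": "A001", "assay_pct": 99.6}, {"batch": "A002", "assay_pct": 100.3}]
assert fingerprint(loaded_back) == source_hash, "Data changed in transit - investigate"
print("Transfer verified:", source_hash[:16], "...")
```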

Summary

Collaboration between regulated analytical development and quality control laboratories requires integrity and compliance of the data shared. Method transfer between the two departments is enhanced by using the same informatics solutions. This facilitates data sharing, not just during the method transfer process but also in the years afterwards, should problems or unknown impurities be observed.

References

1. 21 CFR 211 Current Good Manufacturing Practice for Finished Pharmaceutical Products. 2008, Food and Drug Administration: Silver Spring, MD.

2. EudraLex - Volume 4 Good Manufacturing Practice (GMP) Guidelines, Chapter 1 Pharmaceutical Quality System. 2013, European Commission: Brussels.

3. FDA Guidance for Industry Data Integrity and Compliance With Drug CGMP Questions and Answers 2018, Food and Drug Administration: Silver Spring, MD.

4. PIC/S PI-041 Good Practices for Data Management and Integrity in Regulated GMP / GDP Environments Draft. 2021, Pharmaceutical Inspection Convention / Pharmaceutical Inspection Cooperation Scheme: Geneva.

5. FDA Compliance Program Guide CPG 7346.832 Pre-Approval Inspections. 2019, Food and Drug Administration: Silver Spring, MD.