The laboratory of the future is digital: data are collected and stored electronically, and equipment is computer-controlled. This shift curbs the errors inherent in manual data collection, and embracing automation and digital solutions improves both efficiency and accuracy in the laboratory.
Explore current trends in laboratory digitization with this Expert Insights publication, tailored for decision-makers and practitioners, and discover Cubis® II as a pioneering solution for the digital laboratory of the future.
Download this eBook to explore:
- Compliance and connectivity solutions in the lab
- How the simple weighing process can be improved through new technologies
- Practical approaches to digitization
Editorial

Dear Readers,

The lab of the future is digital. Correct data is fundamental in a quality control environment: inaccuracies can lead to wrong decisions that compromise product safety and quality, with a negative impact on product costs and, more importantly, on patient health and safety. Today's laboratories are changing rapidly to prevent this problem by becoming "smarter", automating technologies and using digital science solutions. Some of the main challenges in this process of change include:
• Maintaining data integrity
• IT integration
• Creating efficient workflows

This eBook presents an overview of compliance and connectivity solutions and shows how the simple weighing process in the lab can be improved through new technologies that lead to higher sample throughput, workflow efficiency, and reproducibility of data and results. The content of this eBook consists of:
• A summary of Wiley's book "Laboratory Control System Operations in a GMP Environment", discussing a practical approach to the implementation of connectivity and compliance
• Summaries of articles on data integrity within the biopharmaceutical sector, data integrity of healthcare information, and digitalization in laboratories of the pharmaceutical industry
• Case studies
• An interview with a product specialist

Enjoy the read!

Róisín Murtagh, Editor at Wiley Analytical Science

Laboratory control system operations in a GMP environment
Adapted from the book by David M. Bliesner

In recent years, the US Food and Drug Administration (FDA) has observed violations of the current good manufacturing practice (cGMP) regulations concerning data integrity. The FDA defines data integrity as complete, consistent, and accurate data, which should be attributable, legible, contemporaneously recorded, original or a genuine copy, and accurate (ALCOA). Other organizations have extended the acronym to "ALCOA+", adding "complete, consistent, enduring, and available". Data governance, in turn, is defined as the sum of arrangements that assure data integrity. These arrangements ensure a complete, consistent, and accurate record throughout the data lifecycle. Data governance should be integral to the pharmaceutical quality system, should address data ownership, and should consider the design, operation, and monitoring of processes and systems to comply with the principles of data integrity, including control over intentional and unintentional changes to, and deletion of, information.

Data governance and data integrity have become a cottage industry within the regulatory compliance and cGMP consulting community. This increased scrutiny and the actions it requires are a positive development for the industry, but they have also produced a large body of information to access, understand, and apply. This chapter addresses the specifics of data governance and data integrity within the laboratory control system (LCS).
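To make the ALCOA attributes concrete, the sketch below is a minimal illustration (not taken from the book; the field names are hypothetical) of how a single laboratory result could be captured as an attributable, unalterable record.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: the original record cannot be altered in place
class LabResultRecord:
    """Hypothetical record shape reflecting the ALCOA attributes."""
    sample_id: str            # traceable to the tested batch
    analyst_id: str           # Attributable: who generated the data
    instrument_id: str        # where the raw data originated
    value: float              # Accurate: the reported result
    unit: str                 # Legible: unambiguous, human-readable units
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )                         # Contemporaneous: captured at the time of measurement
    is_original: bool = True  # Original, or flagged as a verified true copy

# A corrected value is appended as a new record rather than overwriting the
# original, so the first entry remains part of the audit trail.
original = LabResultRecord("B-1024", "analyst-07", "balance-03", 49.98, "mg")
correction = LabResultRecord("B-1024", "analyst-07", "balance-03", 50.02, "mg",
                             is_original=False)
```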
Description of laboratory data governance and data integrity

The purpose of the data governance system is to implement policies and procedures that allow the full reconstruction of good manufacturing practice (GMP) activities by retrieving complete information relating to the production, testing, and release of a manufactured batch of drug or drug product. The data governance and data integrity system of the laboratory should include at least nine individual components.

I. Data governance policy. The policy will be unique to each organization and should contain the following elements:
• Explanation of the purpose of the data governance system
• Graphical description of the data lifecycle (see Fig. 1)
• Description of roles and responsibilities
• Links between the data governance system and the LCS and its sub-elements
• Lists of the related standard operating procedures (SOPs)
• Lists of data governance and data integrity
• Descriptions of the components shown in the data governance and data integrity hierarchy (see Fig. 2):
  • Procedural controls
  • Technical controls
  • Data maps and data walks
  • Risk identification, ranking, and filtering
  • Data review
  • Data and operational audits
  • Employee awareness and training
  • Management oversight
  • Regulatory and industry data integrity references

Fig. 1: The lifecycle of laboratory data.
Fig. 2: Data governance.

II. Operational procedures. These are SOPs that provide guidance on subjects that directly or indirectly affect how data are generated, processed, reviewed, reported, stored, retrieved, archived, and destroyed. Examples of typical SOP titles associated with laboratory operations that impact data integrity are:
• Laboratory document control system
• Laboratory good documentation practices
• Installation, operational, and performance qualification (IQ/OQ/PQ) of laboratory equipment
• Laboratory equipment lifecycle management
• Laboratory building and facilities security and access control
• Facilities disaster recovery plan
• Analytical test method validation
• Verification of compendial procedures
• Electronic records and signatures
• Electronic records storage, backup, archival, and restoration
• General procedures for computer system validation
• Chromatographic data acquisition software
• Electronic laboratory notebooks (ELNs)
• Laboratory information management system (LIMS)
• Computer system change control procedures
• Validation of spreadsheets
• Validation of databases
• Power failure recovery procedures for computers
• Disaster recovery of electronic data and computer equipment
• Computer system integrity
• Operational maintenance of computer systems and software
• Overview of data governance and data integrity
• Data source and data mapping
• Application of hazard analysis and critical control points (HACCP) to laboratory data integrity
• Personnel compliance program for ensuring laboratory data integrity
• Workflow, sample management, tracking, trending, and release of analytical data
• Analytical data review and approval
• Conducting, documenting, and reporting laboratory investigations, out-of-specification (OOS) and out-of-trend (OOT) investigations
• QA oversight and monitoring of production
• QA oversight and monitoring of QC laboratory operations

III. Technical controls. These are controls inherent in system hardware and software that prevent or restrict users from unauthorized or inadvertent manipulation or deletion of data; they must be verified through qualification or validation procedures. Technical controls are always preferable to procedural controls because they exclude or limit the manipulation or deletion of data or records. FDA recommendations emphasize restricting the ability to alter specifications, process parameters, data, or manufacturing or testing methods, for example by limiting permissions to change settings or data.
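As an illustration of such a technical control, the following is a minimal sketch only (the role names, permission labels, and functions are hypothetical, not a description of any vendor's software) of how system software might allow only authorized roles to alter method parameters while recording every attempt in an audit trail.

```python
from datetime import datetime, timezone

# Hypothetical role-to-permission mapping; a real system would derive this
# from its validated user management configuration.
ROLE_PERMISSIONS = {
    "analyst": {"run_method", "view_data"},
    "lab_manager": {"run_method", "view_data", "edit_method"},
    "qa": {"view_data", "review_data"},
}

audit_trail = []  # append-only record of attempted changes

def change_method_parameter(user, role, method, parameter, new_value):
    """Allow the change only if the role carries the 'edit_method' permission;
    log every attempt, whether permitted or denied."""
    permitted = "edit_method" in ROLE_PERMISSIONS.get(role, set())
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": f"change {method}.{parameter} -> {new_value}",
        "outcome": "permitted" if permitted else "denied",
    })
    if not permitted:
        raise PermissionError(f"{user} ({role}) may not edit method parameters")
    # ... apply the change through the validated pathway here ...
    return True

# An analyst's attempt is denied but still captured in the audit trail.
try:
    change_method_parameter("analyst-07", "analyst", "assay-12", "flow_rate", 1.2)
except PermissionError:
    pass
```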
IV. Data maps and data walks. These are processes to assess and evaluate existing controls over data. The first step is the generation of workflow diagrams of the production and testing steps during manufacture of the product (data maps). Data maps should be as comprehensive and detailed as possible to ensure that all risks to data integrity have been identified. After data mapping, more detailed mapping exercises are executed within the laboratory (data walks): each QC instrument and testing process is mapped to illustrate how data are created, modified, reported, and managed. Data walks can also be used as a training program.

V. Risk identification, ranking, and filtering. This is performed through gap analyses and risk assessment tools. The former documents all observed gaps with linkage to the steps in the data maps; the latter can be accomplished in numerous ways. Table 1 shows an example of a risk ranking and filtering (RRF) tool that can be used to construct a data and operational audit program; an illustrative scoring sketch appears after component VII below.

Table 1: An example of a risk ranking and filtering (RRF) tool.
• C1 = Data criticality: impact on decision making, product quality, and patient safety, namely (A) does the data influence important decisions? and (B) what is the impact of the data on product quality or patient safety? Scoring: High = 10, Medium = 5, Low = 1, NA = 1
• R1 = Risk to data: susceptibility to unauthorized (A) alteration and (B) deletion of data and records. Scoring: High = 10, Medium = 5, Low = 1, NA = 1
• D1 = Detectability: likelihood of detection/visibility of changes, alterations, or deletions of data and records with existing data integrity procedures and practices. Scoring: Easily detected = 1, Might be detected = 5, Difficult to detect = 10, NA = 1
• F1 = Frequency: a qualitative (or, if data are available, quantitative) sense of how often the observation or practice occurs within the organization over time, i.e. (A) an isolated incident, (B) a periodic occurrence, or (C) a recurring issue. Scoring: Rarely = 1, Occasionally = 5, Frequently = 10, NA = 1
• Risk score thresholds: High risk = score > 5000 (red); Medium risk = score between 1000 and 5000 (yellow); Low risk = score < 1000 (green)

VI. Data review. Data review is a standard component of any cGMP laboratory and needs to be driven by SOPs. There are three levels of data review:
• Bench level
• Supervisor/management level
• Quality assurance (QA) level

VII. Data and operational audits. As mentioned earlier, a long-term, QA-led data and operational audit program could be implemented. Taking the risk assessment into account, this program could include:
• Identifying critical oversight points for quality assurance (QA-COP)
• Determining the required level of QA oversight:
  • Level 1: Routine compliance challenges
  • Level 2: Minor compliance challenges
  • Level 3: Major compliance challenges
  • Level 4: Critical compliance challenges
  • Level 5: Out-of-control compliance challenges
• Determining the type of oversight required:
  • Review of samples and data
  • Observation of the execution of individual work tasks
  • Interviews with personnel who perform individual work tasks
  • Confirmation of the performance of work tasks
  • Challenges to the operation of selected LCS sub-systems
  • Witnessing the execution of compliance-related sub-systems
• Creating and implementing audit plans and integrating the audits into the regular, routine QA oversight function (Fig. 3)
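The source does not prescribe an exact calculation, but the score ranges in Table 1 (maximum 10 × 10 × 10 × 10 = 10,000) suggest a simple product of the four factors. The sketch below is therefore only an inference of how such an RRF score might be computed and filtered into the red/yellow/green bands; the multiplicative formula and the function names are assumptions, not taken from the book.

```python
# Illustrative RRF scoring, assuming the overall score is the product
# C1 * R1 * D1 * F1 (an inference from the 1-10,000 range in Table 1).
SCORES = {"high": 10, "medium": 5, "low": 1, "na": 1}
DETECT = {"easy": 1, "might": 5, "difficult": 10, "na": 1}
FREQ = {"rarely": 1, "occasionally": 5, "frequently": 10, "na": 1}

def rrf_score(criticality, risk_to_data, detectability, frequency):
    """Return (score, band) for one observation taken from the data map."""
    score = (SCORES[criticality] * SCORES[risk_to_data]
             * DETECT[detectability] * FREQ[frequency])
    if score > 5000:
        band = "high (red)"
    elif score >= 1000:
        band = "medium (yellow)"
    else:
        band = "low (green)"
    return score, band

# Example: highly critical data that are easy to alter, hard to detect, and
# the practice occurs occasionally -> 10 * 10 * 10 * 5 = 5000, medium band.
print(rrf_score("high", "high", "difficult", "occasionally"))
```

Scores computed this way can then be sorted to prioritize which instruments or processes the data and operational audit program addresses first.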
VIII. Employee awareness and training. Enhancing employee awareness of, and training on, the principles of data governance and data integrity is of paramount importance within pharmaceutical companies, and the most effective way to accomplish this is through direct management involvement. The following outline serves as a starting point for management when designing the data governance and data integrity training program; each organization will, however, have its own requirements and should tailor the program to its individual needs:
• History of data governance and data integrity in the pharmaceutical industry
• Regulations regarding data integrity
• Definitions of data governance and data integrity
• Importance of data integrity
• Policy and procedures regarding data governance and data integrity
• Employee data integrity confidential reporting mechanisms
• Timelines and timing regarding data governance and data integrity instructions

IX. Management oversight. The FDA is very clear on management oversight of the data governance system and data integrity: management at all levels must create and implement a quality culture that prevents lapses in data integrity. Some examples of steps or actions that management could take:
• Develop a functional understanding of the production system