Every Lab is a Regulated Lab… and that may be Good News!
Article Jul 17, 2014
In a recent editorial (May 2014, http://www.technologynetworks.com/blogs.aspx), Helen Gillespie made the point that regulations are a feature of laboratory work in almost every laboratory and industry. I’d like to elaborate on that point.
When you discuss regulations with people in labs or lab management, you get the sense that regulations are something being done to them, something to be satisfied with the least effort possible. Companies don’t like activities that increase operating costs or that appear to slow the pace of research and production. But there is another side to the regulatory picture: regulations and guidelines give us a roadmap for how labs should be run. Lab management isn’t something covered in undergraduate work; most people learn it on the job or through seminars.
The understanding of regulatory requirements varies widely with a company’s organizational maturity. Companies that have been in place for some time have usually been through the wringer of failing to meet regulatory requirements, and have modified their culture so that compliance is part of the way work is done. Labs in the startup phase lack that culture unless senior management has been exposed to it and builds it into work practices. One regulatory component encountered in laboratory computer systems and automation programs is validation, whether of methods or of the systems implementing those methods. Researchers and IT support people have stated that “we’re in research so we don’t have to validate systems”. Really? You may not be held accountable for systems validation, but that doesn’t mean you shouldn’t do it.
The novel “Containment” by Christian Cantrell contains the statement: “The only thing worse than no data is data that you cannot trust”. The point of validation is to build trust into the data your lab is producing. Data you can’t trust will bias your thinking, send you down lines of work that waste resources, divert you from productive work, and lead to flawed decisions.
This is of particular concern with instrument-data system combinations where data processing is done by algorithms whose behavior depends on user-defined parameters. Too often users run with the default settings a vendor provides without confirming that they are appropriate for their particular application, either because they don’t know the parameters exist or because they don’t understand their use.
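To make the point concrete, here is a minimal sketch of how a single user-adjustable parameter can change a reported result. The signal, the threshold values, and the `count_peaks` routine are all invented for illustration; they do not represent any vendor’s algorithm or defaults.

```python
def count_peaks(signal, threshold):
    """Count local maxima that rise above `threshold`."""
    peaks = 0
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] > signal[i - 1]
                and signal[i] >= signal[i + 1]):
            peaks += 1
    return peaks

# Synthetic trace: two strong peaks and one small peak near the baseline.
trace = [0.0, 0.1, 5.0, 0.2, 0.1, 0.6, 0.1, 0.0, 4.0, 0.1]

default_threshold = 1.0   # a plausible shipped default
tuned_threshold = 0.5     # tuned for a low-level application

print(count_peaks(trace, default_threshold))  # 2 -- the small peak is missed
print(count_peaks(trace, tuned_threshold))    # 3 -- the small peak is reported
```

The data didn’t change; only a parameter did, and the answer moved from two peaks to three. That is exactly the kind of behavior that validation against known samples is meant to expose.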
When we move to automated systems, the need to thoroughly test, document, and prove both methods and their implementation becomes more acute. Well-designed, well-implemented, and validated systems meet lab needs by providing high-quality data that advances projects and supports decisions about further work. Systems that fall short of those criteria can produce unreliable results.
ISO 17025 focuses on competence in laboratory work and provides a basis for accreditation programs. The point is to attest to the technical competence of those working in labs that provide testing and calibration services; that same level of competence is also needed in research applications. At the 2012 European Lab Automation meeting (Hamburg, May 30th 2012), Dr. John Bradshaw (Artel, Inc.) gave a talk titled “The Importance of Liquid Handling Details and Their Impact on your Assays”, covering the use of the proper components in liquid handling systems (tips, etc.) and making sure that systems were properly calibrated and used correctly, particularly in manual pipetting. Unless the performance of liquid handlers is measured and controlled, results can vary widely and you don’t have data you can trust; his last bullet reads “Not Measuring = Not Knowing”. Knowing requires people educated in the use of equipment, validated procedures, and equipment routinely checked against laboratory standards. Yes, it takes time, and there is a cost for training, validation, and equipment verification. The result is data you can have confidence in and rely upon for decision-making. What is that worth to you?
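What “measuring” looks like in practice can be sketched in a few lines: replicate dispenses are checked against accuracy and precision limits. The target volume, the readings, and the tolerance values below are invented for illustration; they are not Artel’s method, ISO 8655 limits, or any regulatory requirement.

```python
import statistics

def verify_channel(target_ul, measured_ul, max_inaccuracy_pct, max_cv_pct):
    """Check one set of replicate dispenses against accuracy and precision limits.

    Returns (passes, inaccuracy_pct, cv_pct).
    """
    mean = statistics.mean(measured_ul)
    # Systematic error: how far the mean delivered volume is from target.
    inaccuracy = abs(mean - target_ul) / target_ul * 100
    # Random error: coefficient of variation of the replicates.
    cv = statistics.stdev(measured_ul) / mean * 100
    passes = inaccuracy <= max_inaccuracy_pct and cv <= max_cv_pct
    return passes, inaccuracy, cv

# Ten replicate 100 µL dispenses from a gravimetric or photometric check
readings = [99.2, 100.4, 99.8, 100.1, 99.5, 100.6, 99.9, 100.2, 99.7, 100.0]
ok, inacc, cv = verify_channel(100.0, readings,
                               max_inaccuracy_pct=2.0, max_cv_pct=1.0)
print(f"pass={ok} inaccuracy={inacc:.2f}% cv={cv:.2f}%")
```

Run routinely and logged, a check like this is the difference between knowing a liquid handler is performing and assuming it is.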
The modern regulatory environment grew out of a need for labs to upgrade their level of performance, provide solid, reliable data, and build confidence in decision making. Guidelines shouldn’t be viewed as an unnecessary expenditure of time and resources, but as a sanity check on whether labs are operating according to modern standards of performance.