Every Lab is a Regulated Lab… and that may be Good News!
Article Jul 17, 2014
In a recent editorial (May 2014, http://www.technologynetworks.com/blogs.aspx) Helen Gillespie noted that regulations are a feature of work in almost every laboratory and industry. I'd like to elaborate on that point.
When you discuss regulations with people in labs or lab management, you get the sense that regulations are something being done to them, something to be satisfied with the minimum effort possible. Companies don't like to do things that increase operating costs or that appear to slow the pace of research and production. There is, however, another side to the regulatory picture: regulations and guidelines give us a roadmap for how labs should be run. Lab management isn't something covered in undergraduate work; most people learn it on the job or through seminars.
The understanding of regulatory requirements varies a lot with the organizational maturity of companies. Those that have been in place for some time have usually been through the wringer of missing regulatory requirements, and have modified their culture to include regulatory compliance as part of the way work is done. Labs in their startup phase lack that culture unless senior management has been exposed to it and builds it into work practices. One regulatory component encountered in the use of laboratory computer systems and automation programs is validation, whether of methods or of the systems implementing those methods. Researchers and IT support people have stated that "we're in research so we don't have to validate systems". Really? You may not be held accountable for systems validation, but that doesn't mean you shouldn't do it.
The novel “Containment” by Christian Cantrell contains the statement: “The only thing worse than no data is data that you cannot trust”. The point of validation is to build trust into the data your lab is producing. Data you can’t trust will bias your thinking and has the potential to send you down lines of work that waste resources, divert you from productive work, and lead to flawed decisions.
This is of particular concern with instrument–data system combinations whose data processing is done by algorithms whose behavior depends on user-defined parameters. Too often users accept the default settings a vendor ships without confirming that they are appropriate for their particular application. This can be a matter of not knowing that these parameters exist, or of not understanding their use.
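To make the point concrete, here is a deliberately simplified sketch (not any vendor's algorithm) of how a single user-settable parameter can change what a data system reports. The function, trace values, and threshold default below are all hypothetical illustrations:

```python
# Hypothetical illustration: a toy peak counter whose result depends on a
# user-settable threshold, mimicking how a default integration parameter
# in an instrument's data system can silently change reported results.

def count_peaks(signal, threshold=10.0):
    """Count local maxima in `signal` whose height exceeds `threshold`."""
    peaks = 0
    for i in range(1, len(signal) - 1):
        if signal[i] > signal[i - 1] and signal[i] > signal[i + 1] \
                and signal[i] > threshold:
            peaks += 1
    return peaks

trace = [0, 2, 0, 12, 0, 5, 0]               # simulated detector trace
print(count_peaks(trace))                    # vendor default threshold: 1 peak
print(count_peaks(trace, threshold=1.0))     # tuned threshold: 3 peaks
```

The same raw data yields different answers depending on a parameter many users never look at, which is exactly why defaults need to be reviewed against the application at hand.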
When we move to automated systems, the need to thoroughly test, document, and prove both methods and their implementation becomes more acute. Well-designed, well-implemented, and validated systems can meet lab needs by providing high-quality data that advances projects and supports decisions about further work. Systems that do not meet those criteria can produce unreliable results.
ISO 17025 focuses on competence in laboratory work and provides a basis for accreditation programs; its point is to attest to the technical competence of labs that provide testing and calibration services. That same level of competence is also needed in research applications. At the 2012 European Lab Automation meeting (Hamburg, May 30th, 2012), Dr. John Bradshaw (Artel, Inc.) gave a talk titled “The Importance of Liquid Handling Details and Their Impact on Your Assays”, covering the use of the proper components in liquid handling systems (tips, etc.) and making sure that systems were properly calibrated and used correctly, particularly in manual pipetting. Unless the performance of liquid handlers is measured and controlled, results can vary widely and you don’t have data you can trust; his last bullet reads “Not Measuring = Not Knowing”. Knowing requires people who are educated in the use of equipment, validated procedures, and equipment that is routinely checked against laboratory standards. Yes, it takes time, and there is a cost for training, validation, and equipment verification. The result is data you can have confidence in, data that can be relied upon for decision-making. What is that worth to you?
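"Measuring" here can be as simple as a routine gravimetric check: weigh replicate dispenses of water and compute the coefficient of variation (CV). The sketch below illustrates that arithmetic; the masses and the 1% CV tolerance are example values, not figures from any standard or from the talk:

```python
# Minimal sketch of a gravimetric pipette check: weigh replicate dispenses
# of water, then compute the mean mass and coefficient of variation (CV).
# The masses and the 1% tolerance are illustrative assumptions.
from statistics import mean, stdev

def pipette_cv(masses_g):
    """Return (mean mass in g, CV as a percentage) for replicate dispenses."""
    m = mean(masses_g)
    return m, stdev(masses_g) / m * 100.0

weights = [0.0998, 0.1003, 0.0999, 0.1001, 0.1000]  # nominal 100 µL of water
avg, cv = pipette_cv(weights)
print(f"mean = {avg:.5f} g, CV = {cv:.2f}%")
print("PASS" if cv < 1.0 else "FAIL - investigate and recalibrate")
```

A check like this takes minutes on an analytical balance; without it, drift in a pipette or liquid handler goes unnoticed until it has already biased results.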
The modern regulatory environment grew out of a need for labs to upgrade their level of performance, provide reliable data, and build confidence into decision-making. Guidelines shouldn’t be viewed as an unnecessary expenditure of time and resources, but as a sanity check on whether labs are operating according to modern standards of performance.