Every Lab is a Regulated Lab… and that may be Good News!
Article Jul 17, 2014
In a recent editorial (May 2014, http://www.technologynetworks.com/blogs.aspx) Helen Gillespie noted that regulations are a feature of work in almost every laboratory and industry. I'd like to elaborate on that point.
When you discuss regulations with people in labs or lab management, you get the sense that regulations are something being done to them, to be met with the minimum effort possible. Companies don't like to do things that increase operating costs or appear to slow the pace of research and production. But there is another side to the regulatory picture: regulations and guidelines give us a roadmap for how labs should be run. Lab management isn't something covered in undergraduate work, and most people learn it on the job or through seminars.
Understanding of regulatory requirements varies widely with the organizational maturity of companies. Those that have been operating for some time have usually been through the wringer of missed regulatory requirements and have modified their culture to make compliance part of the way work is done. Labs in startup phases lack that culture unless senior management has been exposed to it and builds it into work practices. One regulatory component encountered in laboratory computer systems and automation programs is validation, whether of methods or of the systems implementing those methods. Researchers and IT support people have stated that "we're in research, so we don't have to validate systems". Really? You may not be held accountable for systems validation, but that doesn't mean you shouldn't do it.
The novel "Containment" by Christian Cantrell contains the statement: "The only thing worse than no data is data that you cannot trust". The point of validation is to build trust into the data your lab is producing. Data you can't trust will bias your thinking and has the potential to send you down lines of work that waste resources, divert you from productive work, and lead to flawed decisions.
This is of particular concern with instrument-data system combinations whose data processing is done by algorithms governed by user-defined parameters. Too often users accept the default settings a vendor provides without confirming that they are appropriate for their particular application, either because they don't know the parameters exist or don't understand their use.
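To make the point concrete, here is a minimal sketch (not any vendor's actual software, and the threshold values are illustrative) of how a single user-adjustable processing parameter can change the result extracted from the same raw data. The `count_peaks` function and the simulated trace are hypothetical:

```python
def count_peaks(signal, threshold):
    """Count local maxima in `signal` that rise above `threshold`."""
    peaks = 0
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] > signal[i - 1]
                and signal[i] >= signal[i + 1]):
            peaks += 1
    return peaks

# Simulated chromatogram: two strong peaks and one weak peak near the baseline.
trace = [0.0, 0.1, 0.9, 0.1, 0.05, 0.25, 0.05, 0.1, 1.2, 0.1, 0.0]

print(count_peaks(trace, threshold=0.5))  # a "default" threshold finds 2 peaks
print(count_peaks(trace, threshold=0.2))  # a tuned threshold finds 3 peaks
```

The same raw trace yields two peaks or three depending on one parameter; a user who never looks past the default would simply never see the third component. Validation is what forces that look.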
When we move to automated systems, the need to thoroughly test, document and prove both methods and implementation becomes more acute. Well-designed, implemented, and validated systems can meet lab needs by providing high-quality data. That data will advance projects, and support decisions about further work. Systems that do not meet those criteria can produce results that may be unreliable.
ISO 17025 focuses on competence in laboratory work and provides a basis for accreditation programs. Its point is to attest to the technical competence of those working in labs that provide testing and calibration services. That same level of competence is also needed in research applications. At the 2012 European Lab Automation meeting (Hamburg, May 30th 2012), Dr. John Bradshaw (Artel, Inc.) gave a talk titled "The Importance of Liquid Handling Details and Their Impact on your Assays", covering the use of the proper components in liquid handling systems (tips, etc.) and making sure that systems are properly calibrated and used correctly, particularly in manual pipetting. Unless the performance of liquid handlers is measured and controlled, results can vary widely and you don't have data you can trust; his last bullet reads "Not Measuring = Not Knowing". Knowing requires people educated in the use of equipment, validated procedures, and equipment that is routinely checked against laboratory standards. Yes, it takes time, and there is a cost for training, validation, and equipment verification. The result is data you can have confidence in and rely upon for decision-making. What is that worth to you?
The modern regulatory environment grew out of a need for labs to upgrade their level of performance, provide solid, reliable data, and build confidence into decision-making. Guidelines shouldn't be viewed as an unnecessary expenditure of time and resources, but as a sanity check on whether labs are operating according to modern standards of performance.