Every Lab is a Regulated Lab… and that may be Good News!
Article Jul 17, 2014
In a recent editorial (May 2014, http://www.technologynetworks.com/blogs.aspx) Helen Gillespie made the point that regulations are a feature of laboratory work in almost every laboratory and industry. I'd like to elaborate on that point.
When you discuss regulations with people in labs or lab management, you get the sense that regulations are something being done to them, something to be satisfied with the least effort possible. Companies don't like to do things that increase operating costs or that appear to slow the pace of research and production. There is another side to the regulatory picture: regulations and guidelines give us a roadmap for how labs should be run. Managing a lab isn't something covered in undergraduate work; most people learn it on the job or through seminars.
The understanding of regulatory requirements varies a great deal with the organizational maturity of companies. Those that have been in place for some time have usually been through the wringer of missed regulatory requirements and have modified their culture to include regulatory compliance as part of the way work is done. Labs in the startup phase lack that culture unless senior management has been exposed to it and builds it into work practices. One regulatory component encountered in the use of laboratory computer systems and automation programs is validation, whether of methods or of the systems implementing those methods. Researchers and IT support people have stated that "we're in research, so we don't have to validate systems." Really? You may not be held accountable for systems validation, but that doesn't mean you shouldn't do it.
The novel "Containment" by Christian Cantrell contains the statement: "The only thing worse than no data is data that you cannot trust". The point of validation is to build trust into the data your lab is producing. Data you can't trust will bias your thinking and has the potential to send you down lines of work that waste resources, divert you from productive work, and lead to flawed decisions.
This is of particular concern with instrument-data system combinations whose data processing is done by algorithms whose behavior is subject to user-defined parameters. Too often users will use the default conditions that a vendor sets up without confirming that they are appropriate for their particular application. This can be a matter of not knowing that these parameters exist or not understanding their use.
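To make the point concrete, here is a minimal sketch, with entirely hypothetical numbers and parameter names, of how a single user-adjustable parameter can change what a data system reports. The "vendor default" threshold and the toy chromatogram below are illustrative assumptions, not taken from any real instrument:

```python
# Illustrative sketch (hypothetical parameters): the same raw signal,
# processed with a vendor-style default peak threshold versus one tuned
# to the application, yields a different peak count -- and different "data".

def count_peaks(signal, threshold):
    """Count local maxima that rise above the detection threshold."""
    return sum(
        1
        for i in range(1, len(signal) - 1)
        if signal[i] > signal[i - 1]
        and signal[i] > signal[i + 1]
        and signal[i] > threshold
    )

# A toy chromatogram: two strong peaks and one weak one near the baseline.
trace = [0, 1, 8, 1, 0, 2, 0, 1, 9, 1, 0]

DEFAULT_THRESHOLD = 5    # hypothetical vendor default, never reviewed
TUNED_THRESHOLD = 1.5    # set after inspecting this assay's baseline noise

print(count_peaks(trace, DEFAULT_THRESHOLD))  # prints 2 -- weak peak missed
print(count_peaks(trace, TUNED_THRESHOLD))    # prints 3 -- all peaks found
```

The logic is trivial on purpose: the lab that never looks at `DEFAULT_THRESHOLD` simply never sees the third peak, and nothing in the output flags that anything was missed.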
When we move to automated systems, the need to thoroughly test, document, and prove both methods and their implementation becomes more acute. Well-designed, implemented, and validated systems can meet lab needs by providing high-quality data. That data will advance projects and support decisions about further work. Systems that do not meet those criteria can produce unreliable results.
ISO 17025 focuses on competence issues in laboratory work and provides a basis for accreditation programs. The point is to attest to the technical competence of those working within labs that provide testing and calibration work. That same level of competence is also needed in research applications. During the 2012 European Lab Automation meeting (Hamburg, May 30th 2012), Dr. John Bradshaw (Artel, Inc.) gave a talk titled "The Importance of Liquid Handling Details and Their Impact on Your Assays", covering the details of using the proper components in liquid handling systems (tips, etc.) and making sure that systems were properly calibrated and used correctly, particularly in manual pipetting. Unless the performance of liquid handlers is measured and controlled, results can vary widely and you don't have data you can trust; his last bullet reads "Not Measuring = Not Knowing". Knowing requires people who are educated in the use of equipment, validated procedures, and equipment that is routinely checked against laboratory standards. Yes, it takes time, and there is a cost for training, validation, and equipment verification. The result is data you can have confidence in and that can be relied upon for decision-making. What is that worth to you?
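In the spirit of "Not Measuring = Not Knowing", the sketch below shows one common way such measurement is done: a gravimetric check, weighing replicate dispenses on an analytical balance and computing accuracy and precision. The tolerance limits and replicate values here are illustrative assumptions, not limits from ISO 17025 or any other standard:

```python
# Hedged sketch of a gravimetric liquid-handler check. The tolerances
# below (2% inaccuracy, 1% CV) are hypothetical examples only; a real lab
# would take its limits from its own SOPs or the applicable standard.
from statistics import mean, stdev

def gravimetric_check(weights_mg, target_ul, density=1.0,
                      max_inaccuracy_pct=2.0, max_cv_pct=1.0):
    """Convert dispensed weights to volumes, then check accuracy and precision."""
    volumes = [w / density for w in weights_mg]  # ~1 mg of water per uL
    inaccuracy = abs(mean(volumes) - target_ul) / target_ul * 100
    cv = stdev(volumes) / mean(volumes) * 100  # coefficient of variation
    return inaccuracy <= max_inaccuracy_pct and cv <= max_cv_pct

# Ten replicate 100-uL dispenses weighed on an analytical balance (mg).
replicates = [99.8, 100.3, 99.6, 100.1, 99.9, 100.2, 99.7, 100.0, 100.4, 99.5]
print(gravimetric_check(replicates, target_ul=100))  # prints True: in tolerance
```

A check like this takes minutes per channel; the alternative is a liquid handler whose drift is discovered only when an assay fails.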
The modern regulatory environment grew out of a need for labs to upgrade their level of performance, provide reliable data, and build confidence into decision-making. Guidelines shouldn't be viewed as an unnecessary expenditure of time and resources, but as a sanity check on whether labs are operating according to modern standards of performance.