Lab of the Future: Trends and Technologies
eBook
Published: November 22, 2024
Credit: Technology Networks
The introduction of AI, automation and sustainable practices is reshaping lab operations and addressing challenges from data integrity to environmental impact, enhancing efficiency and reducing costs.
Yet, with these changes come complexities – balancing automation and human oversight, maintaining stringent compliance and ensuring data security in increasingly digital lab environments.
This eBook dives into the latest trends and technologies driving the evolution of labs.
Download this eBook to discover:
- How AI and machine learning are streamlining data handling and advancing research
- The latest in data compliance technology for secure, robust and efficient lab operations
- Sustainable lab practices that reduce environmental impact without sacrificing research quality
CONTENTS
- How AI and Machine Learning Can Improve Scientific Data Handling
- The New Technologies Changing the Face of Data Integrity in Drug Discovery
- Top Ten Tips for Data Compliance
- Adapting to Change in Modern Clinical Labs: An Interview With Sean Tucker
- How AI Is Revolutionizing Scientific Research
- Redefining Lab Practices To Prioritize Sustainability
- The Adoption of AI: Critical Concerns in the Life Sciences
- The Museum of Analytical Antiquities
FOREWORD
The “Lab of the Future” is a key topic across industries, including the pharmaceutical sector, where there is much discussion of what Pharma 4.0 will be. The insights in this eBook will help you on your journey into this transformative landscape.
Components of the Lab of the Future are many and varied. A future-ready lab integrates automation,
comprehensive data capture, streamlined data management and proactive insights. In regulated
industries, this work must comply with applicable regulations and guidance. Coupled with these approaches to automation is the elimination of paper printouts and transcription error checks. Data must be acquired electronically at the beginning of the process and remain electronic until the final result is generated.
Artificial Intelligence (AI) and machine learning (ML) are essential to the Lab of the Future, driving
advancements in fields such as cancer research, materials science and drug discovery. Laboratories generate vast amounts of data, and AI can help extract essential information and knowledge. However, there are major concerns around the implementation of AI in the pharmaceutical industry, and questions remain on how best to validate these tools for compliance.
New technologies support compliance, but fostering a quality-first culture is essential, starting with
senior management and extending across teams. Despite fears around revalidation costs, updating
computerized systems is necessary to prevent obsolescence.
Another consideration for any Lab of the Future is sustainability and environmental impact. How can
scientists adapt or introduce changes to meet these twin requirements?
Finally, although the focus so far has been on technologies, the missing part of the equation is the scientists themselves. Introducing new automation or ways of working will inevitably change established working practices. How can this be managed effectively across diverse, multigenerational teams?
We hope this eBook inspires and equips you with the knowledge to advance toward a Lab of the Future.
Bob McDowall, PhD
PREPARE YOUR LAB
FOR THE FUTURE
TRADE UP to the Latest
Technologies from Waters
Now is the perfect time to future-proof
your lab and replace your obsolete and
aging LC or MS systems.
Don’t let budget restrictions stand in the way
We offer finance options and flexible financing solutions to match payments to funding, bridge budget cycles and optimize your budget.
Contact us today to learn more about Waters eligible trade-in offers. We’ll help you create a replacement plan that is right for your lab.
- Ensure reliable operation
- Conform to regulatory guidelines
- Meet evolving business requirements
- Benefit from Waters service and support expertise
New technology you can rely on: stay compliant, avoid risk and improve your productivity.
waters.com/TradeUpNow
Waters is a trademark of Waters Technologies Corporation.
All other trademarks are the property of their respective owners.
©2024 Waters Corporation
How AI and Machine Learning Can Improve Scientific Data Handling
Neil Versel
Credit: iStock
Artificial intelligence and machine learning (AI/ML)
have become widespread in scientific research. A subset
known as generative AI (GenAI) that can produce
detailed text and images with just a few human prompts
was thrust into the general public’s consciousness with
the November 2022 release of ChatGPT. A year later,
ChatGPT became the first nonhuman entity to make
the journal Nature’s annual list of the 10 most important
contributors to science.
A 2023 survey by the same journal found that 30% of
postdoctoral researchers were employing AI chatbots to
generate and edit code, manage literature and refine the
text of their scientific papers. More recently, an Elsevier
survey of corporate scientific R&D professionals
indicated that 96% believe AI will accelerate knowledge
discovery and 93% expect AI to lower business costs.
More than 85% said that these technologies would
improve work quality and free up time to pursue
higher-value projects. However, similarly high numbers
expressed concerns about misinformation, errors and
reduced critical thinking.
GenAI is merely one type of AI/ML, though the lines between GenAI, predictive AI and machine learning – which must be trained on specialized datasets – have been blurring. While forecasts vary widely, the AI/ML field as a whole has been exploding. A 2023 analysis suggested that the global AI market would grow by about 19% annually, to more than $2.575 trillion by 2032. Another analysis predicted 14% annual growth in the market for AI in clinical trials alone, reaching $4.4 billion by 2032.
Nvidia, which makes high-performance computing chips
that fuel AI development, briefly surpassed Microsoft
and Apple in June 2024 as the world’s most valuable
company based on market capitalization.
The technology is not without growing pains, however,
as evidenced by an Nvidia sell-off that shaved $500
billion in value less than two weeks later.
Another Nature article suggested that ChatGPT may be
“corrupting” scientific peer review. In a Harvard Data
Science Review study, researchers found that ChatGPT
creator OpenAI’s GPT-4 large language model became
less accurate between March and June 2023, while
its supposedly less advanced predecessor, GPT-3.5,
improved.
As developers work through these issues, scientific
researchers are finding all kinds of applications for AI/
ML. This listicle will explore how AI/ML is being used
in different areas of scientific research to accelerate
discoveries and improve efficiency.
Applications and benefits of AI/ML
in scientific data handling
Data collection and management
AI can automate data collection and curation for
biomedical research, as well as streamline analysis
at speed and scale no human could achieve. Notably,
predictive AI has been shown to improve the detection
of anomalies and outliers, particularly from medical
images. Generative AI might be useful for upscaling
images to “super-resolution” by creating simulated data
to fill in gaps. It can also summarize massive datasets to
accelerate hypothesis development.
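As a concrete illustration of the anomaly-detection use case described above, the sketch below flags outlying readings with scikit-learn's IsolationForest. It is a minimal, hypothetical example – the data and threshold are invented – and not a description of any specific tool named in this article.

```python
# Minimal sketch: flagging anomalous assay readings with an Isolation Forest.
# The data here are simulated; this is not code from any study cited above.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
readings = rng.normal(loc=100.0, scale=5.0, size=(500, 3))  # "normal" measurements
readings[:5] += 40.0  # inject a few gross outliers to be detected

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(readings)  # -1 = anomaly, 1 = normal

print("flagged rows:", np.where(labels == -1)[0])
```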
Writing code
As ChatGPT is less than two years old, digital health
developers are just beginning to understand its utility.
Recent literature has found that ChatGPT and other
GenAI technologies can act as a “coding copilot” to
fill in technical expertise gaps at medical and scientific
institutions and allow for more rapid development,
testing and troubleshooting.
Investigators from China and South Korea reported
in the journal iMeta that GenAI systems “excel in
identifying and correcting errors in code, quickly
pinpointing syntactical and logical issues, and providing
explanations, thereby accelerating debugging and
serving as an educational resource for developers.”
Target discovery
In 2021, researchers from India’s National Institute of Pharmaceutical Education and Research-Ahmedabad stated that AI had already begun to
“revolutionize” the pharmaceutical industry. According
to their paper, which appeared in Drug Discovery Today,
AI-based predictive models can quickly recognize hit
and lead compounds by predicting molecular structures,
accelerating drug target discovery and validation.
Similar models can also be applied to drug repurposing.
A more recent paper from researchers at Insilico
Medicine in Hong Kong discussed how AI’s strength
in analyzing massive multiomic datasets and biological
networks — including GenAI-derived synthetic data —
is helping to shorten discovery timelines by mitigating
the historical trade-off between high-confidence and
novel targets. There are even specific AI tools that
can quantify both novelty and confidence, helping
investigators choose more promising drug targets.
Clinical trial matching
AI is starting to show its utility in clinical trial design
and setup, particularly when it comes to recruiting
subjects within the confines of complex inclusion and
exclusion criteria by analyzing patient medical records
to predict likely outcomes. It can also help extend
trials to historically underserved populations. So far,
the application of AI for trial matching has been most
prominent in oncology.
Notably, the US National Institutes of Health (NIH)
and its affiliated centers including the National Library
of Medicine (NLM) are testing several AI tools for trial
matching.
NLM, for example, created a prototype large language model called TrialGPT, designed to match patients with trials as accurately as a panel of physicians but more rapidly.
Genomics
Genomics is a natural application for AI and ML
because of the sheer size of the datasets involved. In
variant calling, there has been scientific evidence around
the efficacy of Google’s DeepVariant since at least
2018, but that is far from the only genomics-related
application of predictive and generative technologies.
“Diagnostic rates in hereditary disease, particularly
involving neurodevelopmental disability, have
improved substantially in recent years thanks to AI-based approaches,” a team from molecular laboratory
company Invitae reported in a 2023 review article in
the American Journal of Medical Genetics. Of note, a
group called the Genomic Testing Cooperative has
implemented AI-based algorithms to assist pathologists
in producing molecular profiles of their patients in
pursuit of differential diagnoses of hematologic and
solid tumors.
Another paper, from 2022, went so far as to say that AI
plus genomic and precision medicine “have the potential
to improve patient healthcare.”
Neuroscience
There is an emerging field that Cold Spring Harbor
Laboratory in New York has dubbed NeuroAI, which
focuses on the nexus between AI and neuroscience.
“The upcoming generation of scientists will need to
possess fluency in both domains,” a recent editorial in
Nature Machine Intelligence argued.
A 2023 review article in the journal Sensors explained
how brain-computer interface software can help
detect diseases, predict progression and even control
prosthetic devices. By automating data analysis, the
technology can also lighten the workload of overtaxed
radiologists, rendering them more productive and less
likely to err in image interpretation.
Environmental science
AI can help design more energy-efficient buildings,
monitor emissions of greenhouse gases and other
pollutants, as well as predict shifting weather and
climate patterns. This is not a new concept, as the
American Meteorological Society is preparing to host
its 24th annual conference on AI for environmental
science.
It should be noted, however, that the data centers
behind large AI models in medicine and other industries
can have massive carbon footprints due to their high
energy demands.
The future of AI/ML in scientific
data handling
Not quite two years after the introduction of ChatGPT,
masses of scientific researchers are embracing
AI/ML, including GenAI, for data collection and
management, writing computer code, sorting through
unprecedentedly large genomic datasets, identifying
new drug targets, expanding clinical trial pools and
addressing existential challenges such as climate change.
But many are doing so with a wary eye, as these
emerging technologies make mistakes, as evidenced
by the prevalence of GenAI “hallucinations.” Indeed,
in the Elsevier survey of corporate R&D professionals,
71% of respondents expect results from GenAI tools
to be based on “high-quality trusted sources only.”
A majority said that training an AI/ML model to be
factually accurate and not cause harmful mistakes would
“strongly increase their trust” in this technology. •
The New Technologies Changing the Face of Data Integrity in Drug Discovery
RJ Mackenzie
Credit: iStock
Automation and drug development are a perfect pair.
Improved reliability and increased efficiency are
the most obvious benefits of automation, which can
overcome inconsistency and wasteful processes that
have been the biggest hurdles in the pharmaceutical
industry.
The well-worn figures of the field’s failures are stark
evidence for the need for updated technology – 90% of
drugs fail during development and as much as 60% of
research and development costs are attributable to attrition.
Bringing in technology that can improve these grim statistics is a business imperative. That’s why the introduction of Industry 4.0 – shorthand for the application of technologies like automation, robotics and machine learning to industrial processes – has been rapid and redefining in drug discovery. One important benefit of these innovations is their effect on the robustness and quality of drug development information – that is, on data integrity.
In this article, we’ll look at how data integrity can be
enhanced in a digitalized pharmaceutical field and the
nascent risks posed by embracing these technologies.
The benefits of Industry 4.0
Filipa Mascarenhas-Melo, an assistant professor at
the Polytechnic Institute of Guarda in Portugal, says
that Industry 4.0 innovations offer “unprecedented
opportunities” to pharmaceutical companies.
Enhanced tracking is one of the central promises of the
switch from analog to digital data. While handwritten
experimental reports or assay readouts can be lost,
damaged or forged, automated processes can be
calibrated to record and save each step of a process,
making analysis far easier for companies and regulators.
Dr. Stephen Goldrick, an associate professor of digital
bioprocess engineering at University College London,
called these changes “pivotal”.
“This evolution mitigates concerns related to manual
handling errors and enhances the ease of managing,
storing, retrieving and querying electronic batch
records,” he added.
Mascarenhas-Melo points to systems such as electronic
data management systems (EDMS) and laboratory
information management systems (LIMS), which enable
the tracking of electronic documents and samples
within a workflow. Blockchain-based systems, which
provide fixed ledgers that track data entries, can be used
to follow the whole pharma supply chain. This boosts
“transparency, authenticity and integrity of data related
to drug manufacturing, distribution and sales,” says
Mascarenhas-Melo.
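To make the “fixed ledger” idea concrete, here is a minimal sketch of an append-only, hash-chained record log in Python. It illustrates the tamper-evidence property described above – each entry commits to the previous one – and is an invented illustration, not a real blockchain or any vendor's product.

```python
# Minimal sketch of a tamper-evident, hash-chained ledger (illustrative only).
import hashlib
import json
from datetime import datetime, timezone

class Ledger:
    def __init__(self):
        self.entries = []  # each entry stores its payload and the previous hash

    def append(self, payload: dict) -> str:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "payload": payload,
            "prev_hash": prev_hash,
        }
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(record)
        return record["hash"]

    def verify(self) -> bool:
        # Recompute every hash; any edit to an earlier entry breaks the chain.
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("timestamp", "payload", "prev_hash")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev_hash"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

ledger = Ledger()
ledger.append({"batch": "B-1042", "step": "released", "site": "Plant A"})
assert ledger.verify()  # editing any earlier entry would make this fail
```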
Industry 4.0 tools have also simplified data recovery
after system failures, facilitated secure communication
and data encryption protocols and made quality
management protocols that ensure data integrity easier
to implement.
As digitalization improves workflows for
pharmaceutical companies, regulators in the space must
adapt their own processes to keep on top of these new
technologies. Mascarenhas-Melo co-authored a review
of good automated manufacturing practices, or GAMP,
a series of recommendations for the design of digital
systems published by the International Society for
Pharmaceutical Engineering (ISPE).
The first GAMP was created in response to the
emergence of digital workflows in 1991 and the latest
version – GAMP 5’s second edition, released in
July 2022 – bears little resemblance to its 33-year-old predecessor. “The latest edition of GAMP 5
acknowledges the non-linear, agile and cyclical nature of
modern software development, reflecting the industry’s
shift towards continuous iteration and innovation,”
comments Mascarenhas-Melo.
Regulators, she says, have to prioritize “agility and
responsiveness”. Despite this, the ISPE took 14 years to
release an update to GAMP 5’s first edition, published
in 2008. During this period, many companies’ digital
infrastructure fundamentally changed – whether you
consider the rise of cloud software offerings, which can
be updated and scaled more easily, or the proliferation
of AI tools in the field.
Evergreen principles guide
regulation
Even in the face of these fundamental changes, there
are data integrity practices that have proved to be
enduring cornerstones of regulation. The ALCOA
acronym, which emphasizes that good data should
be Attributable, Legible, Contemporaneous, Original
and Accurate, was first coined by the FDA’s Stan
W. Woollen in the early 1990s. ALCOA (albeit in a
modified form that also considers data’s completeness,
consistency, endurance and availability) still “stands out
for its effectiveness and widespread recognition within
the industry,” says Goldrick. By using these agnostic
principles as a guide, regulators are trying to make their
guidance more adaptable, even in the face of changing
technologies.
“The percentage of violations requiring action has
shown a decline, possibly indicating an improvement
in compliance efforts by pharmaceutical companies,”
Mascarenhas-Melo explains.
While regulators and companies have proved nimble
enough to navigate these changing technologies while
ensuring data integrity, Goldrick suggests making sure
staff and users come along for the ride is an essential
factor that risks being ignored. “The implementation
and maintenance of these systems demand significant
training for users, entailing considerable costs and time
dedicated to training,” he says.
Mascarenhas-Melo highlights new GAMP 5 guidelines
which encourage end users to liaise extensively with
suppliers to make use of their expertise in maintaining
digital systems. In turn, suppliers will have to adapt
to a new, continuous support model where they
work closely with end users long after a sale has been
completed.
The risks of Industry 4.0
Completely digitalized processes have security
vulnerabilities that suppliers and pharmaceutical
companies must prepare for, says Goldrick. “The
reliance on cloud storage elevates the industry’s
exposure to cybersecurity threats, including the
potential for unauthorized access to sensitive or patient-specific information,” he says.
These risks are already evident. A recent ransomware
attack on UnitedHealth, the world’s largest healthcare
company by revenue, involved threats to release patient
data and saw pharmacy operations seize up for days.
This attack happened concurrently with a strike on
pharmaceutical giant Cencora. Ensuring critical healthcare
data’s integrity, and training those asked to steward that
data, is the only way to protect against such attacks.
How to ensure data integrity in an
evolving landscape
Pharma’s match-up with Industry 4.0 technologies is here to stay. The question is no longer whether the field will be changed by these technologies but rather who will be first to leverage the new state of play to their advantage: industry players, regulators or cybercriminals.
To make systems secure and maximize data integrity,
pharmaceutical companies will have to invest in both
hardware and proper training for their staff. The costs
involved may make some hesitate, but Mascarenhas-Melo has a warning for companies who drag their
feet: “Those slow to adopt may find themselves at a
competitive disadvantage, struggling to catch up as their
more digitally mature counterparts forge ahead.” •
ABOUT THE INTERVIEWEES:
Dr. Filipa Mascarenhas-Melo is an assistant professor in the Higher
School of Health at the Polytechnic Institute of Guarda, Portugal and
an integrated researcher at the Institute’s BRIDGES – Biotechnology
Research, Innovation and Design for Health Products – program.
She is also a collaborating researcher in the Department of
Pharmaceutical Technology at the University of Coimbra.
Dr. Stephen Goldrick is a lecturer and associate professor in Digital
Bioprocess Engineering at University College London’s Department
of Biochemical Engineering. He specializes in the application of
mathematical modelling and advanced data analytics to processes
in the biotechnology field.
REFERENCES:
1. van der Graaf PH. Probability of Success in Drug
Development. Clin Pharmacol Ther. 2022;111(5):983–985.
doi: 10.1002/cpt.2568
2. Pedro F, Veiga F, Mascarenhas-Melo F. Impact of
GAMP 5, data integrity and QbD on quality assurance
in the pharmaceutical industry: How obvious is it?
Drug Discov Today. 2023;28(11):103759. doi: 10.1016/j.
drudis.2023.103759
3. Bongiovanni S, Purdue R, Kornienko O, Bernard R.
Quality in Non-GxP Research Environment. In: Bespalov
A, Michel MC, Steckler T, eds. Good Research Practice
in Non-Clinical Pharmacology and Biomedicine.
Springer International Publishing; 2020:1-17. doi:
10.1007/164_2019_274
Top Ten Tips for Data Compliance
Bob McDowall, PhD
Credit: iStock
Data compliance in both regulated and non-regulated
laboratories includes data integrity (can you trust the
numbers) and data quality (can you make a decision).
Implicitly it also includes compliance with applicable regulations or quality standards.1-3 Although there are others, these are my “Top Ten Tips” for data compliance, as shown in Figure 1, which illustrates the relationships and interactions between them.
1. Data compliance
starts at the top
Any organization’s senior management is responsible
for the quality management system (QMS)1–6 and therefore data compliance. Lack of senior management
involvement in the QMS will directly impact the
overall approach required by an organization – e.g.,
quality culture and ethos, the ability to admit mistakes
without blame, compliance procedures and training as
well as Gemba walks by senior management. Senior
management is also responsible for ensuring that systems are current and comply with applicable regulations and standards; non-compliant systems need to be assessed and remediated.
2. Data integrity and compliance is
everybody’s job
Following on from Tip 1 is the message that data
compliance is everybody’s job. It’s an ongoing task that
demands constant attention. To keep data compliance
at the forefront, senior management, laboratory
management and supervisors need to reinforce it. One
way to do this is a Gemba walk, where managers talk
with staff directly to understand their challenges and
emphasize the importance of data compliance. Part of
the Gemba walk involves talking with analysts who work with analytical processes regularly; this can help management understand where there are compliance issues, which can then be raised as improvement ideas.
Linked to Tip 1 is the ability to own up to mistakes and
then understand the root cause – perhaps there is a
problem with an SOP or workflow that is too complex
and needs to be modified.
3. Have you qualified the
instruments and validated
application software?
Analytical instruments and systems must be qualified and/or validated against operating parameters defined in a laboratory user requirements specification (URS) to show they meet their intended use.7 This should be coupled with point-of-use or system suitability checks performed on the day of analysis (e.g., of an analytical balance or pH meter). This means that the laboratory can rely on the data generated by the instrument.
Many instruments are controlled by an application
installed on a separate computer that acquires, processes,
stores and reports results. This software needs to
be validated for intended use requiring a URS and
configuration of the software before testing against the
user requirements.
Documenting software configuration is important and will cover the following points (a configuration sketch follows the list):
• System settings: Use electronic signatures and
include the reason for a change in audit trail entries
• Workflows: Define workflows that enforce
procedures rather than relying on procedural
controls. This is discussed in more detail in Tip 7
• User roles: Confirm who can do what by defining
different user roles and their access privileges
• Account management: Ensure users are allocated
to roles with no conflicts of interest (e.g., users with
administrator privileges)
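Below is a hedged sketch of what such a configuration specification might capture, expressed as a plain Python structure. The settings, workflow steps and role names are invented examples, not taken from any particular product.

```python
# Hypothetical configuration specification for a lab data system (illustrative).
# Names, roles and settings are invented examples, not from a real product.
CONFIGURATION = {
    "system_settings": {
        "electronic_signatures": True,
        "audit_trail_requires_change_reason": True,
    },
    "workflows": {
        # Technical controls: the sequence is enforced by the application.
        "sample_analysis": ["acquire", "process", "sign", "second_person_review"],
    },
    "user_roles": {
        "analyst": {"can": ["acquire", "process", "sign"]},
        "reviewer": {"can": ["second_person_review"]},
        "administrator": {"can": ["manage_accounts"]},  # no analytical privileges
    },
    "account_management": {
        # Separation of duties: no user may hold both admin and analyst roles.
        "forbidden_role_combinations": [("administrator", "analyst")],
    },
}
```

Placing a specification like this under change control means that any edit to these settings is itself reviewed and traceable.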
[Figure 1 shows the ten tips as an interacting cycle: (1) data compliance starts at the top; (2) data integrity and compliance is everyone's job; (3) qualified instruments and validated software; (4) validated analytical procedures; (5) problem processes and procedures; (6) know your data lifecycles; (7) implement technical controls; (8) complete data means complete data; (9) OOS investigations; (10) managing the human factor.]
Figure 1: Interaction of the Top Ten Tips for Data Compliance. Credit: Technology Networks.
Both the URS and the configuration specification must
be placed under change control and updated when the
application is updated. An earlier article on how to handle
software updates is available.8
4. Validated analytical procedures
under actual conditions of use
With trained staff using qualified and validated analytical
instruments and systems, analytical procedures must
be developed and validated under actual conditions
of use.1,9,10 This includes procedures transferred from another laboratory or taken from a pharmacopoeia. Ideally,
experimental design software should define the design
space and identify critical parameters that need to be
controlled. The procedure is then validated and used to
deliver reliable results in operational use.
5. Problem processes and
procedures?
Are your business processes, both manual and
computerized, overly complex and difficult? Are your
computer systems hybrid (electronic records with
signed paper printouts)? Are calculations performed in
spreadsheets with manual data input?
These are signs that your processes and systems are
slow and inefficient. A second-person review of an
analysis could likely take longer than the analysis itself.
Paper printouts coupled with manual input to computer
systems result in multiple transcription error checks
which are slow and error prone. It is far better to have an
electronic and validated process to streamline both the
analysis and review.
Are your SOPs and analytical procedures overly
complex? This could make it difficult for an analyst to
follow. On the flip side, if they’re overly simplistic an
analyst might be forced to make assumptions. Don’t wait
until there is a document review scheduled, get feedback
from the analysts who work with the procedures and
make any changes as necessary.
6. Know your data lifecycles
Each analytical procedure will generate data and
metadata but not all analytical procedures are created
equal.
Some simpler procedures are observation tests such as
color, appearance or odor. Here the data would be a single
observation along with the associated metadata such as
analyst, date/time, batch or lot number, etc.
For more complex spectroscopic and chromatographic
analytical procedures, the data lifecycle is more
complex with on-the-day checks of instruments, sample
preparation, instrumental analysis, interpretation and
generation of the reportable result. Knowing the data to
be collected for each procedure is critical for ensuring
that they are captured, interpreted and reported
correctly.11
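As a sketch of the point about capturing data together with its metadata, the structure below contrasts the record a simple observation test might produce with what a chromatographic run would need. The field names are illustrative assumptions, not a mandated schema.

```python
# Illustrative records only; field names are assumptions, not a mandated schema.
from dataclasses import dataclass
from typing import Any

@dataclass
class LabRecord:
    analyst: str
    timestamp: str        # date/time of the observation or run
    batch: str            # batch or lot number
    data: dict[str, Any]  # the result itself plus procedure-specific metadata

simple_observation = LabRecord(
    analyst="analyst_a", timestamp="2024-11-22T09:15:00Z", batch="LOT-0042",
    data={"test": "appearance", "result": "clear, colorless solution"},
)

chromatographic_run = LabRecord(
    analyst="analyst_a", timestamp="2024-11-22T10:02:00Z", batch="LOT-0042",
    data={
        "system_suitability": {"passed": True},      # on-the-day instrument check
        "sample_preparation": {"dilution": "1:10"},
        "raw_data_file": "run_0042.cdf",             # acquired instrument data
        "reportable_result": {"assay_percent": 99.2},
    },
)
```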
7. Implement technical controls
As mentioned in Top Tip 3 about defining workflows, there
are two options for controlling a process: procedural
controls (via an analytical procedure or SOP) or
technical controls (enforcing an electronic workflow in
an application). Of the two it is preferable to implement
and validate technical controls. Although there may be
an overarching procedure, it is the technical controls that
will enforce it.
Have you ever forgotten to initial and date a document
or missed a signature off a record? Technical controls
will ensure that after a user has logged on, all work will
be attributed to them via their user identity and date/
time stamped. If a record requires a signature, then the
workflow will ensure that this occurs at the correct
place in the workflow. A review cannot occur until the performer of the work has electronically signed the record set; furthermore, if the correct control is enabled, the performer cannot review and approve their own work.
This approach, prioritizing technical over procedural
controls, has been identified for inclusion in the update of
EU GMP Annex 11.12
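As an illustration of how technical controls can enforce the signing-and-review sequence just described, here is a minimal sketch of a record workflow that refuses out-of-order actions and self-review. It is a toy model of the principle, not any vendor's implementation.

```python
# Minimal sketch: a record workflow enforcing sign-then-review and no self-review.
class RecordSet:
    def __init__(self, performer: str):
        self.performer = performer
        self.signed = False
        self.reviewed = False

    def sign(self, user: str) -> None:
        if user != self.performer:
            raise PermissionError("Only the performer of the work may sign it.")
        self.signed = True  # attributed to the logged-on user; timestamped in practice

    def review(self, user: str) -> None:
        if not self.signed:
            raise RuntimeError("Review cannot occur before the performer has signed.")
        if user == self.performer:
            raise PermissionError("The performer cannot review their own work.")
        self.reviewed = True

record = RecordSet(performer="analyst_a")
record.sign("analyst_a")
record.review("reviewer_b")  # succeeds; record.review("analyst_a") would raise
```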
8. Complete data means complete
data
FDA GMP 211.194(a) requires complete data from laboratory tests1 to ensure integrity and compliance.
This is simple to interpret: everything captured during
an analysis including all problems, instrument failures,
etc. If a procedure or data system does not specify all the data and metadata to collect during an analysis, how does a reviewer or quality assurance specialist know whether the data are complete? In contrast, GLP requires raw data, defined as the original observations and activities necessary to reconstruct the study report – the same requirement expressed differently.2,13
It also means that analysts can’t be selective in what
is being reported – e.g., if one set of results is out of
specification you can’t ignore them and repeat the work
until a passing result is obtained. This would require an
investigation to determine the root cause, which leads us
to Top Tip 9.
9. Mishandling deviations including
OOS investigations
Laboratories are required to document and investigate deviations.1,14 However, many laboratories label these as “incidents”, which is not a GMP term and is a way of hiding deviations. The FDA has published two
versions of the Out-of-Specification (OOS) guidance
for the investigation of aberrant results15,16 that outlines
the process for laboratory investigations to find an
attributable root cause for the deviation. The major
problem with many investigations is that they are either
not scientifically sound or that the root cause is attributed
to “human error”. Too many of the latter raise questions
about the quality and training of analytical staff.
10. Managing the human factor
Even if you have Top Tips 1–9 implemented effectively in
a laboratory, it’s important to remember that all processes
and computerized systems are operated by humans,
who can make mistakes. Technical controls can reduce
but not eliminate problems caused either deliberately
or in error. This is where Top Tip 1 comes into play and
comes full circle back to how a laboratory trains its staff
for integrity and compliance. Staff must be educated
on what constitutes prohibited actions (e.g., reporting
selective results, deleting data, etc.) and what actions
are permitted. In addition, when mistakes are made,
they must be reported and documented. In a non-blame
culture, the reason for the mistake must be established
and looked at as an opportunity to improve.
Summary
As can be seen from Figure 1, data compliance is not a single item but a series of interrelated activities predicated on active senior management involvement.
Data compliance is everybody’s job and that must be
reinforced by management throughout the organization.
However, it comes down to how individuals work and act,
which is the key to data compliance.
REFERENCES
1. United States Food and Drug Administration. 21 CFR
211 - Current Good Manufacturing Practice for Finished
Pharmaceuticals. Silver Spring, MD: United States Food and
Drug Administration. 1978. https://www.ecfr.gov/current/
title-21/chapter-I/subchapter-C/part-211.
2. United States Food and Drug Administration. 21 CFR 58
- Good Laboratory Practice for Non-Clinical Laboratory
Studies. Silver Spring, MD: United States Food and Drug
Administration. 1978. https://www.ecfr.gov/current/title-21/
chapter-I/subchapter-A/part-58
3. International Organization for Standardization. ISO/IEC
17025:2017 - General requirements for the competence of
testing and calibration laboratories. Geneva: International
Organization for Standardization. 2017. https://www.iso.org/
obp/ui/#iso:std:iso-iec:17025:ed-3:v1:en
4. United States Food and Drug Administration. FDA Guidance
for Industry Data Integrity and Compliance With Drug CGMP
Questions and Answers. Silver Spring, MD: United States
Food and Drug Administration. 2018. https://www.fda.gov/
regulatory-information/search-fda-guidance-documents/
data-integrity-and-compliance-drug-cgmp-questions-and-answers
5. Pharmaceutical Inspection Co-operation Scheme.
PIC/S PI-041 Good Practices for Data Management and
Integrity in Regulated GMP / GDP Environments. Geneva:
Pharmaceutical Inspection Co-operation Scheme. 2021.
https://picscheme.org/docview/4234
6. World Health Organisation. WHO Technical Report
Series No.996 Annex 5 Guidance on Good Data and
Records Management Practices. Geneva: World Health
Organisation. 2016. http://academy.gmp-compliance.org/
guidemgr/files/WHO_TRS_996_ANNEX05.PDF
7. United States Pharmacopoeia. USP General Chapter <1058>
Analytical Instrument Qualification. Rockville, MD: United
States Pharmacopoeia Convention Inc. https://doi.usp.org/
USPNF/USPNF_M1124_01_01.html
8. McDowall RD. How to future-proof your LIMS: handling
software updates. Technology Networks. https://www.
technologynetworks.com/tn/how-to-guides/how-to-future-proof-your-lims-handling-software-updates-370757.
Published 2023. Accessed April 2024
9. United States Pharmacopoeia. USP General Chapter <1220>
Analytical Procedure Lifecycle. Rockville, MD: United States
Pharmacopoeia Convention Inc. 2022. https://www.uspnf.
com/sites/default/files/usp_pdf/EN/USPNF/usp-nf-notices/
gc-1220-pre-post-20210924.pdf
10. International Council for Harmonisation of Technical
Requirements for Pharmaceuticals for Human Use (ICH).
ICH Q2(R2) Validation of Analytical Procedures, Step 4 Final.
Geneva: ICH. 2023.
11. McDowall RD. Data Integrity and Data Governance:
Practical Implementation in Regulated Laboratories. The
Royal Society of Chemistry; 2018. doi: 10.1039/9781788013277
12. European Medicines Agency. Concept Paper on
the Revision of Annex 11 of the Guidelines on Good
Manufacturing Practice for Medicinal Products –
Computerised Systems. 2022. https://www.ema.europa.eu/
en/documents/regulatory-procedural-guideline/concept-paper-revision-annex-11-guidelines-good-manufacturing-practice-medicinal-products_en.pdf
13. Organisation for Economic Co-operation and Development
(OECD). OECD Series on Principles of Good Laboratory
Practice and Compliance Monitoring Number 1, OECD
Principles on Good Laboratory Practice. Paris: OECD. 1998.
14. European Commission. EudraLex - Volume 4 Good
Manufacturing Practice (GMP) Guidelines, Chapter
1 Pharmaceutical Quality System. Brussels: European
Commission. 2013. https://health.ec.europa.eu/document/
download/e458c423-f564-4171-b344-030a461c567f_
en?filename=vol4-chap1_2013-01_en.pdf
15. United States Food and Drug Administration. FDA Guidance
for Industry Out of Specification Results. Rockville, MD: US
Food and Drug Administration. 2006. https://www.fda.gov/
media/71001/download
16. US Food and Drug Administration. FDA Guidance for
Industry, Investigating Out-of Specification (OOS) Test
Results for Pharmaceutical Production. Silver Spring. MD: US
Food and Drug Administration. 2022. https://www.fda.gov/
media/158416/download
Adapting to Change in Modern Clinical Labs: An Interview With Sean Tucker
Alex Beadle
Credit: iStock
As the technological advancements of the Lab of the
Future unfold, laboratory leaders are tasked not only
with implementing innovation but also with fostering
a flexible, multi-generational workforce in a rapidly
evolving field.
Sean Tucker, laboratory director of North Kansas
City Hospital, was invited to answer your questions
about the complexities of being a modern lab leader in
Technology Networks’ Ask Me Anything session.
Alex Beadle (AB): How can communication
methods be adapted to effectively engage
with all generations entering the lab?
Sean Tucker (ST): We don’t all fit in a box, but
we all have our strengths. It’s important to understand
that there are individuals who receive information
differently. As a leader, you must be able to respond to
these differences.
If you strictly communicate via email, it’s not going to
be successful, and it’s not going to create relationships
between each of your employees. Email is obviously a
central component of an organizational structure, so our
expectation is that every employee looks at their inbox
at least once every single day.
We use daily huddles in our core clinical laboratory,
microbiology laboratory, transfusion medicine
department and in our anatomic pathology department.
These huddles involve meeting as a small team for 5 to
10 minutes. If there are any concerns or issues that take
additional time, resources or commitment, these are set
aside for a conversation outside of the huddle.
In a 24/7 operation, staffing issues are constant, and so we have had to adapt how we
communicate with individuals, because we can’t expect
them to check work emails at home. One way of
working around this is implementing group text options.
AB: What skills are essential for lab
leaders today, and what are the best
ways to develop them?
ST: You have to be able to be flexible, and you cannot
get discouraged when there are challenges ahead of you.
As a leader, you’re in the position to be able to support
individuals, but you’re not responsible for coming up
with every solution. We are scientists. We are highly
trained and capable individuals. Not everyone is able
to manage complex situations or projects on their own,
but that’s where leadership comes in. I think an integral
component of leadership is to allow everyone to seize
opportunities and help guide you as a leader.
I have used the DISC personality test and
StrengthsFinder, which are great to better understand
an individual’s strength components. It is also helpful to
find out what your leadership style is. I have found that a
servant leadership style has worked for me the best.
Scientists are quick to understand who is authentic and
who is not authentic, and so we have to be cautious
about being overconfident as we approach complex
situations. But, you have to be confident enough to
manage a project and make sure that it’s completed on
time.
AB: What does the Lab of the Future
mean for clinical labs?
ST: We have already seen the integration of automation
within our laboratories over the last several decades and
we are going to continue to see automation in clinical
and research laboratories.
In the United States, we are expecting approximately
30,000 retirements in the next few years, with about
10,000 graduates replacing them, so staffing shortages
are going to be one of the top two major issues in the
upcoming decades, and automation will likely be part of
the solution.
I imagine that in upcoming years we will see the
introduction of robotic phlebotomists (ultrasound
guided robotics that are experts in obtaining blood
through a venipuncture procedure). We’re also starting
to see the use of drones for laboratory specimen
transport, specifically for organ donation. People want
their health care delivered to them, so making things
adaptable and more convenient is going to be integral
moving forward.
Artificial intelligence (AI) is also going to help us
become more efficient. AI will not replace scientists,
pathologists or cytologists but it will allow us to be more
efficient and increase safety.
In the United States, hospitals are using different types
of systems that do not communicate with each other.
This means a patient’s record can be different from one
state to the next. I think we’re going to be able to use AI to mine that data from their previous hospital.
I think right now, we’re too isolated. By integrating data
and optimizing AI to help us mine patient data, we’ll
start to see some really positive outcomes in the next 10
to 20 years.
AB: What advice would you have for
early career scientists who are aspiring to
become lab leaders themselves?
ST: One thing I have always done is try to put myself
in an uncomfortable situation. Too often we don’t
detour from our comfortable positions to explore new
opportunities. Try to detour as often as you possibly can
within reason, both in your professional experience as
well as in your personal life.
I have received some of the best leadership advice
through my experience as a husband and father, and
learning what communication works well. It doesn’t
have to always be in the laboratory where you develop
your leadership skills. You have to be genuine, truthful
and as transparent as you possibly can be.
Whether it’s through travel, meeting friends or
dating, put yourself into situations that make you a
little uncomfortable and see where that pushes you.
I guarantee you’re going to learn from that, and then
you’re going to be able to take these stories and skill sets
into leadership.
AB: What are the biggest challenges
facing laboratories today, and how are
you looking at addressing these moving
forward?
ST: Staffing and employment will continue to be a
challenge. For the last 20 years, I have been going into
schools and talking about my career to make sure that kids understand that a nursing degree or a physician’s degree are not the only opportunities in healthcare. There is a whole world of health sciences
that you can participate in. Once you get to the college
level, it becomes challenging to change your degree, so
you have to reach them at a younger age.
The other challenge is how we reimburse testing. We
have to be able to cover the cost per test, and employees,
their benefits and electricity all have to be included in
the cost analysis to see if we can afford to perform the
test in our laboratories. What we get reimbursed drives
what testing we can perform. We have to advocate to our government structures for the gold standard tests that support patient outcomes, showing that investing in the laboratory can have an incredible impact on long-term patient healthcare.
One example we’re currently working on here in the laboratory is looking at new microbiology identification technology that allows us to diagnose an infection sooner, so that we can get patients on the appropriate antibiotic therapy and discharge them from the hospital as quickly and safely as possible. The quicker you can safely discharge a patient from the hospital, the sooner a bed opens for a new patient, and in the United States, we get reimbursed based upon patient stay.
Managing those challenges with reimbursement is kind
of the next thing that keeps me up at night.
ABOUT THE INTERVIEWEE:
Sean Tucker is the director of Laboratory Services at North Kansas
City Hospital. He is also the Chair-Elect of the Council of Laboratory
Management and Administration at the American Society for
Clinical Pathology, where he was also awarded the “Forty under
40” award.
How AI Is Revolutionizing Scientific Research
Kate Robinson
Credit: iStock
Artificial intelligence (AI) is advancing scientific
research, transforming traditional methods and opening
new avenues for discovery. From accelerating material
development to providing insights into cancer risk, AI’s
capabilities extend across diverse fields.
As AI continues to evolve, its role in research is
advancing the pace of discovery and setting the stage
for innovations that were previously thought to be years
away. Here, we explore recent research utilizing AI.
AI Speeds Up the Discovery of
Quantum and Energy Materials
In a study published in Advanced Materials, researchers
report on the development of an AI model that predicts a material’s optical properties across a wide range of light frequencies.
Understanding the optical properties of materials is
essential for developing optoelectronic devices, such
as LEDs, devices that are pivotal in the semiconductor
industry’s current resurgence. However, traditional
means of calculation prevent the rapid testing of a large
number of materials.
The new AI model is able to predict the optical
properties of a material with the same accuracy as
quantum simulations using only crystal structures. This
makes it suitable for a wide variety of applications, such
as screening materials for high-performance solar cells
and detecting quantum materials.
“Machine-learning models utilized for optical prediction
are called graph neural networks (GNNs),” said Ryotaro
Okabe, a chemistry graduate student at MIT. “GNNs
provide a natural representation of molecules and
materials by representing atoms as graph nodes and
interatomic bonds as graph edges.”
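To make the graph idea in Okabe's description concrete, the sketch below encodes a small molecule (water) as nodes and edges. It is a hand-rolled illustration of the representation GNNs consume, not the MIT group's model or code.

```python
# Illustrative only: representing a molecule (water) as a graph, the input
# format that graph neural networks consume. Not the model from the study.
atoms = ["O", "H", "H"]        # graph nodes: one per atom
bonds = [(0, 1), (0, 2)]       # graph edges: O-H interatomic bonds

# A simple adjacency list, as a GNN's message-passing step would traverse it.
adjacency = {i: [] for i in range(len(atoms))}
for a, b in bonds:
    adjacency[a].append(b)
    adjacency[b].append(a)

print(adjacency)  # {0: [1, 2], 1: [0], 2: [0]}
```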
AI Reveals Sex-Specific Risks
Associated With Brain Tumors
By examining how brain tumors differ between men and
women using AI, researchers found distinct patterns
in tumor incidence, progression and response to
treatment.
Glioblastoma is one of the most aggressive forms
of cancer, with a median survival of 15 months after
diagnosis. However, pinpointing the characteristics that
might help doctors forecast which tumors are likely to
grow more quickly has previously proven elusive.
In the study, researchers leveraged the computational
power of AI models to probe large volumes of medical
images and find patterns. The AI model was trained to
recognize the unique characteristics of tumors using
data from more than 250 studies of glioblastoma
patients. The model was then trained to identify any
patterns between these characteristics and patients’
survival time while accounting for their sex.
“There’s a ton of data collected in a cancer patient’s
journey,” said Pallavi Tiwari, an associate professor
in the Department of Radiology at the University of
Wisconsin. “Right now, unfortunately, it’s usually
studied in a siloed fashion, and this is where AI has huge
potential.”
AI Helps Decode Infant Behavior
In a study published in Scientific Reports, researchers
used AI to reveal that foot movements are crucial
for understanding how infants connect with their
environment.
The baby-mobile experiment uses a colorful mobile
gently tethered to an infant’s foot. When the baby kicks,
the mobile moves, linking their actions to what they see.
This setup helps researchers understand how infants
control their movements and discover their ability to
influence their surroundings.
In the study, researchers tracked infant movements with
a 3D capture system and applied various AI techniques
to examine which methods best captured the nuances
of infant behavior across different situations and how
movements evolved over time.
Both machine and deep learning methods accurately
classified clips of infant movements as belonging to
different stages of the experiment. Foot movements
showed the highest change and accuracy rates.
“Adults can follow instructions and explain their actions,
while infants cannot. That’s where AI can help. AI
can help researchers analyze subtle changes in infant
movements, and even their stillness, to give us insights
into how they think and learn, even before they can
speak. Their movements can also help us make sense
of the vast degree of individual variation that occurs as
infants develop,” said Nancy Aaron Jones, a professor
in the Department of Psychology at Florida Atlantic
University.
AI Model ‘SPOT’ Accurately
Predicts Substrate Transport
An AI model has demonstrated impressive accuracy
in predicting substrate transport across cellular
membranes, a critical insight for drug development.
Transport proteins are responsible for the ongoing
movement of substrates into and out of a biological
cell and therefore determine the effectiveness of
drug delivery. To increase the ease of determining
which substrates a specific protein can transport,
bioinformaticians at Heinrich Heine University
Düsseldorf (HHU) developed the model, named SPOT,
to predict this with a high degree of accuracy.
The model was trained on 8,500 transporter-substrate
pairs converted into numerical vectors. This allowed
the vector for a new transporter and potentially suitable
substrates to be entered into the AI system. The model
was then used to predict the likelihood that certain
substrates will match the transporter.
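The sketch below illustrates the general recipe just described – numeric vectors for a transporter and a candidate substrate, concatenated and scored by a trained classifier. The vectors, labels and model here are toy stand-ins; the real SPOT model uses learned protein and molecule embeddings, not random data.

```python
# Toy illustration of transporter-substrate matching; not the actual SPOT model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-ins for learned embeddings: a 32-dim transporter vector concatenated
# with a 32-dim substrate vector for each known transporter-substrate pair.
n_pairs = 1000
X = rng.normal(size=(n_pairs, 64))
y = (X[:, :32].mean(axis=1) + X[:, 32:].mean(axis=1) > 0).astype(int)  # synthetic labels

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new transporter against a candidate substrate.
new_pair = np.concatenate([rng.normal(size=32), rng.normal(size=32)])[None, :]
print("predicted match probability:", clf.predict_proba(new_pair)[0, 1])
```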
Discussing the applications of this model, Dr. Martin
Lercher, a professor of computational cell biology at
HHU said: “In biotechnology, metabolic pathways
can be modified to enable the manufacture of specific
products such as biofuels. Or drugs can be tailored to
transporters to facilitate their entry into precisely those
cells in which they are meant to have an effect.”
AI Model Can Reveal the
Structures of Crystalline Materials
Research published in the Journal of the American
Chemical Society demonstrates the development of
an AI model that can improve the determination of
powdered crystal structures in X-ray crystallography.
Crystalline materials are made of lattices that consist
of many identical, repeating units. For materials that
exist only as powdered crystal, solving these structures
becomes much more difficult because the fragments
don’t carry the full 3D structure of the original crystal.
The new model, called Crystalyze, was trained on
simulated X-ray diffraction patterns for thousands of
materials produced by an existing AI model. Crystalyze
determines the size and shape of the lattice and which
atoms will go into it. It then predicts the arrangement
of atoms within the lattice. For each diffraction pattern,
the model generates several possible structures, which
can be tested by feeding the structures into a model that
determines diffraction patterns for a given structure.
When tested, the model was accurate about 67% of
the time and able to determine structures for over 100
previously unsolved patterns.
“Our model is generative AI, meaning that it generates
something that it hasn’t seen before, and that allows
us to generate several different guesses,” said Eric
Riesel, MIT graduate student. “We can make a hundred
guesses, and then we can predict what the powder
pattern should look like for our guesses. And then if the
input looks exactly like the output, then we know we got
it right.”
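The generate-and-test loop Riesel describes can be summarized as a short sketch. The helper functions passed in below are hypothetical placeholders standing in for the generative model and the forward diffraction simulator; they are not Crystalyze's actual API.

```python
# Pseudocode-style sketch of a generate-and-verify loop like the one described.
# `generate_candidates` and `simulate_pattern` are hypothetical placeholders
# for the generative model and forward diffraction simulator, not a real API.
import numpy as np

def solve_structure(observed_pattern, generate_candidates, simulate_pattern,
                    n_guesses=100, tolerance=0.05):
    best, best_err = None, np.inf
    for candidate in generate_candidates(observed_pattern, n=n_guesses):
        predicted = simulate_pattern(candidate)  # forward model: structure -> pattern
        err = np.mean((predicted - observed_pattern) ** 2)
        if err < best_err:
            best, best_err = candidate, err
    # Accept a guess only if its simulated pattern closely matches the input.
    return best if best_err < tolerance else None
```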
AI and the Lab of the Future
The Challenges Facing AI in the Lab of the Future
A Pistoia Alliance report looked at what life science leaders thought were the greatest barriers to their adoption of AI and other lab digitization. The original infographic charted the percentage of respondents who identified each issue among their top three obstacles: unstructured data, data silos/no access to data, FAIRification of data, inability to share/transfer analytical methods, ontology management, lack of metadata standardization, lack of infrastructure and tooling, resistance to data sharing/collaboration, lack of knowledge and expertise, lack of data governance within organizations, and other.
Unstructured Data
Even if data is available to train and power AI
systems, it still has to be in an appropriate
format for the AI to parse. Messy datasets, like
unclear handwritten notes, compromise the
usefulness of tools like NLP.
Data Silos/No Access to Data
Data silos and access problems are common
within bulky company structures. Maximizing
AI’s potential requires different sources
of data to be made available for training
and analysis; cracking open these silos is
essential work.
Resistance to Data Sharing and
Collaboration
Data systems can be upgraded overnight;
changing attitudes towards data sharing
can be a generational task. Academics
and researchers taught to preserve their IP
and afraid of being scooped can block the
availability of data for AI systems.
Lack of Metadata Standardization
AI systems are useful because they
analyze not just data, but also the data that
provides details about that data – metadata.
By investing in informatics solutions like
laboratory information management systems
(LIMS), metadata can be organized and
mandated into lab practices.
Redefining Lab Practices To Prioritize Sustainability
Laura Elizabeth Lansdowne
Credit: iStock
Scientists around the world are becoming increasingly conscious of the environmental impact of their research; however, the path to adopting sustainable practices can be complex.
While some scientists may be uncertain about how to
initiate or prioritize changes to counteract this footprint,
others may not fully appreciate the broader advantages of
implementing such changes.
Challenges such as a lack of understanding, insufficient
accountability among staff, and a need for more
encouragement and support can hinder progress.
Addressing these barriers is key to fostering a culture
where sustainable practices are the norm – not the
exception.
In this article, we explore ways scientists can adapt
their day-to-day practices to reduce the environmental
footprint of their labs. We also highlight expert advice,
initiatives and case studies that support these changes
and showcase their effectiveness.
Practical steps towards a
sustainable lab
Laboratories typically use five to ten times more energy per square meter than office spaces – with fume hoods and ultra-low-temperature freezers being two of the main energy-intensive culprits commonly found in labs.
While scientists may not be aware of the energy and
water usage that their research warrants, they are likely to
notice their consumption of materials, such as single-use
plastics in biological labs. It is estimated that laboratories worldwide generate approximately 5.5 million metric tons of plastic waste each year – the equivalent of filling 67 cruise liners.
The University of Colorado Boulder (CU Boulder) Green
Labs Program, founded by Kathryn Ramirez-Aguilar in
2009, has found ways to effectively engage scientists and
lab personnel in sustainable practices.
“While I was working in labs, I began to wonder if my
research was truly helping more than it was hurting
because of the large resource consumption. For the
most part… solutions had not yet been developed to help
researchers take action to reduce the environmental
footprint of their research. I left the lab bench and gained
the support of CU Boulder to start the program to try to
address this issue,” said Ramirez-Aguilar.
The CU Boulder Green Labs Program aims to reduce
the consumption of multiple resources, including
energy, water, materials and hazardous chemicals in the
university’s laboratories. It also advocates for the efficient
and effective use of research equipment and lab space.
“Over the years, scientists have repeatedly expressed
interest in diverting their waste streams from the landfill,”
noted Ramirez-Aguilar. This strong interest from researchers has fueled demand for eco-friendly plastic labware, which some companies are stepping up to address.
Since the beginning, the CU Boulder Green Labs
Program has also focused on enabling discussions with
scientists on other key topics related to efficiency,
including energy and water savings, and in more recent
years, efforts have included the benefits of sharing
equipment and the importance of optimized use of
laboratory space.
Sharing instruments saves resources
“When scientists choose to share research equipment
between labs then there are fewer instruments to
purchase and maintain with research funding, thus saving
researchers’ money. At the same time, less electricity
is consumed and it’s a more efficient use of lab space –
which is expensive space to build and particularly energy-intensive because of ventilation needs," said Ramirez-Aguilar.
Figure 1: Key reasons why managed, shared research equipment benefits institutions. Adapted from a figure created
by CU Boulder. Credit: Technology Networks.
She continues: “If facilities with directors are set up to
manage the shared research equipment and users, then
there is also significant time savings (which also equates
to financial savings) and other benefits to be realized
by researchers because now there is a knowledgeable
director to help researchers with training and
troubleshooting problems.”
In 2018, CU Boulder launched the BioCore Facility to streamline laboratory equipment sharing for three science departments on the campus. The facility manages 90 shared pieces of equipment, and more than 60 researchers across 18 laboratories use its services. Since the program began, it has led to savings of approximately USD 3 million, attributed to the sharing and redistribution of resources.
Many other universities and research institutes have implemented similar initiatives. For example, the University of Cambridge's Equipment Sharing Project allows staff and students to share more than 4,000 items across several universities, including the University of Oxford, Imperial College London, University College London and the University of Southampton.
Identifying quick wins to cut
emissions
Lisa O’Fee, sustainability advisor at The Institute of Cancer Research (ICR), reiterates the importance of initiatives similar to those implemented at CU Boulder. She is working to embed sustainability in everything the ICR does, in its mission to “defeat cancer”. Launched in December 2022, the ICR sustainability action plan “Sustainable Discoveries” sets out how the ICR will respond to the environmental crises we face, such as climate change and biodiversity loss. The ICR is committed to achieving net zero by 2040, with an interim reduction of 42% in carbon emissions by 2030.
She describes some “quick wins” to reduce energy and
carbon emissions: “Waste audits can identify if the
correct waste segregation process is being adhered to and
raise awareness. It’s important to look for opportunities
for improvement, for example, new recycling routes
and improved signage/training of staff. Sustainable
procurement training equips scientists with the
knowledge to select sustainable products – those that can
be reused, recycled, or are manufactured from recycled
content.”
Energy monitoring to identify the pieces of equipment that consume the most power is a good way to raise awareness and reinforce switching equipment off where applicable.
“Traffic light-coded switch-off stickers are also a good
way to prompt scientists,” noted O’Fee.
She highlights one of the two energy-intensive culprits
mentioned above – the ultra-low temperature freezer:
“Good practice in cold temperature storage is key to
reducing energy consumption within the lab.”
Programs like the International Laboratory Freezer
Challenge promote optimal use and upkeep of cold
storage equipment. This contest, organized by two
nonprofit entities – the International Institute for
Sustainable Laboratories (I2SL), where Ramirez-Aguilar serves on the board, and My Green Lab – is free
to enter and is designed to encourage laboratories to
adopt best practices. Labs are scored on areas including:
preventative maintenance, materials management,
temperature tuning, retirements and upgrades, and
cutting-edge practices.
The 2023 Freezer Challenge, in which almost 2,000 labs
participated worldwide, resulted in an energy saving of
20.7 million kWh, equivalent to approximately 14,663
metric tons of CO2, which is over twice the CO2 savings
achieved in the previous year’s challenge. Based on
the United States Environmental Protection Agency’s
Greenhouse Gas Equivalencies Calculator, this amount
is equivalent to offsetting greenhouse gas emissions from
driving 36.9 million miles in an average gasoline-powered
passenger vehicle or the annual CO2 emissions from
2,854 homes’ electricity use.
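The conversion behind these equivalencies is straightforward arithmetic. Below is a back-of-the-envelope check, assuming a grid emission factor of roughly 7.09 × 10⁻⁴ metric tons CO2 per kWh (a value EPA equivalency calculations have used; the exact factor behind the reported 14,663 t may differ slightly).

```python
# Back-of-the-envelope check of the Freezer Challenge figures above.
# The emission factor is an assumption, approximating the one used by
# the EPA Greenhouse Gas Equivalencies Calculator.

energy_saved_kwh = 20_700_000       # 20.7 million kWh saved in 2023
tons_co2_per_kwh = 7.09e-4          # approximate US grid emission factor

tons_co2 = energy_saved_kwh * tons_co2_per_kwh
print(f"Approximate CO2 avoided: {tons_co2:,.0f} metric tons")
# ~14,700 metric tons, consistent with the ~14,663 t reported above
```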
Calculating carbon emissions
Laboratories consume a significant amount of energy, so
decreasing energy consumption can lead to proportional
reductions in CO2 emissions. But, considering the
diverse range of emissions associated with equipment,
consumables and supply chain activities (Figure 2), it can
be difficult to know how best to calculate, assess and alter
a lab’s carbon footprint.
“It is much easier to calculate carbon emissions from
Scope 1 and 2. Scope 3 emissions that are associated
with purchased goods and services are more difficult to
calculate and for the most part are calculated on a spend
basis. A hybrid methodology is a much more accurate
way to calculate emissions for the majority of Scope 3,”
explained O’Fee.
Resources, such as the Greenhouse Gas Protocol’s
Technical Guidance for Calculating Scope 3 Emissions,
are designed to help facilities evaluate their Scope 3
emissions.
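A minimal sketch of the difference between the two approaches O’Fee describes, assuming invented categories, factors and supplier figures: on a spend basis, emissions are estimated as spend multiplied by a category emission factor; a hybrid methodology prefers supplier-specific (activity) data wherever it exists and falls back to the spend basis otherwise.

```python
# Illustrative sketch of spend-based vs. hybrid Scope 3 estimation.
# All factors and figures below are invented for the example.

# Spend-based factors: kg CO2e per GBP spent, by purchasing category.
SPEND_FACTORS = {"plastic consumables": 0.8, "lab chemicals": 1.2}

def spend_based(category: str, spend_gbp: float) -> float:
    """Estimate emissions from spend alone."""
    return spend_gbp * SPEND_FACTORS[category]

def hybrid(category: str, spend_gbp: float,
           supplier_kg_co2e: float | None = None) -> float:
    """Prefer supplier-reported (activity) data; fall back to spend basis."""
    if supplier_kg_co2e is not None:
        return supplier_kg_co2e
    return spend_based(category, spend_gbp)

# Purely spend-based estimate for GBP 10,000 of plastic consumables:
print(spend_based("plastic consumables", 10_000))               # 8000.0 kg CO2e

# Hybrid: a supplier has disclosed actual product-level emissions:
print(hybrid("plastic consumables", 10_000, supplier_kg_co2e=6_400.0))
```

The spend basis is easy to apply across an entire purchase ledger, which is why it dominates Scope 3 reporting, but it responds only to how much is spent, not to what is bought; the hybrid approach rewards choosing lower-carbon products.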
Adopting a life cycle thinking
approach to lab equipment
While sharing and maintaining equipment is important,
when it’s time to replace instruments, it’s vital to consider
how energy-efficient they are. O’Fee believes researchers
shouldn’t be afraid to engage with equipment suppliers
and challenge them about the sustainability practices
associated with their products.
She offers the following advice: “You could put together
a sustainable supplier list and an accompanying
questionnaire for the suppliers, requesting information
on their carbon emission data, responsible procurement
policy, and if they have any procurement framework or
standard that they work to, for instance, EcoVadis or
ISO 20400. Increasingly suppliers of lab consumables
like plastics are giving information as to the type of plastic
and amount within a product.” Using the associated
emission factors, it is therefore possible to calculate the
amount of carbon emitted.
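As a worked illustration of that calculation (the product masses and the polypropylene emission factor below are invented for the example, not published values):

```python
# Illustrative calculation: supplier-disclosed plastic mass x emission factor.
# The masses and the factor are assumptions, not published data.

mass_kg_per_box = 0.5        # plastic content declared by the supplier
boxes_purchased = 200
kg_co2e_per_kg_pp = 2.0      # illustrative cradle-to-gate factor, polypropylene

emissions = mass_kg_per_box * boxes_purchased * kg_co2e_per_kg_pp
print(f"{emissions:.0f} kg CO2e from this consumable line")  # 200 kg CO2e
```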
There are clear connections between resource use efficiency in scientific research and cost savings or cost avoidance. Financial benefits can be achieved for both researchers and institutions.
Figure 2: Scope 1, 2 and 3 emissions. Credit: Technology Networks.
“If lab members are buying more energy-efficient
equipment or conducting their research in a way that
reduces the use of energy or water, then there will be less
energy and water consumption for the institutions to pay
for,” noted Ramirez-Aguilar.
The CU Boulder Green Labs Program works with
scientists on financial incentives. Funding can be applied
towards the cost of research equipment purchases if
efficient equipment options exist and if lab members are
selecting efficient equipment.
Ramirez-Aguilar elaborates: “We have been able to
obtain funding to contribute toward the purchase of top
energy-efficient ultra-low temperature (ULT) freezers
and energy-efficient biosafety cabinets, the addition of
eco-modes to glove boxes and the purchase of waterless
condensers to use in chemical synthesis.”
Promoting sustainable practices
and sharing lessons learned
The insights and initiatives highlighted in this article
underscore a pivotal shift towards sustainable laboratory
practices. By embracing shared equipment, diligent
maintenance and strategic energy management,
laboratories not only reduce their carbon footprint
but also pave the way for a future where research is
synonymous with resource mindfulness.
The journey towards sustainability shouldn’t be focused
solely on reducing emissions or saving costs – it’s also
important to cultivate a mindset where every decision,
from equipment procurement to daily operations, is made
with sustainability at its core.
“We are not limited by the number of engaged laboratory
scientists that want to do more. After all, I have found that
researchers care and typically go into science because
they want to contribute to society or the world in a way
that will have a positive impact,” concluded Ramirez-Aguilar.
USEFUL RESOURCES
1. Million Advocates for Sustainable Science
2. Freezer Challenge
3. International Institute for Sustainable Laboratories Best
Practices
4. International Institute for Sustainable Laboratories
Working Groups
5. International Institute for Sustainable Laboratories
Sustainable Lab Awards
6. My Green Lab
7. Greenhouse Gas Protocol Technical Guidance for
Calculating Scope 3 Emissions
ABOUT THE INTERVIEWEES:
Kathryn Ramirez-Aguilar completed her PhD in analytical
chemistry in 1999. She gained 15 years of research experience
before shifting her focus away from the bench, dedicating her
efforts toward enhancing the environmental sustainability of
scientific research and addressing its influence on climate change
more broadly. As well as managing the CU Green Labs Program
at the University of Colorado Boulder, she serves on the board of
the International Institute for Sustainable Laboratories (I2SL), acts
as chair of the I2SL University Alliance Group (UAG), and heads
the Bringing Efficiency to Research Grants initiative under the
I2SL UAG, aiming to integrate efficiency and sustainability into US
research funding.
Lisa O’Fee, a biochemist specializing in drug discovery with a focus
on oncology, has been part of the Institute of Cancer Research
(ICR) since 2013. In 2023, Lisa transitioned from the ICR’s Division
of Cancer Therapeutics to assume the role of sustainability
advisor at the institute. In her new capacity, she has been pivotal
in developing and implementing the institute’s sustainability
strategy. She coordinates several sustainability initiatives at the
ICR, including the freezer challenge and My Green Lab certification
for the laboratories.
The Adoption of AI: Critical Concerns in the Life Sciences
Anna MacDonald
Credit: iStock
Artificial intelligence (AI) is continuing to expand
and develop across many industries, including the life
sciences. A survey conducted by the Pistoia Alliance at
its annual conference uncovered that most life sciences
experts acknowledge AI’s potential, but concerns remain
about its trustworthiness and implementation at scale.
Technology Networks spoke with Dr. Becky Upton,
president of the Pistoia Alliance, to learn more about the
key findings of the survey and their significance. In this
interview, Dr. Upton also discusses some of the main
hurdles preventing the wider adoption of AI in pharma
and how these challenges may be overcome.
Anna MacDonald (AM): Can you give
us an overview of the aims of the survey
conducted at the Pistoia Alliance
conference and the key findings?
Becky Upton (BU): The Pistoia Alliance conducted
its recent survey to gauge the extent of AI adoption in
the life sciences sector. Our conference was the perfect
place to conduct this research, gathering nearly 300
experts from across the pharmaceutical, technology and
regulatory fields.
Key findings revealed that while 70% of life sciences
experts acknowledge AI’s potential, many struggle
with its implementation at scale, largely due to issues
like data integrity and interoperability. The survey also
uncovered concerns about the trustworthiness of AI.
A notable number (63%) of respondents expressed
worries that poor data quality could lead to incorrect
conclusions and even potentially harmful clinical
decisions.
These insights underscore the urgent need for more industry-wide collaboration to address these challenges, including by standardizing data practices and supporting the ethical use of AI.
The survey results are instrumental in shaping our
ongoing and future initiatives aimed at fostering
innovation through collaboration.
AM: Did any of the survey findings
surprise you?
BU: The survey findings also echo trends uncovered
by the Alliance’s Lab of the Future (LoF) research from
September 2023. The LoF report equally cited issues
related to data management, including once again that
low-quality and poorly curated datasets are the number
one barrier (58%) to implementing AI. Privacy and
security concerns around data were also raised as a
challenge by 34% of respondents in the report.
It’s no surprise to see these issues still persist more
than half a year later. While AI technology is advancing
rapidly – with iterations to consumer models like
ChatGPT dropping monthly – it will take longer for
the highly regulated life sciences industry to adopt AI
at scale. Working habits must change, and the industry
must ensure AI is being adopted ethically and safely,
with relevant regulatory guidance in place.
Ultimately, we are dealing with patients’ health, so
decisions about the technology’s use cannot be taken
lightly. This is why bodies like the Pistoia Alliance are so
important – to create a shared space to tackle challenges
related to emerging technologies together and to ensure
that collaboration delivers.
AM: What are the main hurdles that
are preventing wider adoption of AI in
pharma?
BU: The adoption of AI in pharma is hindered by
a combination of technical and cultural challenges.
Underpinning the successful application of AI is
the need to first establish a robust foundational
data backbone. Yet, data quality and management
clearly remain a significant barrier in the life sciences
industry. Many organizations struggle with data
silos, unstructured data and a lack of metadata
standardization, all of which prevent data from being
easily accessible and interoperable. This fractured data
environment slows down workflows and diminishes the
potential benefits of AI.
Cultural resistance to new approaches and to sharing
expertise is another hurdle. Many companies have
been using the same processes and toolsets for years,
so there is often pushback to shaking workflows up by
introducing AI, even if overall it would speed up R&D.
This is why it’s so important to build user-friendly tools
and for technology champions to share best practice
across their organization and the industry as a whole.
One final barrier is the lack of proven business cases for
AI adoption. Senior stakeholders often need tangible
evidence of AI’s benefits to justify further investment,
such as time saved or the number of new drug targets
identified. Without clear success stories, it’s challenging
to secure the necessary buy-in for further AI programs
from decision-makers and budget-holders.
AM: Can you tell us about some of
the projects that the Pistoia Alliance
has launched and how they will help
to address some of the challenges
highlighted?
BU: Since its inception, the Pistoia Alliance has
been fueling the successful adoption of emerging
technologies and acting as champions of science
through its portfolio of projects and communities. The
Alliance is committed to ensuring collaboration delivers
tangible outcomes – from producing best practice
guides to developing new data models and standards.
Here are just a few of our initiatives.
AI Community of Experts: This community provides
a safe space for organizations to share ideas and best
practices using our pre-competitive legal framework.
The community is looking to develop dedicated projects in several important areas – including AI in regulatory compliance, ethics in AI, information security for AI and benchmarking for AI models – and recently established a new project on large language models in biological R&D.
Identification of Medicinal Products (IDMP)
Ontology: The Alliance created a common data model
in collaboration with Bayer, Novartis, GSK, Roche,
Merck KGaA, Boehringer Ingelheim, Johnson &
Johnson, AstraZeneca, Amgen, AbbVie and Pfizer. This
ontology sets data standards to improve substance
identification, cross-border prescriptions, regulatory
process integration, supply-chain analytics and
pharmacovigilance.
Clinical Trial Environmental Impact Community: In
partnership with more than a dozen Pistoia Alliance
members from pharma, service and technology
providers and the Sustainable Health Coalition, the Pistoia Alliance is developing a standard methodology for
measuring the carbon footprint of clinical trials. This
will allow companies to quantify the environmental
impact of their clinical development programs against an
industry-wide standard, including an evaluation of the
impact of the use of decentralized components.
AM: How does the Alliance plan to
strengthen ties with global regulatory
agencies?
BU: The Alliance plans to deepen our involvement with
regulatory agencies in our projects and communities,
to decrease the amount of friction in regulatory
submissions. The partnership is mutually beneficial;
via the Alliance, regulators can speak to multiple
companies under one roof about new standards and
processes for companies to adopt. At the same time,
pharma companies can share their existing standards
and processes, allowing regulators to identify potential
issues and requirements accordingly. This collaborative
approach saves time, money and, crucially, speeds up
the delivery of new treatments to patients.
We have already seen success with this strategy, notably
in our work with the FDA on the In Vitro Pharmacology
Group and Global Substance Registration System
(GSRS) Consortia.
AM: 70% of the life science experts
surveyed recognize AI’s potential. Did
the survey uncover reasons why the
remaining 30% do not share these views?
Is the Alliance planning to investigate this
further?
BU: Of the remaining respondents, 13% said that
practical AI applications in the life sciences industry are
limited, saying the technology is too theoretical at the
moment. This further underscores that stakeholders need tangible evidence of AI’s benefits to justify further investment.
On the other hand, 10% said that AI has already led
to a lot of breakthroughs in R&D. It’s true that some
areas have seen AI success, such as accelerating
existing workflows in small molecule discovery and lead
optimization for new drug candidates – shaving months
off time-intensive processes.
The final 7% said that only cutting-edge startups and
tech-led companies can use AI effectively, but we don’t
believe this is the case at the Alliance. Established
pharma companies possess substantial data science
expertise and a wealth of data that have the potential to
build robust AI models. The key lies in implementing the
right data standards, developing a clear plan for AI use
cases and actively collaborating with and learning from
the successes of other companies. Facilitating this level
of collaboration is exactly the reason why the Pistoia
Alliance exists.
ABOUT THE INTERVIEWEE:
Dr. Becky Upton was appointed as the Pistoia Alliance’s first
female President in June 2022. She is a long-time supporter of
pre-competitive collaboration in R&D and the critical role it plays
in advancing science. Becky currently leads the Pistoia Alliance’s
strategy, defining its future within areas of increasing importance
to the industry, such as data standards, emerging technologies,
diversity and inclusion, sustainability, and precision medicine.
Becky has worked across business development and scientific
services for companies including VWR (now part of Avantor) and
Pion. She has a PhD in Biochemistry from Imperial College and an
MBA from Cranfield University.
The Museum of Analytical Antiquities
Paul Smith and Bob McDowall
Credit: iStock
Most labs have them. Working analytical systems
purchased back in the mists of time but with obsolete
operating systems and/or instrument applications that
have not been updated since initial qualification and
validation. Welcome to your Museum of Analytical
Antiquities. We explore the reasons for this and suggest
ways to keep your analytical systems current and
compliant.
In this article we discuss why it is important to keep
current with software updates; it is complementary to
last year’s article on how to keep your LIMS current.1
Here, the focus is on analytical systems consisting of
an instrument controlled by an application installed
on a network or workstation for acquiring, processing,
reporting results and storing data. Typically (depending
on intended use), these are classified as United States
Pharmacopoeia (USP) <1058> Group C systems.2
We will look at the Museum of Analytical Antiquities
from two perspectives. The first is from the business
rationale and why software should be kept up to date,
and the second is from the regulatory expectations of
keeping current and meeting regulatory requirements.
These are your front-line analytical systems, and not
just those retained to read legacy data – although that is another discussion altogether.
The complete version of this article is available here.
Why do labs create a Museum?
In GMP-regulated laboratories, there is the requirement
to validate software to demonstrate that it is fit for its
intended use.3–7 This is perceived, incorrectly, as a major
bottleneck. So once validated, the company mantra is
DON’T TOUCH IT. The rationale is that revalidation
The Museum of Analytical
Antiquities
Paul Smith and Bob McDowall
Credit: iStock
TECHNOLOGYNETWORKS.COM
LAB OF THE FUTURE: TRENDS AND TECHNOLOGIES 32
is perceived as a major cost; as we will demonstrate in this article, this is not the case. The mantra results in laboratories failing to upgrade their application software and operating systems, as well as failing to fix known cybersecurity flaws that could be exploited if, for example, uncontrolled USB devices are used carelessly.8
Unlike a LIMS, which is a network solution, most
computerized analytical systems are designed for
standalone use and hence are not connected to the IT
infrastructure. Partly this is due to IT not wanting to
manage laboratory systems or laboratories not wanting
to be controlled by IT. Ideally, these systems should
be connected to the IT network at least for backup
and recovery, time synchronization and user account
management, etc. However, the usual situation is that
most sit in majestic isolation in your laboratory. This
provides some degree of protection from external
cybersecurity threats but introduces other competing
risks, like individual data backup/restore procedures
(especially if USB devices are used), as well as business
continuity, disaster recovery and data governance.
Instrument issues and
obsolescence
Hardware issues are one route by which systems become candidates for the Museum, as shown in Figure 1. If an analytical instrument is obsoleted by a supplier, the lack of replacement parts and consumables may prevent its continued use. If instrument firmware is not supported or updated, then software drivers may not work. In addition, if an old workstation fails, can its replacement run the outdated operating system, and will the old application support a new instrument?
The focus of this article is on software updates and the
consequences of not being current.
Software = change
Unlike an analytical instrument with moving parts
software does not wear out. However, software is not
static but changes over time for a variety of reasons until
it reaches its end of life, as shown in Figure 1:
1. Application enhancements: Adding new features
or functions for useability and better performance
driven mainly by market forces.
2. Compliance enhancements: Incorporation of
new or updated GXP regulatory compliance, data
integrity or pharmacopoeial requirements.
3. Error fixes: Fixing software errors in the application software through a specific hotfix or minor release, and keeping compatibility with technology.
4. Security fixes: Remediating cybersecurity vulnerabilities of an operating system, software component or application.
5. Outdated application or software: Obsolescence or discontinuation of the application software, operating system or middleware components with no supplier support, or a database that has reached the maximum size the older version can support. Instrument firmware can also be a cause of obsolescence, as failure to update firmware can prevent interfacing with the application software, and software drivers for the instrument may not be compatible with a new application.
Figure 1: Areas for failing to update an analytical computerized system. Credit: R D McDowall.
When faced with one or more of these situations, what should a laboratory do? Please note that the ostrich option is not one of the choices here.
Update of software
As noted above, software updates can occur for any of the reasons given in the previous section. Let us take them in turn:
1. Application enhancements: Depending on the business process being automated by the system, as well as the system’s intended use, some of these features may or may not be of interest in a release. It is important to read the release notes to see what is new in each version. Even when an upgrade has features that a laboratory wants to use, it is often not installed because the validation is perceived as too difficult, or because of the perception that the whole system must be revalidated. This will be discussed later in this article.
2. Compliance enhancements: For example, when a
Pharmacopeia mandatory general chapter changes,
the requirements will be incorporated in the latest
version of the software. Therefore, the new version
is required to comply with the Pharmacopeial
procedure. If you don’t upgrade, how will you
comply? A spreadsheet, like the ostrich option, is
not an effective solution.
3. Error fixes: If a laboratory is impacted by a
software error, it is advisable to install the hotfix to
resolve the problem. There may also be application
updates to keep compatible with changes in other
system components.
4. Security fixes: Vulnerabilities in operating
systems, databases and middleware are identified
and patches are released to prevent exploitation by
malware. The advice is to keep current to prevent
loss of data.
5. Outdated application or software: Application support is not just a tech-support phone call to help solve a problem, or support limited to fixing known bugs. Losing support may also mean losing the piece of paper that says the vendor supports your system, and the vendor may be unable to provide hardware qualification on a regular basis if the firmware/software set is not compatible with their qualification tool. This can be especially risky after instrument preventative maintenance (PM) is performed.
An additional factor not listed above is the contractual
agreement with an application supplier.3, 9 The quality of
the software support you get, from training to helping
you answer questions during a regulatory inspection,
can depend on your risk management process, the
supplier knowledge, their QMS and the infrastructure
of the company. Here, a laboratory pays for ongoing software support and, as part of the agreement, the application must be kept on the current or the previous major release; otherwise support lapses, or the company must upgrade to the latest version before support is provided.
Obsolescence of system
components
All components, including the instrument and the
computerized system, can be impacted by this. The
instrument supplier may decide that the instrument and/or the application has reached the end of its life and declare it obsolete. Ideally, the supplier will manage
this process well ahead of time through an EGS (end
of guaranteed support) framework. There may be
limited software support offered by a supplier, where
remediation of known bugs is the only option (for up
to two years) to give users the opportunity to move to
a new system. In some instances, the supplier may be
able to provide extended professional services such as
computerized system validation (CSV) to speed up the
implementation of updates or upgrades.
More importantly, the Windows 10 operating system becomes obsolete in October 2025, after which there will not be any updates unless you pay for them – and that option will last only three years.
What are you going to do? Again, the ostrich option
is not very helpful. The scale of recent global
cybersecurity attacks has highlighted that some of
the most at-risk organizations are those managed by
governments – where Windows XP or other obsolete
operating system software may still be in use!
Regulatory expectations and
keeping current
The formal title of the FDA manufacturing regulations is Current Good Manufacturing Practice for Finished Pharmaceuticals, or cGMP. The focus here
is on the word current. There is an explanation of the
term in preamble comment 17 in the Federal Register of
September 197810 where the GMPs were published, but
a better explanation is found on the FDA’s website11:
Accordingly, the “C” in CGMP stands for “current,”
requiring companies to use technologies and systems that
are up-to-date in order to comply with the regulations.
Systems and equipment that may have been “top-of-the-line” to prevent contamination, mix-ups, and errors 10
or 20 years ago may be less than adequate by today’s
standards.
Note, the last sentence: obsolete systems may be less
than adequate by today’s standards.
Those working in Europe who do not make products
for the US market should stop laughing now as an
equivalent regulation is found in the European Directive
2001/83/EC Article 23 §112:
… the authorisation holder must, in respect of the methods
of manufacture and control provided for in Article 8(3)(d)
and (h), take account of scientific and technical progress
and introduce any changes that may be required to enable
the medicinal product to be manufactured and checked by
means of generally accepted scientific methods.
So, both FDA regulations and the EU directive state that companies must keep current, and this includes computerized systems. How, then, will laboratories handle questions from an inspector about systems that are not current or up to date? Two possible root causes are a company policy of not upgrading systems and inadequate supplier support. Here we can look at periodic reviews of computerized systems.
Role of periodic reviews
To ensure that computerized systems remain in a
validated state, EU GMP Annex 11 clause 11 requires
periodic evaluations:
Computerised systems should be periodically evaluated to
confirm that they remain in a valid state and are compliant
with GMP. Such evaluations should include, where
appropriate, the current range of functionality, deviation
records, incidents, problems, upgrade history, performance,
reliability, security and validation status reports.3
Whilst not explicitly stated, obsolescence of software components is part of a periodic review: a system not maintained by software updates may have security, backup and recovery flaws, and its data may be vulnerable to loss from malware.
In the unlikely event that there have been several
software changes during the period covered by the
review, there should be an assessment of the cumulative
impact of the changes on the system.
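One way to make such reviews systematic is to track each Annex 11 clause 11 topic explicitly for every system. The sketch below is purely illustrative – a hypothetical structure, not a validated record format, and it carries no regulatory standing.

```python
# Illustrative sketch: tracking Annex 11 clause 11 topics per system review.
# The structure is an assumption for illustration, not a compliant record.

from dataclasses import dataclass, field

ANNEX_11_TOPICS = [
    "current range of functionality", "deviation records", "incidents",
    "problems", "upgrade history", "performance", "reliability",
    "security", "validation status",
]

@dataclass
class PeriodicReview:
    system_name: str
    findings: dict[str, str] = field(default_factory=dict)

    def outstanding_topics(self) -> list[str]:
        """Topics from Annex 11 clause 11 not yet assessed in this review."""
        return [t for t in ANNEX_11_TOPICS if t not in self.findings]

review = PeriodicReview("HPLC-03 CDS")
review.findings["upgrade history"] = (
    "No vendor updates applied since 2019 - assess obsolescence risk.")
print(review.outstanding_topics())   # the eight topics still to be assessed
```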
However, PIC/S PI-041, in section 9.3.5, states that as part of a periodic review, data integrity limitations of software should be addressed in a timely manner.13
Given the extremely close alignment of PIC/S and EU
GMP, this may be an addition to the draft Annex 11
update although it is not mentioned explicitly in the
2022 proposal.14
Equally, as hardware ages, the bathtub curve shows that the failure rate increases, and the system and the data it holds become more vulnerable. Increasingly, instrument and application suppliers can provide some level of instrument monitoring service, so that instruments that break down more often are prioritized for replacement. Following Sod’s Law, any such failure will always come at the worst possible time. Therefore, one role of a periodic review should be to highlight to management the risks posed by obsolete systems – a task simplified by a monitoring service.
Example GMP regulatory citations for obsolete systems can be found in the complete article.
Role of the supplier
A key support role for any supplier of analytical systems
is to provide accurate and reliable release notes for each
version or hotfix that they issue, e.g., new features and
bug fixes. The aim is to inform users of changes and their
impact on the system: e.g., is a change localized or does
it impact the entire system? This information permits any laboratory to decide whether or not to upgrade, and also the level of revalidation required, from nothing to a full
system revalidation. The process owner is accountable
for the decision and extent of validation in conjunction
with the validation team and QA. Typically, for most
laboratory computerized systems, these updates are
not frequent unless there is a critical software bug that
requires urgent resolution.
This is unlike a SaaS LIMS, where the software is upgraded every 3–4 months whether you like it or not, as this is in your supplier contract.1
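The release-note triage described above can be sketched as a simple decision rule: classify each change by its impact and let the broadest impact set the level of revalidation. The categories and mapping below are illustrative assumptions, not taken from GAMP 5 or any regulatory text.

```python
# Illustrative risk-based triage of a software release.
# Categories and the impact-to-effort mapping are assumptions.

IMPACT_TO_VALIDATION = {
    "none": "document assessment only, no testing",
    "localized": "targeted regression testing of affected functions",
    "system-wide": "full system revalidation",
}

def revalidation_level(release_note_items: list[dict]) -> str:
    """Pick the validation effort implied by the worst impact in a release."""
    order = ["none", "localized", "system-wide"]
    worst = max((item["impact"] for item in release_note_items),
                key=order.index, default="none")
    return IMPACT_TO_VALIDATION[worst]

notes = [
    {"change": "bug fix in peak integration", "impact": "localized"},
    {"change": "updated help files", "impact": "none"},
]
print(revalidation_level(notes))  # targeted regression testing...
```

In practice, as the article notes, the process owner remains accountable for this decision, in conjunction with the validation team and QA; a rule like this only structures the assessment.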
Other topics discussed in detail in the complete article are:
• Application upgrades
• Application replacement
• Operating system obsolescence
• Does your old app run on Windows 11?
• Revalidating the system
Summary
As noted in the summary of the LIMS article, it is much easier to handle small incremental upgrades than the major projects that result if upgrades are left too long.1 The same applies to computerized analytical systems.
Many citations for older systems focus on their failure
to have audit trail functionality. With the update of Annex 11, audit trail functionality will no longer be merely a regulatory expectation but mandatory. To avoid potential technical
and regulatory issues, start upgrading the systems in
your Museum of Analytical Antiquities NOW.
In October 2025 Windows 10 will be obsolete. Many
laboratory systems run on this operating system – what
will you and your supplier do? This appears as déjà vu
all over again as there are still systems with Windows
XP and Windows 7 that are currently entries in the
Museum.
Will your Museum of Analytical Antiquities increase in size?
ACKNOWLEDGEMENTS
We would like to thank Mahboubeh Lotfinia, Mike Korbel,
Simon Mitchell and Steffan Nielsen for their review and
comments in preparation of this article.
REFERENCES
1. McDowall RD. How To Future-Proof Your LIMS: Handling
Software Updates. Technology Networks. https://www.technologynetworks.com/tn/how-to-guides/how-to-future-proof-your-lims-handling-software-updates-370757.
Published March 3, 2023. Accessed September 5, 2024.
2. United States Pharmacopeia. General Chapter <1058> Analytical Instrument Qualification. USP-NF. Rockville, MD: United States Pharmacopeia. Published 2024.
3. European Commission. EudraLex - Volume 4 Good
Manufacturing Practice (GMP) Guidelines, Annex 11
Computerised Systems. European Commission: Brussels.
Published June 30, 2011.
4. Food and Drug Administration. FDA Guidance for Industry: General Principles of Software Validation. 2002, Rockville, MD.
5. 21 CFR Part 11; Electronic Records; Electronic Signatures; Final Rule. Federal Register, 1997. 62(54): p. 13430-13466.
6. Good Automated Manufacturing Practice (GAMP) Guide
5, Second Edition. 2022, Tampa, FL: International Society
of Pharmaceutical Engineering.
7. GAMP Good Practice Guide A Risk Based Approach
to GXP Compliant Laboratory Computerised Systems,
Second Edition 2012, Tampa, FL: International Society for
Pharmaceutical Engineering.
8. RD McDowall. Consigning SneakerNet to the Graveyard of
Technology. Spectroscopy. 2021. 36(4): p. 14-17.
9. EudraLex - Volume 4 Good Manufacturing Practice
(GMP) Guidelines, Chapter 7 Outsourced Activities. 2013,
European Commission: Brussels.
10. 21 CFR 211 - Current Good Manufacturing Practice for
Finished Pharmaceuticals, in Federal Register. 1978. p.
45014 - 45089.
11. Facts About the Current Good Manufacturing
Practices (CGMPs). FDA. https://www.fda.gov/drugs/pharmaceutical-quality-resources/facts-about-current-good-manufacturing-practices-cgmp. Published 2021.
Accessed September 5, 2024.
12. European Union Directive 2001/83/EC on Medicinal
Products for Human Use. 2001, European Commission:
Brussels.
13. PIC/S PI-041 Good Practices for Data Management and
Integrity in Regulated GMP / GDP Environments. 2021,
Pharmaceutical Inspection Convention / Pharmaceutical
Inspection Cooperation Scheme: Geneva.
14. Concept Paper on the Revision of Annex 11 of the
Guidelines on Good Manufacturing Practice for
Medicinal Products – Computerised Systems. 2022;
Available from: https://www.ema.europa.eu/en/documents/regulatory-procedural-guideline/concept-paper-revision-annex-11-guidelines-good-manufacturing-practice-medicinal-products_en.pdf.
CONTRIBUTORS
Alex Beadle
Alexander is a science writer and editor
for Technology Networks. He writes news and
features for the Applied Sciences section,
leading the site’s coverage of topics relating to
materials science and engineering. He holds a
master’s degree in Materials Chemistry from the
University of St Andrews, Scotland.
Anna MacDonald
Anna is a senior science editor at Technology
Networks. She holds a first-class honors degree
in biological sciences from the University of East
Anglia. Before joining Technology Networks she
helped organize scientific conferences.
Bob McDowall
Bob is an analytical chemist with over 50 years’ experience. He has been involved with process redesign and with specifying and implementing laboratory informatics solutions for over 40 years, and has nearly 35 years’ experience of computerized system validation in regulated GXP environments.
Kate Robinson
Kate Robinson is a science editor for Technology
Networks. She joined the team in 2021 after
obtaining a bachelor’s degree in biomedical
sciences.
Laura Lansdowne
Laura is the managing editor at Technology
Networks, she holds a first-class honors degree
in biology. Before her move into scientific
publishing, Laura worked at the Wellcome
Sanger Institute and GW Pharma.
Neil Versel
Neil is a healthcare and life sciences journalist,
specializing in bioinformatics, information
technology, genomics, patient safety, healthcare
quality and health policy. Versel has been covering
healthcare since 2000, across a wide range of
publications in the US, Canada and Europe.
Paul Smith
Paul is a laboratory compliance consultant
helping regulated laboratories understand
instrument qualification, data integrity and the
latest regulatory trends.
RJ Mackenzie
RJ is a freelance science writer based in Glasgow.
He covers biological and biomedical science, with a
focus on the complexities and curiosities of the brain
and emerging AI technologies. RJ was a science writer
at Technology Networks for six years, where he also
worked on the site’s SEO and editorial AI strategies.