Laboratory Informatics; Climbing the Slope of Enlightenment?
Article Aug 12, 2013
During the recent European Lab Automation 2013 (ELA 2013) conference in Hamburg, Germany, Technology Network's Informatics Editor Helen Gillespie sat down with John Trigg, Director of PhaseFour Informatics, to discuss the integrated lab. Trigg's presentation during the conference covered the concept of the e-Laboratory, or integrated laboratory. He stated that the e-Laboratory gets ever closer, particularly with electronic laboratory notebooks replacing their paper counterparts. However, a number of challenges remain before the potential of digital technologies to assist and extend scientific knowledge can be fully exploited. This article addresses these issues as well as what happens after electronic lab notebook (ELN) implementation.
TN: During your ELA 2013 presentation, you pointed out that there is a difference between lab automation and lab informatics. Can you explain the difference for TN readers?
Trigg: There’s no hard and fast rule about this, and sometimes there’s a certain amount of ambiguity in the way the terms Lab Automation and Lab Informatics are used. I like to distinguish between them on the basis that Laboratory Automation refers to the use of technologies to streamline or substitute manual manipulation of equipment and processes, whereas Laboratory Informatics generally refers to the application of IT systems to the handling of laboratory data and information as well as the optimization of laboratory operations. In other words, Laboratory Automation deals largely with the real-time control of equipment and the acquisition and processing of data, whereas Laboratory Informatics focuses on the on-going management of laboratory data and information--typically by multi-user systems such as LIMS, SDMS and ELNs.
TN: How is the increasing complexity of science affecting the trend toward convergence?
Trigg: The boundaries between the different scientific disciplines have been getting increasingly blurred for a number of years. This has been particularly evident in the Life Science industries, and increasingly so in other domains. As we probe deeper and deeper in developing our understanding of our existing and future products, processes and services, each of the scientific disciplines has a part to play. It is necessary to take a holistic, multi-disciplinary view not only of the purpose and function of the product, process, or service, but also of its impact on health, safety and the environment. In the laboratory, this generates a demand on our automation and informatics tools to be integrated and used across all of the disciplines in order to correlate all data relating to the product life cycle. It also requires the tools to provide the means to interrogate, analyze and visualize the data to help develop better scientific understanding and better decision-making. Operating in discipline-specific silos is not a feasible option anymore.
TN: Most labs create a comprehensive Functional Requirements Specification (FRS) prior to selecting an ELN. But what happens after the choice has been made and implemented? What impacts can the lab expect from their ELN implementation?
Trigg: I like to think in terms of three specific aspects of the Functional Requirements Specification in the ELN selection process. The first of these is the underlying business requirement – what are the high-level objectives? In other words, what's the problem we're trying to solve, and how is an ELN going to solve it? Typically included at this level are costs (how much are we prepared to pay to solve the problem, and what's the ROI?); what factors we need to consider in terms of regulatory compliance, legal and corporate governance; and what IT constraints will apply.
The second level of requirements relates to defining the specific operational functions that will enable us to meet the business need; i.e., the system functionality that will enable us to solve the problem. The third aspect is the user requirements, which primarily address ease of use and usefulness, in other words, how the ‘system’ should work to ensure user acceptance. Although it is normal to include user requirements within the functional requirements, there is a benefit in keeping them logically separate in order to ensure that ‘ease of use’ and ‘usefulness’ are criteria that are judged by users, rather than by the project team.
Getting the FRS right is the primary focus in the selection process, but then the implementation strategy becomes the critical factor. Rolling out an ELN will, to some extent, impact the way people work. It will represent a cultural and operational change in the laboratory that may not be universally welcomed. Nevertheless, the outcomes–assuming a successful deployment–will almost certainly bring productivity benefits, primarily through the elimination of unproductive, paper-manipulation processes, but also through better opportunities for sharing data and information.
TN: What kind of benefits do most labs receive as a result of ELN implementation? Are these actually realized or is there a high point of failure?
Trigg: Let’s deal with failure first. It’s always difficult for anyone to own up to failure, and in the main there are probably hardly any catastrophic failures in ELN deployments. However, there is always a risk that something can go wrong, but in practice, most issues that arise tend to be change management concerns. Getting user acceptance is a key success factor for almost every ELN project, and as such, the ‘people’ issue is one that usually gets a lot of attention during deployment.
On the benefits side, most ELN deployments claim productivity gains in the region of 10%–20%. However, this tells us nothing about the quality of the science. There is often a good deal of anecdotal evidence about time savings, just by having easy access to the accumulated information in the ELN database. Being able to search the database and connect with colleagues/co-workers is considerably easier than in the world of paper notebooks.
Centralizing the notebooks in a system also makes it easy to extract management reports. This means that a number of simple metrics about laboratory performance can be accessed, such as the number of experiments, experiment durations, time taken to witness and sign, etc. As a consequence, throughput becomes a key metric, with a strong emphasis on laboratory productivity. But as lab automation and lab informatics take us closer to an asymptotic limit in terms of productivity, we should be looking to the tools to enable us to deliver better science. It is quite common to find ‘soft’ objectives associated with an ELN requirements specification, such as ‘improving knowledge management (KM)’. Unfortunately it is a challenge to get any quantification in terms of KM performance; as far as KM is concerned, technology is a big part of the problem, but a small part of the solution.
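The kind of management reporting described above is straightforward once notebook records live in a central database. As a minimal sketch — assuming a hypothetical record structure with start, completion and sign-off dates, since no specific ELN schema or API is named in the interview — the throughput metrics Trigg mentions (experiment count, durations, time to witness and sign) could be derived like this:

```python
from datetime import date
from statistics import mean

# Hypothetical ELN records; the field names are illustrative only,
# not taken from any real ELN product's schema.
experiments = [
    {"id": "EXP-001", "started": date(2013, 5, 1),
     "completed": date(2013, 5, 3), "signed": date(2013, 5, 6)},
    {"id": "EXP-002", "started": date(2013, 5, 2),
     "completed": date(2013, 5, 8), "signed": date(2013, 5, 9)},
]

def lab_metrics(records):
    """Simple throughput metrics of the kind a centralized ELN can report."""
    durations = [(r["completed"] - r["started"]).days for r in records]
    sign_off = [(r["signed"] - r["completed"]).days for r in records]
    return {
        "experiment_count": len(records),
        "mean_duration_days": mean(durations),
        "mean_sign_off_days": mean(sign_off),
    }

print(lab_metrics(experiments))
# → {'experiment_count': 2, 'mean_duration_days': 4.0, 'mean_sign_off_days': 2.0}
```

The point of Trigg's caveat holds here too: everything this report measures is throughput; nothing in it speaks to the quality of the science.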
TN: How about integration issues - what are the biggest obstacles the lab faces when integrating ELNs with the rest of the lab's electronic infrastructure?
Trigg: For a lot of labs, the introduction of an ELN represents a significant step towards becoming an all-electronic or paperless lab. The term ‘paperless’ can be a bit misleading, since the real objective for most labs is to become a fully integrated environment in which modern tools and technologies are deployed to improve lab efficiency by the provision of seamless integration of systems and searchable repositories of data of proven integrity, authenticity and reliability. The major obstacles to achieving this can be attributed to two areas: firstly culture, and secondly technology. The culture issue is largely one of change management; taking away a paper lab notebook from scientists and replacing it with an electronic tool is not always a popular move, despite the clear benefits to the business in doing so. There has always been a strong sense of personal ownership associated with the traditional paper notebook, whereas an ELN advocates a sense of openness and sharing as beneficial criteria. Getting users through the change process will always be one of the first barriers to be overcome.
In terms of technology, there are two factors to take into account. Firstly, technologies continue to evolve at a much faster rate than our businesses can easily accommodate. Consumer technologies are often significantly more advanced than the technologies we use in our labs, often for good reasons. The constraints placed on laboratories by business requirements and regulatory compliance will always drive the need for proven, stable and robust technologies; we cannot afford to take risks. However, our personal experiences of consumer technologies, particularly in the sphere of communications, sharing and collaboration are often far in advance of what we experience in the lab, and this can cause some frustration with the functional limitations within the lab’s IT infrastructure.
The second area is that of connectivity. The opportunity for ELNs, LIMS or SDMS to serve as the ‘informatics hub’ in a fully integrated environment is somewhat challenged by the lack of laboratory data interchange and communication standards. This issue has been debated for years, and continues to be debated without any universal progress being made. It’s a complex issue, and one line of thought is that despite the various initiatives undertaken over the years to resolve it, it’s probably too late to find a universal solution. The industry is dependent on third-party integrators, custom code or single-vendor solutions to address the needs. This is in sharp contrast to the trend we see in consumer technologies, and there is much we can learn, particularly from the way social tools address communication and collaboration, that would facilitate better integration of laboratory tools in support of knowledge management objectives.
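In the absence of a universal interchange standard, labs and integrators typically fall back on ad-hoc export formats agreed between the systems being connected. A minimal sketch of that pragmatic approach — the record structure and field names here are invented for illustration, not drawn from any standard or vendor format — might serialize an experiment record to JSON for transfer between an ELN and a LIMS:

```python
import json

# Illustrative experiment record; every field name here is an assumption
# agreed between the two systems, which is precisely the problem Trigg
# describes: each integration defines its own private schema.
experiment = {
    "id": "EXP-001",
    "instrument": "HPLC-02",
    "analyst": "j.smith",
    "results": [{"analyte": "caffeine", "value": 4.2, "unit": "mg/L"}],
}

payload = json.dumps(experiment, indent=2, sort_keys=True)  # exported by system A
restored = json.loads(payload)                              # imported by system B
assert restored == experiment  # round-trip works only within this one schema
```

The round-trip succeeds only because both ends share the same private schema; a third system would need its own mapping, which is why custom code and third-party integrators remain the norm.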