A New Year: 2013 Conferences Promise Detailed Look at Laboratory Trends and Directions
Article Feb 02, 2013
A new year, a new set of spring conferences with the opportunity to learn the latest details about new technology, new methodology, new directions for the lab. The year kicks off with the Society for Laboratory Automation and Screening (SLAS) conference in Orlando, FL from January 12-16, segues to the Molecular Medicine Tri-Con 2013 conference in San Francisco, CA from February 11-15, quickly followed by the Pittsburgh Conference, PITTCON 2013, in Philadelphia, PA from March 17-21. Each conference presents a different side of Informatics and a different focus on LIMS.
This year the SLAS 2013 and Tri-Con 2013 conferences host some very interesting Informatics sessions, in particular the sessions on Externalization and the Changing Landscape in Pharma. The chairperson for the upcoming Tri-Con 2013 session on externalization is Michael Elliot, CEO of Atrium Research & Consulting, a noted industry expert in Electronic Laboratory Notebooks (ELNs) and Informatics.
Technology Network's Editor Helen Gillespie asked Mr. Elliot to expand upon his premise that a de-evolution of informatics is occurring and how he thinks externalization is changing the informatics landscape.
TN: Can you expand upon what is meant by externalization and how you think it is affecting the lab?
Elliot: Externalization is the pursuit of a fully or partially virtualized research model integrating partner companies with specific capabilities. It is a sea change impacting traditional R&D processes.
Historically, the majority of research investment went to internally focused activities, while development was a mix of internal and external spending. Overall biopharma R&D spending is flat, while external spending is increasing by more than 20% per year, leading to a situation where, within 3-5 years, over half of discovery spending will be on outside services. Some partners are contractors that provide services at a lower cost than can be achieved internally, others bring scientific skills or technologies, and in other cases partners are tightly integrated with joint intellectual property ownership rights. A few organizations and/or project teams have fully virtualized; partners within the network fulfill all the required laboratory tasks, coordinated by a very small team.
This impacts the traditional model in many ways. Over the last ten years, companies have undertaken significant, almost Herculean, efforts to move to project team-based research, break down departmental barriers, and enhance collaboration. With externalization, you cannot just go down the hall to ask a team member a question, meet them in a conference room, or have them hand you the data you need at the water cooler. You have to work with people outside your organization who are spread across the globe, most of whom you have never met. This is changing the collaboration process people have worked so hard to put in place.
Systems and processes designed to support internal research, e.g. chemical registration, ELN, LIMS, and bioassay data management systems, may not support the architectural and security requirements of externalization. While internal processes were optimized for expedited results delivery, moving that work outside radically changes them, sometimes for the worse. For example, pharmacokinetic screening typically takes less than a week from compound delivery to results posting to a data warehouse, and project priorities can be changed mid-stream if needed. Externalized, this activity can take two weeks or more due to shipping logistics and the complexity of results transfer. While an analysis could come at a lower cost per test, the cycle time is actually longer. Data transfer can be complicated by the fact that the partner might have its own data management solution and formats, requiring semantic and format conversions. We know of organizations that had a pretty sophisticated, integrated internal environment and now receive PDFs, Excel files, and even paper from contractors - a devolution of their informatics infrastructure.
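The conversion problem described here - each partner exporting results under its own column names and formats - is at heart a field-mapping exercise. A minimal sketch in Python; the contractor names, column headings, and internal field names below are hypothetical, invented only to illustrate the idea:

```python
import csv
import io

# Hypothetical mapping from each contractor's column names to an
# internal vocabulary. Real integrations need one map per partner,
# plus unit and semantic checks that are omitted here.
FIELD_MAPS = {
    "contractor_a": {"Cmpd ID": "compound_id", "IC50 (uM)": "ic50_um"},
    "contractor_b": {"compound": "compound_id", "ic50_micromolar": "ic50_um"},
}

def normalize(contractor: str, raw_csv: str) -> list:
    """Translate one contractor's CSV export into the internal schema."""
    field_map = FIELD_MAPS[contractor]
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Keep only recognized columns, renamed to internal field names.
        rows.append({field_map[k]: v for k, v in row.items() if k in field_map})
    return rows

# Two differently formatted exports collapse to one internal record shape.
print(normalize("contractor_a", "Cmpd ID,IC50 (uM)\nX-1,0.25\n"))
print(normalize("contractor_b", "compound,ic50_micromolar\nX-1,0.25\n"))
```

Multiply this by twenty partners, each with evolving formats, and the scale of the integration burden Elliot describes becomes clear.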
TN: What is the benefit to the lab to pursue externalization?
Elliot: Externalization is primarily for the benefit of the organization as a whole, rather than just a specific laboratory. Early on, the targeted benefits were primarily financial - companies wanted to lower the costs of routine operations such as library synthesis and in vitro screening. Rather than carrying large fixed overhead, companies see costs become variable with the ebb and flow of projects.
In the past couple of years, we have seen a push toward tighter relationships with shared, risk-based milestones. Companies are even pursuing crowd-sourcing approaches for new molecular entities and problem solving. The idea is that a broader, more diverse community of experts will stimulate innovation, leading to an increase in the number of pipeline candidates.
TN: Does Big Data play a part in externalization?
Elliot: Honestly, I am not a fan of “big data” as a discrete concept. What is considered “big data” now will be “small data” next year. It is relative. If you consolidate hundreds of small datasets into one, isn’t it still just “data”?
That being said, we are seeing larger and larger datasets, such as those generated by next-generation sequencing. NGS data has to be distributed across a virtualized research community, particularly for translational medicine. The desire of a few pharmaceutical companies for the complete DNA sequences of early-stage clinical trial participants will create many petabytes of data needing to be transmitted, stored, and analyzed across the globe. A number of tools common in the broader IT community facilitate large data movement. Several major pharmaceutical companies are exploring their applicability, as well as the use of the cloud for storage and for high-performance computing.
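The "many petabytes" claim can be sanity-checked with back-of-envelope arithmetic. Both figures below are assumptions, not data from the interview: roughly 100 GB of raw reads for one 30x whole-genome sequence is a commonly cited ballpark, and the participant count is hypothetical:

```python
# Back-of-envelope sizing for trial-wide sequencing data.
# Both figures are illustrative assumptions only.
GB_PER_GENOME = 100       # ~30x whole-genome raw reads, rough ballpark
participants = 50_000     # hypothetical enrollment across many trials

total_gb = GB_PER_GENOME * participants
total_pb = total_gb / 1_000_000  # decimal units: 1 PB = 1,000,000 GB
print(f"{total_pb:.0f} PB")      # prints "5 PB"
```

Even modest assumptions land in the petabyte range, which is why distribution and storage, not sequencing itself, become the bottleneck.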
TN: Is externalization specific to the biotech/pharma industry or are other labs taking advantage of the trend?
Elliot: As with many initiatives of this type, biopharma is taking a lead position, but we feel it will inevitably impact all industries that invest heavily in R&D. Take China, for example, where the number of PhDs graduating annually in science and engineering has surpassed both the US and Europe. While many of these graduates once moved to the West and stayed, a trend is forming of many moving back to start their own laboratories.
Companies must not only find ways to tap into this growing pool of expertise, but also need to tune products to local requirements as consumer demand increases. This will necessitate an overall increase in global research networks.
TN: What are some of the key components of externalization that a Lab Manager should consider?
Elliot: The biggest mistake we see is that, in the pursuit of project milestones, teams form relationships with partners and contractors without involving IT and other operational groups. These groups are often surprised and are asked to expeditiously integrate the contractors' data. Most often, the lab does not appreciate the complexity of the problem from the security, access, transmission, logistics, and data conversion points of view. Trying to convert data from twenty different contractors - each with its own vocabularies and formats - is quite daunting. It is not unheard of that, by the time IT resolves all the issues, the contractor has been tried, dropped, and replaced with someone else. The growing number of laboratory contract brokers has exacerbated this problem.
My advice is to evaluate a potential relationship not just from a scientific perspective, but to consider everything that goes around it. Involve IT early and often in the discussions to develop a strategy for data integration. If material (e.g. drug substance) is to be transferred between locations, involve compound management and the customs team. Develop a supportable financial model and the metrics for measuring performance. In the end, the total costs and process delays might be larger than you think, outweighing the targeted benefits.
Listen to Mr. Elliot and other industry leaders at the upcoming conferences.