The Lab of the Future: Artificial Intelligence, Machine Learning, and Microfluidics
Article Nov 27, 2017 | by Laura Elizabeth Mason, Science Editor, Technology Networks
Today’s laboratories look very different from those of ten years ago, but what about ten years from now?
The way in which scientists approach research has been drastically influenced by the advancement of technology. The integration of robotics and automation has revolutionized procedures, transforming tedious manual processes into automated liquid-handling systems.
The introduction of microfluidics and lab-on-a-chip technologies, the adoption of paperless workflows, and the ever-increasing interest in cloud computing, machine learning, and artificial intelligence (AI) are just a few of the factors instrumental in the transformation of the laboratory, influencing procedure efficiency, reproducibility, data collection, analysis and sharing, and much more.
Charles Fracchia, founder and CEO of BioBright presents a TEDx talk on ‘Smart Laboratory Tools of the Future’.
Microfluidics and lab-on-a-chip technologies
Microfluidics enables the manipulation and analysis of extremely small fluid volumes (10⁻⁹ to 10⁻¹⁸ litres) within a multichannel system. The capacity to downsize large-scale biology, coupled with the capability of housing multiple experiments on a single chip small enough to fit in the palm of your hand, is an attractive concept.
The ability of microfluidics to miniaturize multiple laboratory operations is reflected in its use across a multitude of scientific fields, including genomics, materials science, molecular biology, and organic chemistry. Microfluidic technologies offer several advantages. Firstly, very little sample is required, meaning precious or scarce samples can be conserved. The volume of reagents is also significantly reduced compared to traditional large-scale analyses, translating to an overall cost saving. Another benefit is that the miniaturized system can still achieve high-resolution analysis whilst maintaining sensitivity. The nature of the channels (high surface-to-volume ratio) means that reagents rapidly diffuse into the reaction chamber, reducing the time it takes to complete a reaction and generating results faster. Finally, lab-on-a-chip systems can be automated and standardized, meaning there is little need for human intervention, reducing the risk of ‘human error’.1,2 This is particularly beneficial considering the continued evolution of robotics and automation within laboratories.
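The speed-up from miniaturization can be made concrete with a back-of-the-envelope calculation. For pure diffusion, the time for a molecule to cross a distance L scales as t ≈ L²/2D, where D is its diffusion coefficient. The sketch below compares a centimetre-scale vessel with a 100 µm microfluidic channel; the value of D (5 × 10⁻¹⁰ m²/s, typical for a small molecule in water) is an illustrative assumption, not a figure from the article.

```python
def diffusion_time(length_m, diffusivity=5e-10):
    """Approximate time (s) for a molecule to diffuse across `length_m`,
    using the 1-D estimate t = L^2 / (2D). D defaults to ~5e-10 m^2/s,
    a typical small-molecule diffusivity in water (assumed value)."""
    return length_m ** 2 / (2 * diffusivity)

# A 1 cm benchtop vessel vs. a 100 µm microfluidic channel:
t_bench = diffusion_time(1e-2)    # on the order of 1e5 s (roughly a day)
t_chip = diffusion_time(100e-6)   # on the order of 10 s
print(f"benchtop: {t_bench:.0f} s, chip: {t_chip:.1f} s")
```

Because the timescale falls with the square of the channel width, shrinking a reaction chamber by a factor of 100 cuts diffusive mixing time by a factor of 10,000, which is the physical basis of the faster results described above.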
Automation and robotics
To merely say ‘technology has changed the way we conduct research’ is an understatement— it has truly transformed it.
Robotics has enabled significant restructuring of the workplace. Many areas traditionally reserved for ‘wet lab’ benchwork have been reallocated to liquid-handling robots capable of processing hundreds, if not thousands, of samples through pre-programmed and customizable procedures, requiring limited supervision. Such automation allows researchers to invest more time in data analysis. An important point to note is that the implementation of robotic instruments does require ‘fit for purpose’ workflows and data storage solutions to ensure one’s ability to sort, organize, and access the data. This challenge has been a major focus for cloud-based data storage, AI, and machine learning developers over the last few years.
“Our capability as humans to generate vast volumes of data and to look to find innovation out of that data is unprecedented in history, and this presents us with some extraordinary challenges from an informatics perspective, in terms of what we do. How do we manage this data? Analyze this data?” Christian Marcazzo, Vice President, Informatics, EMEA & India, PerkinElmer, Inc.
Cloud computing, artificial intelligence, and machine learning
Cloud computing provides researchers with easy yet secure access to their data, allowing multiple research groups to access data in real time and broadening collaborative capabilities across the world.
“The emergence and progression of cloud computing, the fact that more and more computing that we do isn’t happening inside our data centers… but is happening on the web, and the capability and potential that comes from these cloud computing capabilities, can really transform our ability to handle large volumes of data.” Christian Marcazzo, Vice President, Informatics, EMEA & India, PerkinElmer, Inc.
Technological capabilities are continually evolving, fundamentally impacting the way research is conducted. Cloud computing has made collaboration easier than ever before. The need for more efficient data-handling systems has translated into a surge in the development of machine learning and AI solutions.3 Strong collaborative effort and investment is currently being applied to AI research.4
“The field of artificial intelligence has experienced incredible growth and progress over the past decade. Yet today’s AI systems, as remarkable as they are, will require new innovations to tackle increasingly difficult real-world problems to improve our work and lives,” John Kelly III, IBM Senior Vice President, Cognitive Solutions and Research, speaking in a recent press release.
Machine learning allows researchers to interrogate both structured and unstructured data more thoroughly through self-learning algorithms; the key is to access as much data as possible. By incorporating text analytics and machine learning into research workflows, it is now much easier to scrutinize data, drive hypotheses forward, and establish future research directions.
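As a minimal illustration of the text-analytics side of this, the sketch below scores terms in free-text lab notes with TF-IDF, a standard weighting that surfaces the words distinguishing one record from the rest of a corpus. The note strings are invented examples, and a real workflow would use a dedicated library such as scikit-learn rather than this toy implementation.

```python
import math
from collections import Counter

# Hypothetical free-text lab-notebook entries (invented for illustration):
notes = [
    "cell viability dropped after reagent batch change",
    "reagent batch change correlated with low yield",
    "instrument calibration drift observed on monday",
]

def tfidf(corpus):
    """Return, per document, a dict mapping term -> TF-IDF score.
    TF = term count / document length; IDF = log(N / docs containing term)."""
    docs = [doc.lower().split() for doc in corpus]
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    return [
        {term: (count / len(doc)) * math.log(n / df[term])
         for term, count in Counter(doc).items()}
        for doc in docs
    ]

scores = tfidf(notes)
# A term unique to one note ("viability") outscores a term shared across
# notes ("reagent"), flagging it as more distinctive of that record:
print(scores[0]["viability"] > scores[0]["reagent"])
```

Clustering or classifying these weighted vectors is where machine learning proper takes over, grouping related records or flagging anomalies across thousands of entries rather than three.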
Microfluidic technologies have transformed the way experiments are conducted, reducing both scale and cost. Cloud computing, AI, and machine learning have now made it far easier to access, share and analyze data. When it comes to laboratory evolution, great strides have been made over the past decade, and further technological advancements are sure to bring us even closer to a fully automated ‘intelligent lab of the future’.