Overcoming the Data Analysis Bottleneck in High-throughput Chemistry

Read time: 3 minutes

Whilst some aspects of biochemistry research have become increasingly high-throughput, others remain stubbornly time-consuming; our ability to generate and collect chemical data has accelerated rapidly, but the subsequent analysis is too often laborious and slow. ACD/Labs, the Toronto-based small-molecule chemistry software provider, says that its new Katalyst D2D program can clear this analytical bottleneck. We talked to Andrew Anderson, Vice President of Innovation and Informatics Strategy at ACD/Labs, about how automation is changing the field and how Katalyst aims to speed up pharmaceutical development.

Ruairi Mackenzie (RM): Why is analytical chemistry a bottleneck for high-throughput experimentation?

Andrew Anderson (AA): Analytical chemistry, specifically the characterization of materials generated, is a stubborn bottleneck in productive high-throughput (HT) experimentation. While chemical synthesis can be carried out in high density (96-, 384-, or 1536-parallel reactions), analysis is often still relegated to a one-by-one process. The productivity gains of HT synthesis are lost when a reaction array must be sampled, analyzed, and the results reviewed one at a time. For those who have managed to install a high-throughput LC/MS analysis system, the problem of relating results back to the originating well in the array still exists. Scientists have told us that they spend hours looking through paper traces of chromatograms and mass spectra, making notes as to which data set belongs with which well in a 96-well array. Basing a decision, for example about the success of a particular set of reaction conditions, on the analytical results is both painstaking and laborious.
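
To illustrate the bookkeeping burden Anderson describes, here is a minimal sketch of how results might be mapped back to their originating wells in software. The file-naming convention ("plate7_A01.mzml") and the function names are hypothetical assumptions for illustration only, not ACD/Labs code:

```python
# Hypothetical sketch: associate LC/MS result files with plate wells.
# Assumes files are named like "plate7_A01.mzml"; real instruments and
# LIMS systems vary, so the parsing convention here is illustrative only.
import re
from pathlib import Path

WELL_PATTERN = re.compile(r"plate(?P<plate>\d+)_(?P<row>[A-H])(?P<col>\d{2})")

def index_results(result_dir: str) -> dict:
    """Build a lookup from (plate, well) to the raw data file."""
    index = {}
    for path in Path(result_dir).glob("*.mzml"):
        match = WELL_PATTERN.search(path.stem)
        if match is None:
            continue  # file does not follow the naming convention
        well = f"{match['row']}{match['col']}"
        index[(match["plate"], well)] = path
    return index

# Usage: look up the data file for well B03 on plate 7.
results = index_results("lcms_runs/plate7")
print(results.get(("7", "B03")))
```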

RM: How has analytical chemistry changed in terms of automation in the last five years?

AA: We have seen an explosion in the volume and variety of automated sampling and robotics interfaces. This explosion, coupled with the increasing volume of data files generated by increasingly accurate mass spectrometers, creates a significant data management challenge. Additionally, the sheer volume of experiments executed in HT laboratories strains their supporting analytical facilities. Furthermore, the reliance on open-access systems (meaning chemists are expected to collect analytical data themselves, rather than submitting samples to a core analysis facility) leaves users facing analytical challenges with less support from core analytical chemistry staff. Finally, the increasing pressure on corporate laboratories to outsource certain functions creates a "digital collaboration" challenge, whereby remote colleagues must find effective ways to innovate together, with the added difficulty of doing so across continents and time zones.

RM: Why have you developed Katalyst D2D? How will it help analytical researchers?

AA: Coming from the world of pharmaceutical development myself, I have seen many of the challenges I faced remain issues today: multiple systems, data transcription, and the fear of introducing errors all contribute to reduced productivity. Considering that the industry needs to bring life-changing drugs to market without delay, I am inspired to help our customers with their productivity challenges.

For many years, pharmaceutical R&D organizations have invested in high-throughput, automation, and informatics assets to increase overall productivity. Unfortunately, these assets are not integrated from initial experimental design through to decision-making, which significantly impedes overall organizational productivity. Time that could be spent moving a project forward is instead spent on administrative tasks (copy/pasting and data entry) between the various systems used for high-throughput experimentation, or lost because such systems don't exist at all (as was the case for HT analysis).

Katalyst D2D not only provides design and planning of high-throughput and automated experiments; it also facilitates execution by integrating with high-throughput systems (dispensing equipment, analysis equipment, etc.).

Katalyst is unique in its ability to bring analytical results back into the system from analytical instruments and automatically connect the data to each individual experiment in an HT/parallel array. This not only removes the data analysis bottleneck but also facilitates effective decision-making based on that data, eliminating the tedious manual connection and review of analytical results and helping teams make decisions faster and with greater confidence. Most importantly, Katalyst D2D enables all of this in a single software interface.
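
As a hedged illustration of the kind of automatic result-to-experiment association described above, the sketch below attaches parsed conversion values to well records and flags promising conditions. The data model, field names, and threshold are assumptions for illustration, not the actual Katalyst D2D API:

```python
# Hypothetical sketch: attach analytical results to an HT reaction array
# and flag wells worth pursuing. Data model is illustrative, not Katalyst's.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WellResult:
    well: str                           # plate position, e.g. "A01"
    conditions: dict                    # reagents, temperature, catalyst, ...
    conversion: Optional[float] = None  # % conversion, e.g. from peak areas

def attach_results(array: list, conversions: dict) -> None:
    """Connect each analytical result back to its originating well."""
    for rec in array:
        rec.conversion = conversions.get(rec.well)

def flag_hits(array: list, threshold: float = 80.0) -> list:
    """Return wells whose reaction conditions met the success criterion."""
    return [rec for rec in array
            if rec.conversion is not None and rec.conversion >= threshold]

# Usage: two wells, one successful set of conditions.
plate = [WellResult("A01", {"catalyst": "Pd(OAc)2"}),
         WellResult("A02", {"catalyst": "Pd2(dba)3"})]
attach_results(plate, {"A01": 92.5, "A02": 41.0})
print([rec.well for rec in flag_hits(plate)])  # -> ['A01']
```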

RM: How have you overcome the challenge of uniting disparate hardware and software under one automated workflow, and the associated compatibility issues? 

AA: At ACD/Labs we have a >20-year history of digitizing and uniting data from disparate hardware and software systems into a common platform. ACD/Labs has long-standing relationships with all the major analytical instrument vendors and uniquely offers software that homogenizes multi-technique analytical data in our vendor-agnostic Spectrus platform (with support for >150 data formats).

With Katalyst D2D, we have extended that experience to cover the variety of other hardware and software typically in use. Importantly, since some procedural steps of HT experiments may still require manual effort (e.g., weighing and preparing reagent stock solutions), the software can generate both human-readable and machine-readable procedures.
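
To make the dual-output idea concrete, here is a minimal sketch of one procedure step emitted in both machine-readable (JSON, for dispensing equipment) and human-readable (plain English, for a chemist) forms. The schema is an assumption for illustration; the interview does not specify Katalyst's actual formats:

```python
# Hypothetical sketch: emit one procedure step in both machine-readable
# (JSON) and human-readable forms. The schema is illustrative only.
import json

step = {
    "action": "dispense",
    "reagent": "Pd(OAc)2 stock, 0.01 M in THF",
    "volume_uL": 50,
    "destination": {"plate": "P1", "well": "A01"},
}

machine_readable = json.dumps(step)  # consumed by a dispensing robot

human_readable = (
    f"Dispense {step['volume_uL']} uL of {step['reagent']} "
    f"into well {step['destination']['well']} of plate "
    f"{step['destination']['plate']}."
)

print(machine_readable)
print(human_readable)
```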

Andrew Anderson was speaking to Ruairi J Mackenzie, Science Writer for Technology Networks