Miniature Columns, Massive Insights: The Transformative Potential of Automation for Purification
When it comes to biopharmaceutical manufacturing, the process defines the product. Optimizing a bioprocess, however, is a remarkably complex and challenging task.
Cultured cells, for example, are often used to create biopharmaceutical products and are sensitive to a large number of variables associated with bioreactor processes. Downstream impurities must be removed from the cell culture supernatant while maintaining the yield of the protein or viral vector of interest. In addition, these cellular mixtures containing the desired biopharmaceuticals are usually highly complex and heterogeneous, requiring multiple separation steps. To complicate matters further, linking the variety of data formats and data sources generated from these processes and deriving insights from them is a beast of its own.
While processes like optimizing protein purification are critical to the efficacy, safety and success of a biopharmaceutical product, they are typically inefficient and arduous, thereby contributing to the extensive drug development timelines that have become normalized within the industry.
Fortunately, there is light at the end of the tunnel – a novel combination of miniaturized purification technology, automation and software presents an opportunity to drastically shorten drug or vaccine development timelines by rethinking how purification processes are refined.
Miniaturized purification makes process iterations cheaper, faster and more flexible
With the rising investment in protein engineering and lentiviral vector purification, cost-effective miniaturized purification columns have been increasingly adopted by the biopharmaceutical industry over the past decade. Miniature column capacities range between 50 and 600 µL, which reduces the required sample volumes and, in turn, the consumption of expensive source material. Consequently, more experiments can be performed at a lower cost.
Miniaturization also enables parallelization, affording greater flexibility and shorter development timelines. Multiple parameters can be tweaked at once, thereby expanding assay design space. The effects of these different parameters on protein purity can be determined much sooner than would be possible with traditional approaches.
Dr Arne Vandenbroucke, senior automation engineer at Synthace, highlights further benefits of miniaturization:
“You can work closer to the upstream process, especially if that process is also miniaturized. You can build a system where you connect your miniaturized bioreactor to your miniaturized purification system to enable combined process optimization.”
This holistic approach to process development has been explored as a way to keep track of dependencies between upstream and downstream processing and is a dramatic contrast to the siloed approaches of the past. Upstream processing, downstream processing and downstream analytics were typically considered somewhat separately, making end-to-end optimization very slow.
Recognizing and removing technical barriers to automation and miniaturization
Despite the advances in automated liquid handling technology and the availability of miniaturized purification columns, a technical barrier to automation has meant that this streamlined approach has, so far, been underutilized.
While many scientists in biotech and pharmaceutical companies will be familiar with the sight of an automated liquid handler dispensing a buffer through its steel tips into miniaturized purification columns, they will also be familiar with the difficulty that lies in editing the scripts used to program these robots. Support “wizards” can certainly help with script creation; however, these aids have their limitations.
It is not uncommon for a biologist to manually edit their scripts, which requires them to read the script line by line and identify where parameters should be tweaked. With this approach, it is difficult to confirm the feasibility of the final script and whether it carries the correct instructions for the liquid handler. Due to this limited flexibility in experimental design, only a small number of parameters are typically studied at once, slowing down bioprocess development and limiting the assay design space.
Another major challenge lies in keeping track of experimental context, an important and mandatory aspect of process development. Traditional automation software has poor traceability features and does not make it easy for biologists to collect and manage the large amounts of metadata associated with each experiment, such as the date, time, user, robot configuration, log and method files and history of the resins and reagents.
Metadata collection is not only important for archiving purposes – it can also provide critical insights into the context of purification results and help provide an explanation as to why or how an event or pattern emerged. However, as collating and retrieving such large amounts of information has traditionally been an arduous or near impossible task, this metadata is often overlooked.
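To make the value of structured metadata concrete, here is a minimal sketch of how a run record might be captured and serialized for later retrieval. All field names and values are illustrative assumptions, not any particular platform's schema:

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class RunMetadata:
    # Illustrative fields only; a real platform would define its own schema
    user: str
    robot_config: str            # e.g. a deck layout identifier
    resin_lot: str
    reagent_lots: list
    method_file: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def to_json(self) -> str:
        # Serialize so the record can be archived and queried years later
        return json.dumps(asdict(self), indent=2)

record = RunMetadata(
    user="a.biologist",
    robot_config="deck_layout_v3",
    resin_lot="RES-2021-044",
    reagent_lots=["BUF-A-17", "BUF-B-03"],
    method_file="protein_a_capture.json",
)
print(record.to_json())
```

When every run emits a record like this automatically, explaining why a purification result looks the way it does becomes a query rather than an archaeology project.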
Software for controlling bioprocess equipment often expects programming experience and, according to Dr Vandenbroucke, performing purification with such tools can be a burden to biologists who have not had time to learn computer coding. There is a clear gap between the current hardware and purification technology and the end user being able to capitalize on all the potential benefits of miniaturization.
Flexible programming platforms are needed to reimagine bioprocess optimization
With the right software support, biologists have the freedom to easily design and track purification experiments for bioprocess optimization. Powerful software features that should be considered a baseline for usability include:
- Cloud-based infrastructure independent of operating system
- A visual interface that is intuitive for biologists and process engineers
- Flexibility that enables sophisticated experiments without complex scripting
- Integration with liquid handling robots that gives the end user fine control of the parameters that matter
- Simulation capabilities that enable experimental previews
- Automated collection and structuring of data
- Straightforward data export processes
5 benefits of modern cloud-based bioprocessing infrastructure
The above software features enable miniaturized purification and automation to be harnessed efficiently, allowing for a new approach to biological experiments. Here, we dive deeper into the top five benefits of modern cloud-based bioprocessing software.
1. Iterate process designs in the cloud from the comfort of your desk, before going to the laboratory
In silico simulations allow biologists to assess an experiment’s feasibility and iterate assay designs before executing these in their laboratory. Using this approach, specific parameters can be checked (for example, the flow rate of an elution buffer), alongside the order of operations, and time and resources required. Biologists will also appreciate being able to carry out the simulation and experiment independently, without relying on an automation engineer.
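As a flavor of what such a pre-run check involves, the sketch below totals buffer demand across a set of miniature columns and flags any reservoir overdraw before the robot ever runs. The step names, volumes and capacity are illustrative assumptions, not values from any real protocol:

```python
# Hypothetical feasibility check run before going to the laboratory:
# does each buffer step fit within the deck reservoir capacity?
steps = [  # (step name, volume per column in µL) - illustrative values
    ("equilibrate", 400),
    ("load",        200),
    ("wash",        600),
    ("elute",       300),
]
n_columns = 48                   # columns run in parallel
reservoir_capacity_ul = 50_000   # assumed per-buffer reservoir volume

for step, vol_per_column in steps:
    total = vol_per_column * n_columns
    status = "OK" if total <= reservoir_capacity_ul else "OVERDRAW"
    print(f"{step:12s} needs {total:6d} µL -> {status}")
```

A full simulation would also check the order of operations and timing, but even a simple resource tally like this catches errors that would otherwise waste a day and expensive source material.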
The ability to design, plan and optimize experiments in silico is particularly valuable whenever laboratory access is limited, such as under remote working policies. Scientists can prepare their protocols at home and share them with colleagues working on site via the cloud, thereby maintaining team productivity and avoiding stalled projects.
2. Work with a user-friendly interface
“As humans, we find it much easier to process a visual representation than to read a script line-by-line”, says Dr Vandenbroucke.
Modern automation software platforms feature intuitive interfaces with visual representations which make it easy to instruct liquid handlers and export data.
3. Bring back randomization and standardize techniques
Throughout the stages of bioprocess optimization, there are many opportunities for systematic or accidental errors to occur. With the right software and automation strategy, these errors can be avoided.
For example, biologists may fall into non-random habits that simplify data alignment: experimental controls may always be placed in the same locations on a plate, which can hide systematic errors such as edge effects. In contrast, randomization can be implemented easily when everything is tracked and aligned by the right software and automation support.
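A randomized plate layout is simple once software tracks which sample went where. The sketch below scatters eight controls across a 96-well plate instead of pinning them to a fixed column; the counts and names are illustrative assumptions:

```python
import random

# Randomized 96-well layout: controls are scattered across the plate so
# edge effects are not confounded with fixed control positions.
random.seed(7)  # fixed seed so the layout is reproducible and recordable

wells = [f"{row}{col}" for row in "ABCDEFGH" for col in range(1, 13)]
samples = [f"sample_{i:02d}" for i in range(1, 89)] + ["control"] * 8

random.shuffle(samples)
layout = dict(zip(wells, samples))  # well ID -> assigned content

control_wells = sorted(w for w, s in layout.items() if s == "control")
print(control_wells)
```

Because the software records the mapping, the apparent downside of randomization, losing track of what is where, disappears, while its statistical benefits remain.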
Automated liquid handlers also allow for a more standardized pipetting technique compared to human operators, while data alignment is far less error-prone when the manual process of shuffling data around an Excel® spreadsheet is removed.
4. Manage data effortlessly with thorough traceability
Without an automated approach, the management of bioprocessing data is a monstrous and cumbersome task. The data must be collected, stored, retrieved and analyzed – and all of this may be required over many years.
Modern cloud platforms comprise many useful features to help scientists overcome these challenges, facilitating far easier data handling. For example, data from an analytical device is automatically aligned with sample liquid handling steps and experimental inputs, so that biologists can easily identify the factors which generate the best yields of purified proteins of interest.
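In essence, this alignment is a join between the liquid-handling record and the analytical readings, keyed by well. The stdlib-only sketch below uses invented data to show the idea; the condition names and yields are assumptions for illustration:

```python
# Align analytical readings with liquid-handling conditions by well ID,
# so each measured yield traces back to the conditions that produced it.
handling = {  # well -> conditions applied by the liquid handler
    "A1": {"resin": "protein_a", "elution_ph": 3.0},
    "A2": {"resin": "protein_a", "elution_ph": 3.5},
    "A3": {"resin": "protein_a", "elution_ph": 4.0},
}
readings = {  # well -> measured yield (mg/mL), e.g. from a plate reader
    "A1": 1.8, "A2": 2.4, "A3": 2.1,
}

aligned = [
    {"well": w, **handling[w], "yield_mg_ml": readings[w]}
    for w in sorted(handling)
]
best = max(aligned, key=lambda r: r["yield_mg_ml"])
print(best["well"], best["elution_ph"])  # → A2 3.5
```

When the platform performs this join automatically across hundreds of wells and runs, identifying the factors behind the best yields becomes a routine query rather than a spreadsheet exercise.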
The ideal platform will have streamlined processes for the collection and storage of metadata, enabling easy access at any time. With the large amounts of data collected over the course of a bioprocessing project, traceability is a key challenge that is now being overcome by automated data and metadata collection.
5. Optimize your bioprocess with greater efficiency
Automated miniaturized purification can be a powerful aspect of high-throughput process development (HTPD) when backed by software designed to support such experiments. When all of the efficiency gains discussed above are considered, the potential time savings are enormous.
Miniaturization allows more variables to be studied at once and, when combined with automation and the right software, enables faster:
- Feedback (helped by rapid robocolumn-based workflows and interactive pseudo-chromatograms which provide an immediate sense of the purification results)
- Iterations of the bioprocess (which is further accelerated by simulation capabilities)
- Data extraction and interpretation.
Compared with traditional systems, cloud-based platforms and modern software packages can be updated without interrupting the user’s workflow.
Software design is key to harnessing the power of automated miniaturized purification to create better therapeutics
To date, advances in miniaturized purification and automation capabilities have been underutilized, largely due to the technical barrier and inconveniences presented by available software packages. However, life scientists can take heart that engineers and scientists are actively working together to develop software that overcomes these barriers and assists biologists every step of the way.
Platforms like Synthace’s Antha will enable scientists to perform more sophisticated, reliable and time-efficient experiments with their chromatography columns, realizing their full potential to enhance biotherapeutics purification process development.
This will in turn translate into better biotherapeutics and higher yields, with shorter development timelines. Hopefully, these improved, life-saving therapies will enter scale-up manufacturing sooner and become accessible to patients faster.
Special thanks to Dr Arne Vandenbroucke, senior automation engineer at Synthace, for providing insights for this article.