The Transformative Power of Insights in Biopharma Development: Why a Digital Backbone Matters
Discover how a digital backbone can help to realize the full potential of data and generate meaningful insights.
The industry as a whole is on the digital transformation path, but individual organizations often implement fragmented software tools that mimic manual processes in digital form rather than rethinking their objectives in a digital paradigm. The result is disparate, disjointed systems that require manual intervention to move data between them, defeating the purpose of the very tools meant to support a digital transformation strategy. Globally adopted initiatives that combine scientific potential with business needs are key to truly transforming a business digitally.
Fundamental to delivering a drug to market is the full story of how a drug was designed, developed, validated, tested, packaged and released. Digital tools can facilitate this, but can the associated data be reused, re-purposed and leveraged to drive innovation? While data still live in dead-end repositories, is the industry able to truly harness the power of the large volumes of data it creates?
A myriad of experiments, instruments and reports
Figure 1: From laboratory to patient - key development stages and their associated documentation required to take a drug to market. Credit: Courtesy of IDBS.
When reviewing even just the pharmaceutical development section of the IND document (Section 3.2.P.2 Pharmaceutical Development),3 the number of systems, instruments, experiments and reports (Table 1) that come together to demonstrate the effectiveness of the process development (PD) stage of drug development should not be underestimated.
Table 1: Non-exhaustive list of example reports compiled within the process development of biological therapeutics.
Process and product understanding required
Cell line history
- Genealogy of cell line
- Conditions under which cell line is developed
- Performance and stability of cell line clones
- Genealogy of chosen cell line cultures
- Steps and conditions under which cultures are grown
- Reagents, consumables and equipment
- Culture performance, product expression and product yield
- Genealogy of process intermediates generated from chosen culture
- Steps and conditions under which material is purified
- Reagents, consumables and equipment
- Purity and desired drug composition and formulation
- Qualitative and quantitative methods used for analysis
- Reagents, consumables and equipment
- Process sampling step and relevant sample properties
- Process sampling step and relevant sample results
Exceptions and errors
- Errors encountered during development and processing
- Errors encountered during analytical development
- Intentional and unplanned deviations and reasons
According to the guidelines, studies are expected to provide the “basis for process improvement, process validation, continuous process verification…and any process control requirements”4 by detailing:
- Robust unit operation pathways
- Process and critical parameters for each unit operation
- Cell line, product and process-related impurities
- Qualified analytical methods suitable for this process
- Compliance and acceptability of materials, consumables and equipment
This is a huge ask. Given the demands on time and the variety and volume of data at each step, digital tools that enable early decision-making are highly advantageous. How quickly scientists can gather these insights directly affects the robustness of process design for scale-up and the ability to tighten development timelines.
Finding the needle in the haystack
Within relevant teams, data may need to be sliced and diced based on different objectives, experimental conditions and scales, projects using the same cell lines, versions of the molecule used for different targets, etc. If all this information lives in diverse systems and platforms, how do scientists gain the process understanding they’re looking for?
Often, an industry-agnostic tool, such as MS Excel, is used to compile and analyze these data; it provides a platform to collate data easily but requires significant human intervention and skill to sanitize, format, consolidate, pivot and chart the data. The collation itself is often manual and after the fact (i.e., non-contemporaneous) too, via copying and pasting from disparate systems.
Analyzing large volumes of data also presents performance challenges. With PD organizations now actively using high throughput instruments, such as miniature parallel bioreactors, the amount of data generated from even one experiment can require significant manipulation and analysis. 24 parallel mini bioreactors running over a 14-day period generate process monitoring data every 30–60 seconds; with sampling performed twice a day across cell health and metabolite analyses, a single experiment can yield hundreds of thousands of data points. Regardless of in-built data historians and proprietary control software, scientists ultimately need an external system in which to combine execution information with parameter and result data while also assessing the performance and health of the cell culture during the run.
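A back-of-envelope calculation reproduces the order of magnitude quoted above. The sampling parameters here are illustrative assumptions, not the specification of any particular instrument:

```python
# Estimate of data volume from a bank of miniature parallel bioreactors.
# Assumed (illustrative) parameters: 24 reactors, 14-day run, one online
# reading every 60 s, plus 2 offline samples/day with ~10 results each.

REACTORS = 24
DAYS = 14
READ_INTERVAL_S = 60        # online process-monitoring frequency
SAMPLES_PER_DAY = 2         # offline cell health / metabolite sampling
RESULTS_PER_SAMPLE = 10     # assumed analytes reported per sample

online = REACTORS * DAYS * 24 * 3600 // READ_INTERVAL_S
offline = REACTORS * DAYS * SAMPLES_PER_DAY * RESULTS_PER_SAMPLE

print(online)   # 483840 online readings at a 60 s interval
print(offline)  # 6720 offline results
```

At the faster 30-second interval the online figure doubles, so "hundreds of thousands of data points" from one experiment is, if anything, conservative.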
Work smarter, not harder
This is where a digital backbone proves invaluable. By automatically assimilating instrument data together with process execution data, giving the raw readings true context without any manual facilitation, structured and contextualized data become readily available for analysis. Ensuring data are correctly structured, accessible and exchangeable allows analysis to be performed promptly and data to be visualized with industry-relevant data analysis tools.
Furthermore, the use of robust digital data collection strategies gives the opportunity to reuse these data and apply advanced analytics tools, such as artificial intelligence and machine learning, and in silico modeling, such as digital twins, to optimize biopharma development. According to the CEO of Novartis, Vas Narasimhan, it can often take “years just to clean the datasets,” and it can prove difficult to “clean and link the data.”5 Having a digital backbone built to improve connectivity and eliminate silos can significantly reduce this time burden.
Returning to the miniature parallel bioreactors, often used in design of experiment (DoE) activities, establishing a digital data backbone allows high-volume process and product data to be combined to generate more powerful insights automatically. From understanding and optimizing processes to reducing waste and determining the robustness of a designed process, this can shave hours to weeks off a scientist’s time and allow early analysis of ongoing processes to enable data-led strategic decisions. DoE studies are an integral part of PD, providing “justification for establishing ranges of incoming component quality, equipment parameters and in-process material quality attributes” in line with current good manufacturing practices.6
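To ground the DoE point, a full-factorial design is simply the set of all combinations of the chosen factor levels, each run mapping onto one mini bioreactor. The factors and levels below are assumed for illustration, not taken from any specific study:

```python
from itertools import product

# Minimal full-factorial DoE sketch for a parallel mini-bioreactor bank.
# Factors and levels are illustrative assumptions.
factors = {
    "temperature_C": [34, 37],
    "pH_setpoint": [6.8, 7.0, 7.2],
    "feed_rate": ["low", "high"],
}

# Each run is one combination of factor levels: 2 * 3 * 2 = 12 runs,
# which fits comfortably within a 24-reactor system (with replicates).
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 12
```

With a digital backbone, each of these run definitions can be linked directly to the contextualized process and product data it generates, so the design, execution and results of a DoE study remain traceable as one dataset.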
The rise of new biological modalities, such as cell and gene therapies and antibody–drug conjugates, is emphasizing the need for digital acceleration. Given the complexity and long turnaround times of these therapies, legacy tools no longer serve growing development needs. A central, accessible repository that lets data be surfaced and exchanged easily at product and data hand-off points is therefore critical.
The average cost of getting a drug to market is over $2 billion7 and it is estimated that up to 90% of drugs fail in the clinic.8 Addressing the challenges associated with developability, clinical efficacy and toxicity earlier in the drug development process could increase the likelihood that therapeutics succeed in the clinic and speed up time to market. To do so, the industry needs to extract the maximum potential from its available data to answer pertinent process-relevant questions earlier.
With high-quality data in structured and contextualized formats and bidirectional traceability, a digital backbone realizes the full potential of the data and generates meaningful insights by leveraging advanced analytics tools. It underpins the journey from digital enablement to digital transformation, facilitating early strategic decision-making to generate high-yielding, high-quality products and ultimately get therapies to patients faster.
About the author:
Unjulie Bhanot is the head of BPLM (biopharma lifecycle management) solutions, and part of the IDBS strategy team based in the UK. With over 10 years of experience in the biopharma informatics space, she now owns the strategy, development and delivery of IDBS Polar solutions.
Prior to joining IDBS, Unjulie worked as an R&D scientist at both Lonza Biologics and UCB and received a BSc in Biochemistry and MSc in Immunology from Imperial College London.
1. Kudumala A, Konersmann T, Israel A, Miranda W. Biopharma digital transformation: Gain an edge with leapfrog digital innovation. Deloitte Insights. https://www2.deloitte.com/us/en/insights/industry/life-sciences/biopharma-digital-transformation.html. Published December 8, 2021. Accessed September 27, 2023.
2. US Food and Drug Administration. The Comprehensive Table of Contents Headings and Hierarchy (Version 2.3.3). https://www.fda.gov/media/76444/download. Published November 9, 2020. Accessed September 27, 2023.
3. Committee for Medicinal Products for Human Use. ICH guideline M4 (R4) on common technical document (CTD) for the registration of pharmaceuticals for human use – organisation of CTD. European Medicines Agency. https://www.ema.europa.eu/en/documents/scientific-guideline/ich-guideline-m4-r4-common-technical-document-ctd-registration-pharmaceuticals-human-use_en.pdf. Published March 19, 2021. Accessed September 27, 2023.
4. European Medicines Agency. Note for guidance on pharmaceutical development. https://www.ema.europa.eu/en/documents/scientific-guideline/note-guidance-pharmaceutical-development_en.pdf. Published June 2009. Accessed September 27, 2023.
5. Shaywitz D. Novartis CEO Who Wanted To Bring Tech Into Pharma Now Explains Why It’s So Hard. Forbes. https://www.forbes.com/sites/davidshaywitz/2019/01/16/novartis-ceo-who-wanted-to-bring-tech-into-pharma-now-explains-why-its-so-hard/#5ac5288d7fc4. Published January 16, 2019. Accessed September 27, 2023.
6. US Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Center for Biologics Evaluation and Research, Center for Veterinary Medicine. Process Validation: General Principles and Practices. https://www.fda.gov/files/drugs/published/Process-Validation--General-Principles-and-Practices.pdf. Published January 2011. Accessed September 27, 2023.
7. Deloitte. Pharma R&D return on investment falls in post-pandemic market. https://www2.deloitte.com/uk/en/pages/press-releases/articles/pharma-r-d-return-on-investment-falls-in-post-pandemic-market.html. Published January 9, 2023. Accessed September 27, 2023.
8. Sun D, Gao W, Hu H, Zhou S. Why 90% of clinical drug development fails and how to improve it? Acta Pharm Sin B. 2022;12(7):3049–3062. doi: 10.1016/j.apsb.2022.02.002.