Make Data Work for You in the Race To Manufacture Effective Vaccines
The world continues to look to the scientific community for solutions to the COVID-19 pandemic. Whether you’re a contract development and manufacturing organization (CDMO) or a vaccine manufacturer, demand for novel therapeutics is higher than ever in today’s crisis. Companies without a clear data strategy risk delaying, or even blocking, a therapy’s route to market. Good data management practices for scientists and leaders in every department are essential to ensure data quality and product safety, while keeping production costs down and reducing time to market – now and in the future.
Current government mandates such as social distancing and stay-at-home orders mean that many companies have cut on-site staffing to 50%, reducing productivity at the bench. Companies actively working to find an effective therapy against the SARS-CoV-2 virus must strike a balance between continuing their work and keeping employees safe. Many organizations have implemented shift working for scientists who need to be in the lab, while most other employees work from home; although some areas have begun reopening, the threat of a second wave of COVID-19 keeps remote work desirable for many organizations.
If the vaccine and outsourcing markets were busy before, that is nothing compared to the pressure today. With fewer staff coming into the lab, organizations must prioritize their projects – putting non-essential work on hold, taking on only the most critical new projects and completing ongoing ones as close to the original timeline as possible. This highlights the need for a streamlined approach to data management, so that patients can get the medication they need as quickly as possible.
For CDMOs and vaccine manufacturers, saving time and cost comes down to streamlined workflows, accuracy and safety. Manufacturers need to rethink how departments capture, manage and store their data – without compromising on quality – to keep vaccine testing and production costs down while reducing time to market.
In this article, we’ll explore how an effective digital strategy can help vaccine manufacturers and CDMOs get their products to market faster. We discuss how the outbreak has presented an opening for companies to take a step back, re-evaluate their priorities and take advantage of the down-time to learn about new technologies and gain insight on the complex challenges of the industry.
Traditional data capture can jeopardize product safety
In the lab, pen and paper have been the go-to tools to record experiment data for generations, and they were satisfactory while the amount of data generated was manageable. Scientists working to develop a vaccine today are under pressure to produce novel products faster, and with continuous streams of data from batch records, experiment parameters, quality controls and calibration records from numerous instruments, pen and paper are simply inadequate.
Manually tracking data in Excel spreadsheets and transferring it between multiple systems creates room for errors, putting data integrity at risk. For example, a Wall Street trader reportedly typed ‘b’ for billion instead of ‘m’ for million, contributing to one of the biggest intraday stock market drops in history. Imagine how this kind of error could affect biologics. Biopharmaceuticals are already far more fragile than their chemical counterparts – the slightest variation in environmental conditions, batch materials or equipment used during the manufacturing process can significantly affect the quality and robustness of the product, making it unsafe for public use. For example, a study showed that cetuximab, a monoclonal antibody used to treat metastatic colorectal cancer, caused severe hypersensitivity in patients. This adverse effect was traced to a manufacturing variation in which a specific type of glycan was attached to the protein structure.
Poor data management is costly in vaccine manufacturing
Misplacing a single digit in a logbook can create a ripple effect down the production line. The 1-10-100 rule on data quality, developed by George Labovitz and Yu Sang Chang, highlights how the cost of an error escalates: $1 to prevent it, $10 to correct it, and $100 for each piece of data left uncorrected. That data is bound to be used by other departments, such as bioanalytical testing and the downstream team, and the longer the error goes undetected, the worse the consequences. Finding and reporting an error can take two weeks or more; if the quality team gets involved, it can set a company back two months, tying up people and resources to fix the mistake and prevent a recurrence. Valuable time is wasted, costing money and market access.
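The 1-10-100 escalation can be made concrete with a minimal sketch (the per-record dollar figures are the rule’s illustrative values, and the record count is a hypothetical example):

```python
# Illustrative sketch of the 1-10-100 rule: the cost of a bad record grows
# tenfold at each stage it survives (prevention -> correction -> failure).
COST_PER_RECORD = {"prevented": 1, "corrected": 10, "uncorrected": 100}

def error_cost(records: int, stage: str) -> int:
    """Total cost in dollars for `records` erroneous records handled at `stage`."""
    return records * COST_PER_RECORD[stage]

# Hypothetical batch of 50 bad records, caught at each stage:
print(error_cost(50, "prevented"))    # 50
print(error_cost(50, "corrected"))    # 500
print(error_cost(50, "uncorrected"))  # 5000
```

The point of the rule is the tenfold multiplier between stages, not the absolute figures: whatever the true per-record cost, catching the error at capture time is an order of magnitude cheaper than fixing it downstream.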
Moreover, scattering data across different locations makes it hard to access and can prevent effective communication between teams. For instance, if the upstream team cannot see information about product attributes and share it with the formulations team, the latter cannot identify adverse trends and move proactively to prevent batch failures. In fact, poor data quality, missing data and redundant data can cost a company 15-20% of its operating budget through delays and re-work.
Taking advantage of this time to revamp data strategies
Manufacturers need to upgrade their data management to suit their current needs. Amid a pandemic, this new way of working gives labs the opportunity to compare what they’re getting from their data with what they’d like to get from it.
With fewer new projects and more time at a computer, scientists have a chance to analyze their data and re-think the way they manage it. They can take a step back, re-evaluate their strategies, formulate a plan and wrap up their ongoing experiments from home.
Scientists completing experiments and analyzing data need access to all the relevant information to form an accurate and complete picture, draw insights and make the necessary decisions.
A small team of scientists spends about 2,000 hours a year checking the quality and completeness of their data. Digitizing data management right now can help organizations hit the ground running once work goes back to normal. Better yet, by implementing a modern scientific informatics platform, any new projects will benefit down the line – shortening overall timeframes and reducing workload.
Another reason now is a good time to consider digitalization is to maintain communication between those working in the lab and those working from home. Scientists who are used to working at the bench may take some time to adjust to working at a computer all day. Opening lines of communication so that scientists feel engaged can make a world of difference for their morale and productivity. If ever there was a time to concentrate on getting the most out of data, it’s now.
Cloud software ensures data integrity and quality
Moving data management to the cloud removes many of these pain points. With an integrated data management system in the cloud, such as IDBS’ E-WorkBook Cloud, all data is captured automatically and in a single location, streamlining the process, removing the element of human error and safeguarding data integrity. Scientists can easily access the information they need when they need it, and with validation checks, they can be sure the data is accurate and complete.
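As a rough illustration of what an automated validation check can do at capture time, here is a minimal sketch – the field names, limits and record shown are hypothetical, and this is not IDBS’ actual API:

```python
# Hypothetical validation check of the kind a cloud platform can run
# automatically when a record is captured. Fields and limits are invented
# for illustration only.
REQUIRED_FIELDS = {"batch_id", "ph", "temperature_c"}
LIMITS = {"ph": (6.8, 7.4), "temperature_c": (35.0, 39.0)}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    for field, (lo, hi) in LIMITS.items():
        value = record.get(field)
        if value is not None and not (lo <= value <= hi):
            errors.append(f"{field}={value} outside [{lo}, {hi}]")
    return errors

record = {"batch_id": "B-042", "ph": 7.9, "temperature_c": 37.0}
print(validate_record(record))  # ['ph=7.9 outside [6.8, 7.4]']
```

Flagging the out-of-range pH at the moment of entry, rather than weeks later during review, is exactly the shift from the $100 correction to the $1 prevention described above.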
This is especially useful when it comes to demonstrating data integrity to regulators and meeting GxP compliance demands. Instead of searching through mounds of historical data, the right tools put everything needed to compile a context-rich report quickly at hand. For example, an organization with 94 scientists spends 8,000 hours a year on searching and reporting alone; a digital platform can halve that, to 4,000 hours a year.
Using data to improve collaboration and minimize unexpected setbacks
An end-to-end scientific informatics platform enables teams to access information and make informed decisions based on the data. A data-centric focus enables a Quality by Design approach, in which GxP-compliant processes are built in to account for every variation in the manufacturing process and ensure product quality at every stage, rather than verifying quality only at the end. This way, manufacturers can ensure consistency between product batches and have more control over the timeline.
Cloud software can also provide a faster mechanism for regulatory bodies to access data for the “rolling reviews” that enable fast-tracked drug candidates. E-WorkBook’s granular security model can easily allow regulators to see and audit only the data they need, as fast as the lab produces it, for breakthrough approvals.
The EMA (European Medicines Agency) is helping R&D organizations by providing scientific advice faster – cutting the average turnaround from 60 days to just 20. Giving internal auditors and external regulatory bodies ready access to data lets them accelerate the approvals process – fast-tracking drugs, issuing scientific advice and speeding up trials – something sorely needed during a pandemic.
Boost data accuracy and gain insight
A well-designed process, built on high-quality data sets, helps to avoid costly errors and re-work.
Automating data collection increases speed and accuracy and means manufacturers can adjust their approach to data: the focus shifts from capturing data to analyzing it and drawing insight. Routinely assessing and learning from data is something regulators such as the EMA and the FDA (Food and Drug Administration) now expect. Not only will this approach result in a quality product for the market sooner and reduce the overall workload, it is also one that regulatory bodies can applaud.
A holistic scientific informatics platform facilitates data capture with context, boosts team collaboration and information sharing, highlights complex trends and safeguards data integrity. All movement is logged, and variations are flagged at the source, before they become issues. Vaccine manufacturers and CDMOs end up with reproducible and consistent data that they can use to drive decisions. Moving data management to the cloud can help biopharmaceutical manufacturers get their therapies and vaccines to market faster. And today, more than ever, that’s what patients really need.
About the author:
Roman Vincent is the director of strategy and innovation at IDBS.