

How High-Throughput Screening Is Transforming Modern Drug Discovery

Multichannel pipette dispensing samples into a 96-well plate during high-throughput screening.
Credit: iStock

High-throughput screening (HTS) has come a long way from the early days of robotic plate readers churning through tens of thousands of samples with little more than a binary “hit or miss.” Today, HTS is faster, smarter and far more nuanced, able to evaluate compound libraries not just for activity but also for selectivity, toxicity and mechanism of action within the same workflow.


Modern drug discovery demands speed without sacrificing quality. Pharmaceutical pipelines are under pressure from patent cliffs, escalating R&D costs and the urgent need for more targeted, personalized therapeutics. In this climate, HTS has become both a workhorse and a high-intensity proving ground for scientific ideas that need to hit the ground running.


In the early 2000s, a typical campaign might have screened hundreds of thousands of compounds against a single biochemical target, yielding a mountain of data that was later whittled down to leads. Now, thanks to advances in assay miniaturization, robotics and data analytics, campaigns are leaner and far more precise, sometimes screening a fraction of the compounds while generating far richer data.




To appreciate just how far HTS has come, it is worth remembering the nuts and bolts. Assay plates, for example, continue to form the backbone of any workflow. Liquid-handling robots pipette compounds into wells with astonishing accuracy, cells or enzymes are introduced and plate readers or imaging systems record the resulting interactions. The principle of parallelization remains simple: run as many biological experiments as you can, as fast as you can. What has changed is the depth of information extracted from each well. Where a simple “yes/no” signal once sufficed, researchers can now capture vast, multi-parametric data on morphology, signaling and even transcriptomic changes in the same assay.
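To make that shift concrete, here is a minimal sketch, in Python with entirely simulated plate values and an assumed control layout, of the simplest version of that analysis: normalizing raw plate-reader signals against on-plate controls and flagging wells that cross a hit threshold.

```python
import numpy as np

def percent_inhibition(raw, neg_ctrl, pos_ctrl):
    """Normalize raw signals to percent inhibition, anchored to the
    plate's own negative (0%) and positive (100%) control wells."""
    neg, pos = np.mean(neg_ctrl), np.mean(pos_ctrl)
    return 100.0 * (neg - raw) / (neg - pos)

# Hypothetical 384-well plate (16 x 24): columns 1-22 hold test
# compounds, column 23 negative controls, column 24 positive controls.
rng = np.random.default_rng(0)
plate = rng.normal(1.0, 0.05, size=(16, 24))       # simulated raw signals
plate[:, 23] = rng.normal(0.1, 0.02, size=16)      # strong positive controls
inhibition = percent_inhibition(plate[:, :22], plate[:, 22], plate[:, 23])

hits = np.argwhere(inhibition > 50.0)              # simple 50% cutoff
print(f"{hits.shape[0]} candidate hits on this plate")
```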


“The most exciting change has been the move from flat 2D culture to 3D cell models integrated with high-content screening,” said Dr. Tamara Zwain, a lecturer in pharmaceutical science at the University of Lancashire. “In my own work with 3D blood–brain barrier and tumor models, we saw completely different drug uptake and permeability behaviors compared to 2D culture. For researchers, this means we’re getting results that are much closer to what we’ll see in patients.”


Alongside the biological advances, technical innovations have transformed the front end of HTS. “In the past everything was pipetted by hand, but now that’s almost entirely gone,” Laura Turunen, a laboratory coordinator at the Institute for Molecular Medicine Finland, explained. “Acoustic dispensing and pressure-driven methods with nanoliter precision have become available, making workflows incredibly fast and far less error-prone.”


However, pitfalls remain. Zwain warns that rushing assay setup is “the fastest way to fail later,” stressing that cutting corners during optimization at the expense of robustness almost always backfires. “Many users like to stick to their 96-well plates and absorbance readouts because they’ve always worked before.” Turunen agreed: “It feels risky to adopt something new, especially in academia, where HTS is so expensive.”
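That emphasis on robustness has a standard quantitative counterpart: the Z′-factor (Zhang et al., 1999), which scores how cleanly an assay’s positive and negative controls separate relative to their noise. A brief Python sketch, assuming nothing more than arrays of control signals:

```python
import numpy as np

def z_prime(pos_ctrl, neg_ctrl):
    """Z'-factor (Zhang et al., 1999): values >= 0.5 are widely taken
    to indicate an assay robust enough to take into screening."""
    pos = np.asarray(pos_ctrl, dtype=float)
    neg = np.asarray(neg_ctrl, dtype=float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Well-separated, low-noise controls give a Z' close to 1
print(z_prime([0.10, 0.12, 0.09, 0.11], [1.00, 0.98, 1.03, 0.99]))  # ~0.88
```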

3D cell culture in assay development

Conventional HTS often relied on cells cultured strictly in two dimensions: monolayers grown in flat plates. These were easy to handle, easy to image and ideal for automation, but biologically simplistic. Cells in a petri dish don’t behave quite like cells in a tissue or organ, and drug candidates that looked promising in 2D sometimes failed spectacularly when tested in animals or humans.


Newer cell culture technologies that work in three dimensions, ranging from spheroids and organoids to scaffold-based systems, are bridging this gap. They provide a more physiologically relevant microenvironment, allowing cells to interact in ways that mimic real tissues. This improved realism translates to better predictability of clinical outcomes, especially for complex diseases like cancer and neurodegeneration.




“The beauty of 3D models is that they behave more like real tissues. You get gradients of oxygen, nutrients and drug penetration that you just don’t see in 2D culture,” said Zwain. “When I worked with glioblastoma spheroids, I found that nanocarriers easily got into the actively dividing cells on the outside but struggled with the necrotic core in the middle. That’s exactly the kind of behavior that mirrors what happens in a patient, and it’s why 3D models give us insights that are so much more translatable.”


However, it isn’t a case of out with the old and in with the new. While 3D culture is widely hailed as the future, many labs still use 2D alongside 3D for practical reasons. “3D models are becoming increasingly popular, but in many labs we still see 2D and 3D run side-by-side,” said Turunen. “Because imaging takes so much time, viability readouts are often still the default for 3D experiments. It’s always a balance between realism and practicality.”


One particularly exciting development is the rise of patient-derived organoids, which can be used to test drug responses in a genetically and phenotypically relevant system before clinical trials even begin. “Organoids are going to become a standard part of the pipeline, probably not for the first screening round, but for validation,” said Zwain. “That way you catch variability and resistance early, before spending years on a compound that won’t translate.”


Turunen agreed that patient-derived organoids are likely to move from specialist systems into more routine use, but also urged caution about assuming they will become the sole standard. “As culturing techniques evolve and costs come down, organoids will become mainstream,” she said. “But it’s equally possible that totally different approaches, like microfluidics-based droplets, could leapfrog them. The technology is moving so quickly that we have to stay open-minded about what the dominant model will be.”

Richer, better, faster data

Of course, the best assay and the most realistic model are only as useful as the data you can extract, and this is where automation and detection technologies are truly transforming HTS.


Once dominated by simple absorbance or luminescence readouts, HTS detection has expanded dramatically, from high-content imaging (HCI) and fluorescence resonance energy transfer (FRET) to label-free biosensing and AI-enhanced live-cell imaging that can spot subtle phenotypic changes invisible to the human eye.


Meanwhile, the automation that drives HTS is no longer just about robots moving liquids around. Modern platforms integrate liquid handlers, robotic arms, imaging systems and data capture tools into seamless workflows. Compounds dispensed into wells by robots can be whisked directly to a plate reader, with analysis software immediately processing dose-response curves and identifying potential hits. This kind of end-to-end automation has made HTS not only faster but also far more reproducible and reliable.
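The curve-fitting step described above is typically a four-parameter logistic (Hill) fit. As one illustration, rather than any particular vendor’s software, here is how that might look in Python with SciPy, using made-up data for a single compound:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic (Hill) model for dose-response data."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Hypothetical 8-point, ten-fold dilution series for one compound
conc = 10.0 ** np.arange(-3, 5, 1.0)                      # concentration, e.g., µM
resp = np.array([98, 95, 90, 75, 50, 25, 10, 5], float)   # % activity remaining

params, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 100.0, 10.0, 1.0])
print(f"Estimated IC50: {params[2]:.1f} µM, Hill slope: {params[3]:.2f}")
```

In a fully automated pipeline, a fit like this runs for every compound’s dilution series as plates come off the reader, with hits flagged by potency and curve quality.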



The challenge now is not generating data but interpreting it. Multiplexed assays can produce terabytes of information in a single campaign.



“The first thing I always say to my students is: start with a clear biological question. Then build your assay around that,” said Zwain. “Use tiered workflows. Broad, simple screens first, then save the deeper phenotyping for the compounds that really deserve it. And invest early in data infrastructure to save you from so many headaches later.”
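Zwain’s tiered approach maps naturally onto a simple data-triage step. The sketch below, in Python with pandas and entirely synthetic results (the column names are invented), shows the shape of it: a broad primary cutoff first, then a counter-screen filter and ranking before any expensive phenotyping.

```python
import numpy as np
import pandas as pd

# Synthetic primary-screen table: one row per compound
rng = np.random.default_rng(1)
primary = pd.DataFrame({
    "compound_id": [f"CPD-{i:05d}" for i in range(5000)],
    "pct_inhibition": rng.normal(5, 15, 5000),   # most compounds inactive
    "cytotox_flag": rng.random(5000) < 0.03,     # crude counter-screen flag
})

# Tier 1: broad, simple activity cutoff
tier1 = primary[primary["pct_inhibition"] > 50]

# Tier 2: drop counter-screen liabilities, then rank the survivors for
# deeper (e.g., high-content or organoid) phenotyping
tier2 = tier1[~tier1["cytotox_flag"]].sort_values("pct_inhibition", ascending=False)
print(f"{len(primary)} screened -> {len(tier1)} primary hits -> {len(tier2)} for follow-up")
```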


AI and machine learning are already proving valuable in this space. “It remains to be seen, but pattern recognition is one area where machine learning could really shine, especially for imaging data,” noted Turunen. “It has the potential to help us analyze complex images far more effectively than manual methods ever could.”
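As a flavor of what that pattern recognition might look like, here is a minimal scikit-learn sketch: a classifier trained on the kind of per-well morphological features an image-analysis pipeline could extract. The features and labels below are random placeholders, so the score will hover around chance; the point is the workflow, not the numbers.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder feature matrix: one row per well, columns standing in for
# image-derived morphology measures (area, eccentricity, texture, ...)
rng = np.random.default_rng(2)
X = rng.normal(size=(400, 12))
y = rng.integers(0, 2, size=400)   # 0 = DMSO-like, 1 = active phenotype

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")  # ~0.5 on random data
```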

High throughput, higher potential 

HTS is no longer a question of making things quicker; it’s about providing higher value through precision, context and translatability. The convergence of 3D biology, advanced detection and integrated automation is creating a feedback loop where each innovation fuels the others.


“If we fast-forward to 2035, I think HTS will be almost unrecognizable compared to today,” predicted Zwain. “We’ll be running organoid-on-chip systems that connect different tissues and barriers, so we can study drugs in a miniaturized ‘human-like’ environment. Screening will be adaptive, with AI deciding in real time which compounds or doses to try next.”




“Over my career, HTS has shifted from massive library screens to smaller, more focused efforts,” added Turunen. “By 2035, I expect AI to enhance modeling at every stage, from target discovery to virtual compound design. Add in quantum computing, and molecule predictions could become so accurate that wet-lab screening is reduced, cutting waste dramatically.”


So the future of HTS looks hopeful: increasingly modular, adaptive, personalized and, with reduced waste, far more sustainable. But for all the technological advances, the core principle remains the same: the faster and more accurately we can identify promising compounds, the sooner they can move into development, and the sooner patients might benefit.


The tools may be unrecognizable compared with those of twenty years ago, but the mission to find the right molecule for the right target remains steadfast. To finish, I leave you with a question: Will the future of HTS be as much digital as it is biological?