Rethinking Target-Based Drug Discovery: Challenges, Innovations and the Next S-Curve
A third wave of drug discovery is emerging where AI and multiomics guide smarter target selection and validation.
Drug discovery has long relied on two key strategies: target-based and phenotypic screening. Recombinant technologies and genomics opened the door to modern target-based drug discovery (TBDD), allowing researchers to identify key genes implicated in disease and screen large compound libraries against defined molecular targets. As sequencing of the human genome edged closer to completion, the number of potential targets grew rapidly, fueling enthusiasm for target-led approaches. This was mirrored in regulatory outcomes. For example, between 1999 and 2013, 69% of the first-in-class approvals granted by the US Food and Drug Administration (40% small molecules, 29% biologics) came from target-based discovery.
However, as the approach was adopted more widely, its limitations became more visible. Many drug discovery programs that looked promising early on struggled to translate later in development. A recent systematic review of over 30,000 studies spanning 150 years found that only ~9% of approved small-molecule drugs were truly discovered through target-based methods, and even many of those relied on off-target mechanisms for their therapeutic effects.
By the mid-2000s, experienced drug discovery leaders began to question whether the industry’s pivot towards target-driven strategies had introduced new bottlenecks. Among them was Dr. David Brown, who has decades of experience in pharmaceutical research and biotech leadership, serving at several top pharmaceutical companies and contributing to multiple successful therapeutic programs. Drawing on S-curve theory (Figure 1), Brown argued in 2007 that after an initial period of rapid progress, target-based discovery had entered a productivity plateau.
As one industry manager observed at the time: “For the past decade, the pharmaceutical industry has experienced a steady decline in productivity, and a striking observation is that the decline coincided with the introduction of TBDD.”
In this article, we examine ongoing challenges shaping TBDD and highlight how new technologies and interdisciplinary approaches are helping to address them.
The enduring challenge of target validation
Even targets backed by solid mechanistic reasoning can fail once they reach patients. The field has become fixated on selecting a single target to treat a disease, even though human biology is far more complex. A drug acting on one target often cannot capture that complexity.
Recombinant systems add further limitations: they don’t reliably reflect human physiology, so a target that behaves well in a cell-based assay or model organism may not translate effectively. This gap between preclinical promise and real therapeutic effect – known as the “valley of death” – remains one of the major bottlenecks in drug discovery.
“In the 1990s, with genomic data increasingly available, pharmaceutical companies changed almost entirely to target-based approaches and abandoned most phenotypic research. That was a gamble that only partly paid off, and in retrospect, I think we can say that a more balanced approach would have been more productive for the industry,” noted Brown.
“Most drugs act on multiple targets, so optimizing for a single target has undoubtedly reduced the rate of breakthrough inventions. Most drugs optimized for a single target later fail in clinical trials, after many years of effort and expenditure,” he added.
Brown’s decades in the field have taught him that human bias plays a major role in how targets are selected, often leading to failures downstream. On top of that, the industry tends to focus on the same targets. Artificial intelligence (AI), however, offers a new approach with reduced bias.
“We can let the data point us to the right targets (plural) and the right molecules. My latest drug, HLX-1502, for the treatment of Neurofibromatosis 1 tumors came from this approach. It is currently in a Phase 2a clinical trial in the USA. We can even let the data design molecules for us,” he said.

Figure 1: S-curve model of technology adoption and performance modes (induction, payback, obsolescence). Adapted with permission from David Brown. Credit: Technology Networks.
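For readers unfamiliar with the framework, the S-curve Brown invokes is commonly formalized as a logistic function (this is a standard textbook form, not an equation taken from Brown’s own analysis):

$$P(t) = \frac{K}{1 + e^{-r(t - t_0)}}$$

Here $P(t)$ is cumulative performance or productivity at time $t$, $K$ is the eventual ceiling, $r$ sets how steep the growth phase is and $t_0$ marks the midpoint. The three modes in Figure 1 map directly onto this curve: the flat early tail (induction), the steep middle (payback) and the plateau as $P(t)$ approaches $K$ (obsolescence).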
When considering TBDD risk, it can be broadly split into two categories. “Validation risk” reflects the uncertainty that modulating a target will produce a meaningful therapeutic effect. “Technical risk” relates to the practical challenge of finding a molecule that can safely reach, engage and modulate the specific target in humans.
Both risks can derail pipelines, even late in development. Dr. Rachael Dickman, lecturer in drug discovery at University College London, highlighted some of the key issues she faces: “Challenges in peptide discovery in my experience are mostly regarding formulation/delivery and stability. With the use of peptide library technologies (e.g., phage display and mRNA display), it is now possible to find ligands with high affinity (low nM or pM) for a desired target much more quickly, but whether the identified peptides are sufficiently ‘drug-like’ to reach the target tissue or have long enough half-life to be therapeutically useful is still challenging.”
Robust validation is essential to “stress-test” a target before significant time and money are invested in advancing a compound.
She noted that “use of tool compounds to demonstrate that target modulation has the desired effect on relevant biomarkers and pharmacology, both in vitro and in vivo, gives additional confidence that the target is likely to be druggable.”
Her lab is developing peptide drugs for infections and other diseases, particularly those driven by antibiotic resistance. To do this, they synthesize complex cyclic peptides and design analogues via solid-phase chemistry, then explore their structures and mechanisms using NMR spectroscopy and a range of biophysical tools.
Emerging tools and modalities in target-based drug discovery
New tools and approaches are now emerging with the potential to reshape the field. For example, in peptide discovery, advances in library technologies and rational chemical modifications are helping researchers overcome stability and tissue-delivery challenges.
Meanwhile, computational methods enable rapid in silico screening and optimization of candidate molecules, reducing both time and cost.
Dr. Avner Schlessinger, professor of pharmacological sciences at the Icahn School of Medicine at Mount Sinai, leads a lab focused on streamlining drug discovery through computational chemistry and AI to study disease pathways.
He explained the transformative power of AI: “Traditional medicinal chemistry explores chemical space through manual, iterative optimization ‒ a process that is often slow and heavily dependent on human intuition.”
“AI dramatically accelerates and expands this process, particularly when combined with expert-driven insight. In our center, we routinely use deep-learning-based methods to address biological problems, such as in molecular docking and active learning, where we virtually screen libraries with billions of synthetically accessible compounds, guided by AI models that improve with each iteration.”
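The active-learning loop Schlessinger describes — screen a small batch, refit a model on the results, then prioritize the next batch — can be illustrated with a deliberately toy sketch. Everything below is invented for illustration: a one-dimensional “descriptor” stands in for a compound library, a simple quadratic function stands in for the docking score or assay, and the surrogate is a least-squares quadratic fit. It is a sketch of the loop’s logic, not an implementation of any real screening pipeline.

```python
import random

random.seed(0)

# Toy "library": each compound is a single descriptor value in [0, 1].
library = [i / 1000 for i in range(1000)]

def assay(x):
    # Hypothetical oracle standing in for docking or a wet-lab screen;
    # the "best" compound sits at descriptor 0.62 (purely illustrative).
    return -(x - 0.62) ** 2

def fit_quadratic(points):
    """Least-squares fit of y = a + b*x + c*x^2 via the normal equations."""
    sx = [sum(x ** k for x, _ in points) for k in range(5)]
    sy = [sum(y * x ** k for x, y in points) for k in range(3)]
    # 3x3 normal-equation system, solved by Gauss-Jordan elimination.
    m = [[sx[0], sx[1], sx[2], sy[0]],
         [sx[1], sx[2], sx[3], sy[1]],
         [sx[2], sx[3], sx[4], sy[2]]]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[piv] = m[piv], m[i]
        for r in range(3):
            if r != i:
                f = m[r][i] / m[i][i]
                m[r] = [a - f * b for a, b in zip(m[r], m[i])]
    a, b, c = (m[i][3] / m[i][i] for i in range(3))
    return lambda x: a + b * x + c * x ** 2

# Active-learning loop: assay a small random batch, refit the surrogate,
# then assay the top-predicted untested compounds and repeat.
scored = [(x, assay(x)) for x in random.sample(library, 5)]
untested = sorted(set(library) - {x for x, _ in scored})
for _ in range(3):
    model = fit_quadratic(scored)
    untested.sort(key=model, reverse=True)
    batch, untested = untested[:10], untested[10:]
    scored += [(x, assay(x)) for x in batch]

best_x, best_y = max(scored, key=lambda xy: xy[1])
print(f"best compound descriptor: {best_x:.3f}")
```

Because the surrogate improves with each batch of results, the loop homes in on the optimum after assaying only a few dozen of the 1,000 “compounds” — the same economy that, at the scale of billions of synthetically accessible molecules, makes AI-guided screening attractive.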
By integrating patient-derived multiomics datasets, network-based AI models and structure-based design, his team can now identify disease-relevant targets that are both genetically implicated and druggable: “Using these approaches, we have identified novel targets for various central nervous system disorders and cancer, and have applied AI-driven drug discovery and structural biology to develop promising tool compounds.”
“Since our focus is on the preclinical stage rather than advancing compounds directly into the clinic, we prioritize the development of high-quality tool compounds that rigorously validate a target’s function in disease-relevant models. This helps de-risk the biology early and supports downstream therapeutic investment,” he added.
“AI enables fast, iterative feedback loops between target identification, molecular modeling, and experimental validation. This dramatically accelerates the timeline from target discovery to hit optimization, which was previously a slow and resource-intensive process,” Schlessinger said.
While this approach can increase confidence and help identify the most promising targets, it’s worth noting that it does not fully overcome the inherent limitations of a single-target approach.
It’s important to keep the complexity of disease front of mind: multiple targets and factors often influence a drug’s efficacy and safety. Improving the success of TBDD requires leaving the single-target mindset behind ‒ it isn’t helpful to become too fixated on one gene or protein.
Approaching the next S-curve: rethinking productivity in drug discovery
Using Brown’s S-curve framework, we can reflect on productivity and look to predict the possible next phase of drug discovery. The initial surge in performance following the switch from phenotypic to target-based approaches eventually plateaued, partly because the field tried adopting the new technology before it had fully matured.
“Target-based strategies started to become feasible at scale in the 1980s with the invention of fast protein liquid chromatography, which greatly accelerated the isolation of individual biochemicals at purity levels suitable for testing drug molecules (Early Adopters). Then, it accelerated in the 1990s (Early Majority), when genomic data began to appear. The obsolescence phase has been with us since the mid-to-late 2000s. We needed new approaches by that time, but we’ve had to wait until now,” explained Brown.
As a result, there’s been a gap of almost a generation between the second and third S-curves (Figure 2).
The timing of S-curves matters just as much as their shape: “The key point is that any industry needs a new S-curve to kick in before or at the time an old one begins to taper off at the top of the S, with diminishing productivity (Obsolescence Mode). If there’s a gap between one S-curve and the next, an industry is in trouble; its productivity languishes. Unfortunately, that has happened in the pharmaceutical/biotech industry.”
Figure 2, which shows the three waves of drug discovery productivity, illustrates this dynamic. Brown has witnessed all three phases: “During my career of over 50 years inventing new medicines, I have seen three waves: two completed S-curves and a third underway – it’s now at the end of the flat induction phase and beginning to pay off. But there has been a gap between the second and third waves. Just as the first wave ran its course by the late 1980s, the second plateaued by about 2010. We needed a new third wave to be in payback mode, but it was actually only partway through the bottom line of the third S-curve, between the ‘Innovators’ and ‘Early Adopters’ phases.”
Brown now believes the field is moving beyond the early adoption phase of the third S-curve (Figure 2, final green data point) and entering the growth/payback phase. In doing so, we’ll be able to compensate for the plateau and decline (second orange data point) at the end of the second S-curve. This transition also brings a shift in methodology.
Brown thinks the coexistence and reintegration of multiple drug-discovery approaches is key to this new wave: “The first wave of phenotypic drug discovery was largely replaced by the second wave of TBDD and now a third wave of AI-based multiomics methods is coming into play. Actually, I think all three approaches have their uses if used appropriately.”

Figure 2: Successive waves of drug discovery productivity. Adapted with permission from David Brown. Credit: Technology Networks.
Both Dickman and Schlessinger recognize that drug discovery is entering a phase of major change. Dickman noted, “I think new modalities are especially important ‒ as we try to develop therapies for increasingly complex diseases (e.g., neurodegenerative diseases, cancers) new modalities will be needed to effectively modulate targets.”
“AI is driving a paradigm shift in drug discovery by enabling systems-level targeting and rapid iteration. It’s now possible to identify not just individual targets but context-specific vulnerabilities such as those unique to particular disease states, cellular environments, or genetically defined patient subpopulations,” said Schlessinger.
Brown believes AI/machine learning using multiomics data is key: “This is our best chance to improve on the awful record of recent decades, in which the majority of preclinical and clinical projects have failed after years of work and huge amounts of money spent, and often it’s because of the single target that was selected in the very first step of the drug discovery process.”
Collaboration across academia, startups and industry
As the field works to overcome long-standing bottlenecks in TBDD and push toward the payback phase of the next S-curve, collaboration becomes even more important.
Academic researchers tend to have the freedom to take on higher-risk projects and explore novel mechanisms, whereas industry professionals bring expertise in clinical development and experience navigating regulatory requirements to produce medicines at scale.
However, a key driver of current innovation is techbio startups. These companies, which are often funded by venture capital at a scale beyond academic labs, are at the forefront of developing new tools, modalities and approaches in drug discovery. They bridge the gap between early-stage research and later-stage industry development.
Dickman highlighted the value of this synergy: “From an academic perspective, collaborating with industry in drug discovery is hugely valuable, not only because of the expertise across drug development, but also because it helps focus research efforts, e.g., towards something which is potentially commercially viable and therefore may eventually be of benefit to patients.”
Schlessinger added that these partnerships are strengthened through training pathways: “Many of our PhD students and postdocs gain experience in translational AI and drug discovery through internships with startups or industry, creating a long-term, bidirectional flow of talent and knowledge between sectors.”
Conclusion
Advances in chemical biology, structural methods, multiomics and AI-driven modeling are reshaping how researchers identify and validate targets. New therapeutic modalities, particularly peptides and targeted protein degradation, are expanding what counts as “druggable”.
At the same time, there now seems to be more emphasis on building stronger biological evidence before design begins, helping to reduce avoidable failures later in development.
Hopefully, these tools and lessons will translate into a more consistent flow of effective new drugs, particularly for complex diseases.
“Used together, these approaches [AI and multiomics] provide an alternative approach to drug discovery that bridges chemistry and biology, removes human bias and lets the data tell us what should work. This is the future, the third wave, the third S-curve,” concluded Brown.