

The Next 10 Years: What’s Coming for Gene Editing in the Clinic

Image: Gene edited DNA helix. Credit: iStock

With multiple biotech companies racing to bring their sickle cell gene editing treatments across the US Food and Drug Administration (FDA) approval finish line, the stage is set for 2023 to be a milestone year for genetic medicine. Sickle cell anemia is a rare disease with limited treatment options. A new gene editing treatment, once approved, will become the first marketed therapy based on CRISPR-Cas9 technology. It’s a special moment for the life sciences: a highly anticipated accomplishment for CRISPR as well as for an often-neglected disease.

Last summer marked the 10th anniversary of Jennifer Doudna and Emmanuelle Charpentier’s groundbreaking paper in which they first described CRISPR-Cas9. This technique, which ultimately earned them the 2020 Nobel Prize in Chemistry, gives scientists the ability to edit DNA with greater precision than ever thought possible.

Now, a decade later, numerous gene editing techniques and therapies for diseases beyond sickle cell are being evaluated in clinical trials. The next 10 years will likely bring gene editing to market for an array of rare, debilitating conditions. However, making the potential benefits of gene editing broadly available won’t be without roadblocks.

Here’s what the industry will have to contend with:

Overcoming safety hurdles

The industry has been cautious, taking safety concerns seriously, and the next decade will likely bring significant progress in overcoming challenges related to off-target effects, on-target effects and gene delivery.

As therapy developers continue to improve the effectiveness of gene editing, with many techniques now achieving upwards of 90% efficiency at targeting specific genes, they are also looking closely at the risk of potential off-target effects and how to prevent them. Double-strand breaks, which can occur with various technologies including CRISPR, TALENs and zinc-finger nucleases, can trigger different responses in different cells. Not only is the industry learning how to better predict these breaks, but it is also learning that the on-target resolution of these breaks can have many outcomes. For example, the normal repair process for a double-strand break is to stitch the strands back together. However, we’ve learned that these repairs can sometimes go badly wrong, deleting hundreds to thousands of bases. To mitigate this risk, the industry will increasingly turn to tools like droplet digital PCR (ddPCR) to monitor the loss of allele-specific SNPs and long-read sequencing to evaluate the target site more closely. Base and prime editors may also be key, as they carry a much smaller chance of making a double-strand cut.
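To illustrate how a ddPCR readout can flag the loss of an allele-specific SNP, here is a minimal sketch using the standard Poisson correction applied to droplet counts. The function names and droplet numbers are illustrative assumptions, not values from any specific assay or instrument.

```python
import math

def copies_per_droplet(positive: int, total: int) -> float:
    """Poisson-corrected estimate of target copies per droplet,
    derived from the fraction of droplets that stayed negative."""
    negative_fraction = (total - positive) / total
    return -math.log(negative_fraction)

def allele_ratio(pos_snp: int, pos_reference: int, total: int) -> float:
    """Concentration of an allele-specific SNP assay relative to a
    reference assay on the same sample. A ratio well below 1.0 can
    indicate that large on-target deletions removed the SNP from
    many alleles."""
    return copies_per_droplet(pos_snp, total) / copies_per_droplet(pos_reference, total)

# Illustrative counts: the SNP assay lights up far fewer droplets
# than the reference assay, hinting at allele loss at the edit site.
ratio = allele_ratio(pos_snp=1000, pos_reference=2000, total=20000)
print(f"SNP/reference ratio: {ratio:.2f}")
```

In practice, such a drop in the SNP-to-reference ratio would prompt a closer look at the locus, for example with long-read sequencing, as the paragraph above describes.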

To address safety concerns related to viral vectors, researchers are pursuing alternative delivery methods for gene editing. Electroporation, a transfection method that uses electrical pulses to create temporary pores in cell membranes so other materials can pass through, is growing in popularity. Lipid nanoparticles (LNPs), which are key to mRNA-based therapeutics, are also now highly sought after. In addition, approaches that replace high-multiplicity-of-infection viral donors such as AAV with non-viral DNA donors are in development.

Navigating the regulatory process

Amidst the excitement surrounding gene editing breakthroughs for sickle cell, we can’t lose sight of the barriers impacting the development and approval of other rare disease drugs. Many rare diseases have genetic cures waiting in the wings, but these therapies aren’t receiving approval and making their way to patients. Why? As the National Institutes of Health (NIH) put it: “For many rare diseases, the limiting factor for developing a gene therapy is not scientific knowledge, but rather operational and financial hurdles.” Put simply, with such small patient populations, it’s often not financially viable or even possible to put these treatments through clinical trials. The solution, which both public and private entities are already working toward, is standardizing development to simplify the regulatory process.

Multiple NIH initiatives are underway to reach across the aisle and work with both industry and patient advocacy groups to streamline gene therapy development. Launched in October 2021, the PaVe-GT program is using the same delivery system and manufacturing methods to develop gene therapy treatments for four rare diseases. This effort has been broadened with the launch of the AMP Bespoke Gene Therapy Consortium which includes many industry leaders, including Thermo Fisher Scientific, and non-profit organizations working across a greater array of diseases. The goal of both programs is to develop and share a clear “playbook” so that these treatments can reach patients faster.

For example, standardization could streamline the front-end workflow to use a universal edit site, so that off-target effects are already addressed, and the regulatory process can focus on what is inserted.

Pushing for greater access

As is to be expected when new, exciting treatments are on the horizon, improving patient access will be a top priority over the next few years. Unsurprisingly, one of the biggest hurdles patients face is cost, although the lifetime cost of living with a rare disease can exceed even the price of a one-time therapy. In the case of sickle cell, Vertex Pharmaceuticals’ and CRISPR Therapeutics’ gene editing therapy would reportedly be cost-effective if priced at up to $1.9 million. Still, over the next decade, the industry will need to consider how to reduce prices to reach more patients.
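To see why a seven-figure one-time price can still be judged cost-effective, here is a back-of-the-envelope sketch comparing it with the discounted lifetime cost of ongoing care. The annual cost, time horizon and discount rate are illustrative assumptions, not figures from the Vertex/CRISPR Therapeutics analysis.

```python
def discounted_lifetime_cost(annual_cost: float, years: int,
                             discount_rate: float = 0.03) -> float:
    """Present value of a recurring annual cost, discounted at a
    fixed rate. All inputs are illustrative assumptions."""
    return sum(annual_cost / (1 + discount_rate) ** t for t in range(years))

# Illustrative: $85,000/year of standard-of-care costs over 40 years,
# discounted at 3% per year.
lifetime = discounted_lifetime_cost(annual_cost=85_000, years=40)
print(f"Discounted lifetime cost: ${lifetime:,.0f}")
```

Under these assumed numbers, the discounted lifetime cost of care lands around $2 million, which is the kind of comparison health economists make when judging whether a one-time therapy price is cost-effective.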

One potential solution is the continued development of allogeneic workflows, which use donor cells to develop treatments for multiple patients, versus autologous workflows, which use a patient’s own cells to develop a personalized therapy. With each patient serving as their own donor, autologous therapies are very difficult to make affordable. What would an allogeneic workflow look like comparatively? It could be possible to make changes in cells, store and freeze them, then ship them out to reach patients at various hospitals where they can be infused. Another potential path would be the direct delivery of gene editing to the body, where no external cell processing is done.

Where we go from here

Over the next decade, there’s no doubt that advances in science will continue to improve human health in incredible ways. As we enter the golden age of biology, the industry can evaluate how we can reduce the cost of new therapies to increase affordability and accessibility while still generating a return for innovators.

When it comes to gene editing, the next 10 years will bring new therapy approvals, many clinical trials and lessons learned across the industry. Hopefully, the next decade will also bring greater access as we all work to scale these next-generation solutions.

About the author:

Since graduating from Cornell, Jason Potter has spent 20 years working in biotech R&D. He has led research focusing on synthetic genes and the development of enzymes, including SuperScript III and other reverse transcriptases and polymerases, at Invitrogen/Life Technologies. He now leads the genome editing R&D team at Thermo Fisher Scientific in Carlsbad, California. His group is focused on developing and improving tools for genome editing using the TAL and CRISPR technologies.