

Unlocking Deep Proteomic Analysis for Population-Scale Studies


Read time: 6 minutes

At this year’s American Society for Mass Spectrometry (ASMS) conference in Baltimore, Seer unveiled its latest advancement – the new Proteograph® Product Suite featuring the Proteograph ONE Assay and SP200 Automation Instrument.


Seer believes the new technologies could revolutionize population-scale proteomics studies and accelerate discoveries across drug development, diagnostics and personalized medicine.


At ASMS, Technology Networks had the pleasure of interviewing David Horn, president and CFO of Seer. In this insightful interview, Horn shared the vision behind the Proteograph Product Suite and how it can address longstanding challenges in throughput and sensitivity while also democratizing access to deep proteomic analysis.


We also explored real-world applications, including a landmark 20,000-sample cancer proteomics study led by Korea University, with Horn offering his perspective on the future prospects for mass spectrometry (MS)-based proteomics over the next five years.

Molly Coddington (MC):

For readers who are unfamiliar with Seer’s product lines – what is the company’s goal in the proteomics space and how is it achieving this with its products?


David Horn (DH):

Our goal has always been to fundamentally change the arc of proteomics by commercializing technologies that unlock the complexity of the proteome, enabling new biologic insights that will improve health.


Our Proteograph Product Suite, including our new Proteograph ONE workflow that we launched last week, allows our customers to analyze the proteome in an unbiased manner at a scale, speed and depth that was previously not possible. By enabling population-scale studies – as demonstrated by our announcement of the 20,000-sample cancer study being conducted by Korea University – it lets researchers uncover deep, unbiased proteomic insights at the peptide level, catalyzing advances in drug discovery, diagnostics, personalized medicine and basic science.


We pioneered the ability to do deep, unbiased proteomics at scale when we launched the Proteograph Product Suite four years ago, and since then we have continued to be the leaders in furthering the advancement of the technology and becoming the trusted partner of leading commercial and academic institutions worldwide.


The technology behind the Proteograph Product Suite is based on decades of research in nanoparticles, the nano-bio interface, machine learning and systems biology. This expertise has allowed us to deliver a solution that meets the needs of today’s biological researchers, who expect to be able to explore the proteome with more depth, precision and scale, and want to move beyond the information provided by targeted proteomic tools.



MC:

Can you talk about the feedback you received on the Proteograph Product Suite launched in 2021 – how did it help to create the new Proteograph workflow?


DH:

The original Proteograph Product Suite provided a step-change breakthrough for unbiased proteomic analysis, enabling MS-based proteomics with a depth and reproducibility that no one had ever seen, especially in plasma.


From our initial launch of the Proteograph Product Suite in 2021, customers provided feedback for improvements in four primary areas:

  • Increase sensitivity
  • Increase throughput
  • Decrease sample volume requirements
  • Continue to enhance analytical functionality in the Proteograph Analysis Suite (PAS).


With the launch of our Proteograph XT workflow in 2023, we addressed several of these areas of feedback directly. Proteograph XT provided increased sensitivity and throughput, and we have enhanced, and will continue to enhance, functionality in PAS.


While Proteograph XT increased throughput by 2.5x over the previous workflow, the throughput remained below what was needed for true population-scale plasma proteomic analysis. Deep proteomic workflows, although powerful, were typically not feasible for tens of thousands of samples, due to a combination of throughput limitations, the need for multiple mass spec injections per sample and the resulting per-sample cost for this analysis.

The new Proteograph ONE workflow removes these limitations by doubling throughput per run on the SP200 instrument, halving the sample volume requirement and only requiring one MS injection per sample.

In addition, the automated run time of the instrument for the Proteograph ONE assay is less than 5 hours for an 80-sample run, which increases throughput to over 1,000 samples per week from a single SP200 Automation Instrument.
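The throughput figure quoted above can be sanity-checked with simple arithmetic. The sketch below assumes an operating pattern of two sub-5-hour runs per instrument per day, seven days a week – an assumption chosen for illustration, not a schedule stated in the interview:

```python
# Back-of-the-envelope check of the SP200 throughput figures quoted above.
# RUNS_PER_DAY is an assumed operating pattern, not from the interview.
SAMPLES_PER_RUN = 80   # one 80-sample run takes under 5 hours
RUNS_PER_DAY = 2       # two sub-5-hour runs fit comfortably in a day
DAYS_PER_WEEK = 7

samples_per_week = SAMPLES_PER_RUN * RUNS_PER_DAY * DAYS_PER_WEEK
print(samples_per_week)  # 1120 -- consistent with "over 1,000 samples per week"
```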

These workflow efficiencies, combined with other operational and cost efficiencies, including improvements in MS instrumentation, have helped to lower the average cost of analyzing a sample by ~60% since we first introduced the Proteograph Product Suite in 2021. We expect these capabilities to unlock not just the possibility, but also the practicality of true population-scale, deep, unbiased, mass spec-based proteomic research. Our announcement about the 20,000-sample study initiated at Korea University shows that population-scale, unbiased plasma proteomic analysis has become a reality.
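To put the quoted ~60% reduction in concrete terms, the sketch below applies it to a hypothetical 2021 baseline per-sample cost (the interview gives only the relative figure, not absolute prices):

```python
# Illustrative only: BASELINE_COST is a hypothetical 2021 per-sample cost
# in arbitrary units; the interview quotes only the ~60% relative reduction.
BASELINE_COST = 100.0
REDUCTION = 0.60

current_cost = BASELINE_COST * (1 - REDUCTION)
print(current_cost)  # 40.0 -- i.e. roughly 40% of the 2021 per-sample cost
```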



MC:

Are you able to discuss any customer case studies that showcase the potential applications of the new Proteograph workflow?


DH:

The partnership with Korea University, working on a 20,000-plasma-sample study, is the first of its kind: a deep, MS-based population proteomics study of cancer in young adults.


Professor Sang-Won Lee at KU’s Center of Proteogenome Research is leading the study to discover biomarkers that catalyze the development of new diagnostics that could change the way cancer is treated among this group.


Professor Josh Coon (University of Wisconsin-Madison) and Professor Gary Patti (Washington University) also highlighted some compelling preliminary data from their early access experience with the Proteograph ONE assay at Seer’s ASMS breakfast seminar.


Prof. Coon’s presentation highlighted an impressive improvement in protein group IDs using the Proteograph ONE versus a neat digest protocol for monkey samples, demonstrating the versatility of Proteograph workflows to handle human and non-human samples – a unique feature that has been demonstrated repeatedly on prior Proteograph assays and is preserved in this latest iteration.


Prof. Patti noted promising results from a pilot study that implicated the roles of three novel proteins in health and obesity. Of note, none of the three proteins are included on the SomaScan panel used in a parallel arm of the same study.

There are also at least 14 posters and presentations featuring Proteograph data being shared at ASMS this week. We feel like this is just the tip of the iceberg for this type of research and expect more exciting publications and new large-scale studies to be announced soon.


MC:

How does the new product line support the democratization of MS tools in proteomics research?


DH:

Deep MS-based plasma proteomics has historically been limited to researchers with deep, specialized MS expertise, access to the right MS resources and a willingness to invest significant time.


Given these requirements, the universe of researchers was limited to a small group. Given the cumbersome and manual workflow for sample enrichment upstream from the mass spectrometer, even these MS experts were not able to scale their plasma proteomic studies. As we have discussed, the Proteograph Product Suite fundamentally changed unbiased proteomic analysis by removing this bottleneck and making mass spec-based analysis more accessible.


As we start to push up against the limitations of what affinity-based targeted panels can do, there is a renewed recognition of MS as the gold standard for deep, unbiased proteomics and a significant uptick in the interest we are seeing from researchers eager to leverage the technology. Our goal at Seer is to make our technology as easily accessible to as many researchers as possible and we accomplish this in several ways.


For those who are not new to MS, the Proteograph Product Suite represents an end-to-end solution that is detector-agnostic, meaning the sample prep automation, nanoparticle technology and data analytics will wrap around whichever mass spectrometer you already have in your lab – transforming the ability of existing equipment to produce large-scale studies at a depth, efficiency and cost that would have been unimaginable a few years ago.


We offer access to the Proteograph Product Suite through our Centers of Excellence, other certified service providers and our own Seer Technology Access Centers (STACs) in the US and Europe. These options provide a turnkey solution for customers who do not have the time, budget or expertise to bring the Proteograph Product Suite and MS in-house and prefer to simply send in their samples and get data back.


We have noted that the cost of analyzing a sample has been significantly reduced over the past few years, which is good news for everyone. What has not come down in cost is the price of obtaining precious patient samples. Researchers know that this is the most expensive part of any study, and they are not willing to risk those expensive, often irreplaceable samples on unproven technologies. That is why we have been obsessive about developing the Proteograph Product Suite in an incredibly robust manner, and we now have a growing body of third-party evidence demonstrating the quality of the data it produces.



MC:
Solutions such as the new Proteograph Product Suite are helping to address long-standing challenges in MS-based proteomics. If the challenges of scalability, sensitivity and speed can be overcome, what challenges remain?  

DH:

There are a couple of challenges that remain. First, MS instrumentation remains intimidating and inaccessible for many non-MS researchers. To achieve widespread adoption of unbiased proteomics beyond the MS community, MS instruments need to become more approachable and easier to use for proteomic analysis.


In addition, the datasets produced by the Proteograph Product Suite and MS are incredibly rich, providing proteomic information at the peptide level. The need for robust data tools that are intuitive, scalable and provide biological insight, like those provided by our Proteograph Analysis Suite, will only increase as the size and complexity of proteomic studies increase. We need to continue to enhance the data analysis tools and pipelines to allow researchers to easily obtain the biological insight they are seeking from these enormously rich datasets.



MC:
With products such as the new Proteograph workflow available to researchers, what do you think the next five years of MS-based proteomics research could look like?

DH:

Given that Seer has broken down, and will continue to break down, the barriers to conducting unbiased proteomics at scale, the next five years will be an incredible period of discovery and understanding of health and disease.


Think about what happened when genomics began to scale 15 years ago and how far we have come in our understanding of biology in that time. By making unbiased proteomics accessible at a depth, scale, speed and cost that allows for population-scale studies, researchers will unlock the complexity of biology in a manner that has not previously been possible.


I think the next five years will see a step-function change in our understanding of biology. We are only at the tip of the iceberg in terms of our knowledge in this area. What lies ahead, and its implications for the diagnosis and treatment of all types of disease, is extremely exciting. It will be amazing to watch.