
What Is Driving the Most Impactful Change in Proteomics?

Digital visualization of a molecular structure surrounded by data, symbolizing proteomics analysis.
Credit: iStock.

Proteomics is in the middle of a transition: from an exploratory, technology-driven discipline to one increasingly defined by robustness, scale, and real-world impact.


To capture how this moment is being interpreted by leaders shaping the field, Technology Networks asked a group of industry experts the same foundational question: “From your perspective, what technological or methodological advances are driving the most impactful change in proteomics right now?”


Their responses cover a wide range of innovations, from mass spectrometry (MS) and affinity-based platforms to sample preparation, computational analysis and clinical translation, which together are redefining what proteomics can deliver.


Jenny Samskog, PhD, head of product management, Olink Proteomics, part of Thermo Fisher Scientific.

The most significant advances are being driven by a clearer, more quantitative understanding of what is actually measured and how different proteomic technologies correlate with one another. Progress in data standardization and harmonization is crucial, as is the integration of large, heterogeneous datasets through machine learning approaches.


 These developments are enabling more accurate comparisons across studies and platforms, facilitating reproducibility, and allowing researchers to connect proteomic measurements to functional and clinical phenotypes with greater precision.

Katherine Tran, senior global market development and marketing manager, Proteomics, SCIEX.

I’m seeing five major trends shaping proteomics right now:

1. Ultra-sensitive, single-cell-level proteomics is changing the proteomics landscape because it allows the discovery of biological variation that would otherwise be hidden in bulk averages, such as rare cell states and early disease signatures. In addition, ultra-sensitive single-cell proteomics resolves cell-to-cell differences rather than population averages, enabling proteomics to play a role closer to what single-cell RNA-seq has done for transcriptomics.

 

This facilitates better precision and personalized medicine, as single-cell proteomics enables profiling of individual patient cells or microenvironmental niches rather than homogenized samples.

 

2. Advancements in SWATH DIA (data-independent acquisition) continue to expand the depth of proteomic insight from MS analyses: better coverage, better quantitation and reproducibility, and more accurate results, all while requiring less sample and shorter analysis times. These advances are moving proteomics beyond "discovery only" toward routine, robust workflows suited to clinical applications.

 

3. Spatial proteomics and context-aware protein profiling are driving advances because biological function is often spatially dependent. The ability to map protein–protein interactions, post-translational modifications (PTMs) and more to a specific cell, compartment and tissue microenvironment helps us interpret the proteome.

 

4. AI, machine learning and computational proteomics unlock actionable insight from large, complex datasets without demanding strong computational resources from the user. AI can push proteomics forward from identification-based analysis toward a predictive, functional context: what the proteins are doing, how they interact and how they change under perturbation.

 

5. Multiomics integration and proteoform-level complexity enable richer, more biologically meaningful insights. For example, you might see a mutation in a gene or an increased transcript, but unless you see the protein and its PTMs, the functional consequence remains unclear.

 

For translational and clinical research, multiomics plus proteomics gives you a more holistic molecular phenotype, which better supports biomarker discovery and therapeutic stratification. Understanding proteoforms means more precise mechanistic insight: not just that a particular protein is upregulated, but which version of the protein, carrying which PTM(s), is upregulated.

Henrik Everberg, PhD, chief executive officer, ProteomEdge.

The biggest change in proteomics today is the move from exploratory measurements to robust, quantitative, and specific multi-protein analysis. Targeted liquid chromatography-MS combined with stable isotope–labeled standards is gaining more interest as it promises to deliver absolute and reproducible protein quantification, which is essential for clinical and applied use.


Large-scale discovery proteomics remains essential as the first step to identify and prioritize candidate biomarkers, while targeted proteomics is becoming the method of choice for follow-up and future applications in a clinical setting. At the same time, clinical translation introduces new challenges around throughput, automation, and standardization, as future applications will involve very large numbers of samples analyzed in routine settings.

Sameer Vasantgadkar, senior manager, Omics Solutions at Covaris.

I would say that there are multiple levels at which this can be viewed. Everyone talks about the rapid advancements that have been made in the MS space, and all of those do help drive the technology forward significantly, but I think focus is also shifting to the front end: the sample prep. Good-quality samples and good-quality sample prep yield good-quality results.



That is also one of the key driving factors, especially as we go into new domains such as single-cell workflows, or anywhere labs are running multiple workflows or handling multiple sample matrices. In those cases, sample prep plays a big role in providing the versatility most labs need and the flexibility to work with different volumes and different buffer schemes. Having that kind of approach really helps scale up the capabilities of different labs and the impact they can have.

Stephen Williams, PhD, chief scientific officer, Alamar Biosciences.

What we are learning is that proteins act in networks and patterns, and therefore technologies that can precisely measure large numbers of proteins at once (thousands, or many thousands), with sufficient sensitivity and specificity, are leading to a greater understanding of human physiology.

In the past, we imagined, or hoped, that it was going to be one protein at a time, but there's no such thing as a magic individual protein. It's really about networks and patterns, and the problem to date has been the large-scale measurement of such patterns.

The analogy I make is that the protein network patterns are like the internet; you need bandwidth, and you need search engines. The combination of high-plex, sensitive, precise measurement tools with machine learning and AI is going to identify the patterns themselves that can then be translated into products for health care or for clinical trials.

Yuling Luo, chief executive officer and founder, Alamar Biosciences.

I think there are two areas of change.


One area is the expansion of content, and this is what both MS technology and affinity-based technologies are doing. MS is improving to enable more and more proteins to be discovered in a variety of sample types, particularly the plasma proteome. Similarly, affinity-based approaches are increasing the number of proteins they can cover; for example, Olink has moved from 1500–3000 to 5400. Improving the breadth of coverage of the proteome is certainly one of the major trends.


I think this will continue, because people always want to know what's out there.


On the other hand, there is sensitivity improvement, which is what Alamar has been doing to improve protein detection technology. Biomarkers may be present in very low abundance, but many of them have extremely high value. For example, biomarkers for neurodegenerative disease, which due to the blood–brain barrier are in extremely low abundance, demand a technology platform with high sensitivity. For applications such as early disease detection and minimal residual disease, you need technology capable of detecting those early, likely low-abundance, disease markers with sensitivity, robustness and precision.
