Proteomics Up Close: An Interview With Dr Oliver Rinner


Proteomics is the large-scale study of protein function, interaction and localization in health and disease. Major advances have been made within the field in recent years, owing to developments in technologies and research. We recently spoke with Dr Oliver Rinner, CEO at Biognosys, to learn more about the newest developments within the field.


Q: In your opinion, what has been the most exciting technological development in the field of proteomics over the past decade?


A: When I started as a post-doc in Ruedi Aebersold’s lab at the ETH Zurich 13 years ago, most people in the field were convinced that mass spectrometry (MS) was time-consuming and purely qualitative. It involved painstaking manual inspection of spectra, revealing only the identities of proteins and not their abundance. Since then, we’ve seen a paradigm shift in the field of proteomics.

Over the years we’ve seen significant improvements in data analysis algorithms, culminating in the invention of SWATH, or data-independent acquisition (DIA), technology in the Aebersold lab. These techniques finally provide a way to identify and quantify up to 10,000 proteins in a single experiment with high reproducibility, turning proteomics into a highly versatile and quantitative technology with wide-ranging applications from cell signaling and gene expression to structural biology.

Q: Your website states "Biognosys believes that the decoding of the proteome will impact the life sciences more than the genome revolution two decades ago." Why is this so?

A:
Cells and organisms are more than the sum of their genes. Decades of detailed molecular biology tell us that the inner workings of cells are interconnected systems and pathways of proteins that vary in their expression level, interactions and post-translational modifications. However, the advent of high-throughput DNA sequencing has shifted focus from this higher-level functional view to a reductive gene-based perspective. Yet despite the wealth of genomic data that has been acquired, we still understand relatively little about the relationship between genetic makeup (genotype) and biological outcomes (phenotype). The renaissance of phenotypic screens in drug discovery marks a shift back to a more functional proteome-based view that considers protein-protein interactions, drug binding, expression levels and more. All these questions can now be addressed with MS-based proteomics.

Q: What applications of proteomics research do you focus on at Biognosys, and why?

A:
We’re focusing on MS technologies for unbiased identification and quantification of proteins within biological samples, for example by comparing protein expression patterns or post-translational modifications such as phosphorylation between cancer cells and normal tissue or across different tumor types. Our customers are using these tools to reveal changes in proteins and pathways in health and disease, identify new biomarkers, and understand and verify the target and mode of action of novel drugs. Importantly, our methods can be used in virtually all sample types, including blood plasma, cerebrospinal fluid (CSF), cultured cells and fresh-frozen or formalin fixed tissue from any species.

Another application is targeted proteomics, which can detect and quantify a subset of proteins with high specificity and fast assay development time. This is a useful alternative to ELISA tests, which are often limited by the availability of suitable antibodies, or in situations where the target protein may be very similar to another in the cell and can’t be distinguished immunologically. Targeted proteomics assays can also be multiplexed, allowing the quantification of more than 100 proteins in parallel without significant reduction in throughput.

Our newest application, limited proteolysis (LiP) technology, enables researchers to identify and validate small molecule binding targets and off-target effects within whole cells, providing valuable insights for drug discovery. Rather than measuring changes in protein abundance, LiP reveals changes in protein structures and surfaces that occur when compounds bind. Not only does this give a readout of all potential targets in the tissue of interest – whether desirable or not – it also provides information about the mechanism of action. Right now, we’re offering LiP for drug target identification and validation, but there are many more possibilities in the structural proteomics space, answering a whole new class of research questions related to protein structure and molecular interactions.

Q: What challenges exist in the study of the proteome, and how do you seek to overcome these challenges? 

A:
Unlike DNA-based methods such as PCR, MS proteomics does not rely on amplification – the number of peptide fragments hitting the detector is directly proportional to the amount of that protein in the starting sample. Although this reduces the risk of introducing artefacts and contamination, it also reduces sensitivity. Given the huge dynamic range of protein levels over several orders of magnitude, especially in plasma, less abundant proteins make up a minuscule fraction of the protein mass that is injected into the spectrometer and are less likely to be detected. More sensitive instruments and better chromatography have made a big difference here. Five years ago, we could only capture around 2,000 proteins in a tissue. Today we can detect 10,000, representing around 70% of all expressed proteins in a tissue, and we expect further gains in the future.

The other challenge is throughput. Because the proteome is so complex, samples are fractionated by time-consuming chromatography prior to MS analysis. In the past, researchers have used ever-longer columns and longer separation times in the quest to push sensitivity to a maximum. However, the development of parallel next-generation proteomics technology means that we can significantly cut down on analysis time because the technology can deal with higher sample complexity, allowing many more experiments per day. We recently published results from the largest ever discovery study in plasma, measuring more than 1,500 samples with an optimized set-up, and we expect that running tens of thousands of samples will soon become routine.

Q: In recent years, the proteomics field has witnessed major advancements both technologically and in the applications of the research to the clinic. What do you envision the next decade will look like for the field? 

A:
I’m excited to see the field moving towards real measures of protein expression, rather than proxy measures such as genomics or transcriptomics. Similar to the rapid rise of RNA-Seq, I expect the use of proteomics to explode now that it is so quick and easy to obtain reproducible, quantified protein expression data. However, this still requires growing awareness and familiarity with proteomics data, which has been perceived as difficult to interpret in the past.

We expect to see uptake of proteomics across the whole spectrum of biomedical science, from basic research and development to clinical studies. Our recent collaboration with Roche, presented at the 2019 MSACL meeting, showed that we could successfully identify and quantify more than 9,000 proteins in 30 bowel cancer samples with high reproducibility. This set the stage for the use of next-generation proteomics in the regulated clinical trial environment. I am also convinced that within the coming decade, proteomics will be used as a clinical diagnostic tool, initially focused on a limited number of biomarkers but later expanding to a more general method in conjunction with developments in artificial intelligence and machine learning.

From the technical side, I believe that our LiP technology marks the beginning of proteomics as a powerful new tool for structural biology. The ability to analyze changes in protein structure on a global level may lead to a new view of how proteins interact in cells, providing a more informative view of pathway activity and dysregulation compared with phosphorylation changes – a concept proposed by Paola Picotti at the ETH Zurich. With the improvements in depth and resolution that we expect to achieve with LiP in the near future, I am hopeful that these new analytical technologies will drive yet another paradigm shift in the field of proteomics.

Dr Oliver Rinner was speaking to Tiffany Quinn, Custom Content Coordinator for Technology Networks.
