5 Key Challenges in Proteomics, As Told by the Experts

The Evolution of Proteomics featured a series of interviews with experts in the field of proteomics research. The field has witnessed major advances in recent years, owing to the development of highly complex technology platforms such as mass spectrometry (MS) and bioinformatics tools.

Of course, when a scientific area moves so far in such a short space of time, it encounters challenges that must be overcome to prevent the field from reaching a standstill. We asked each of the experts what, in their opinion, are the greatest challenges currently facing proteomics, and how we can look to overcome them. Here are five of the key challenges they identified:

1. Pushing for high-throughput and commercialization 

"One of the trends that is occurring in the field is people trying to come up with ways to be more efficient and more high-throughput. One of the complaints from funding agencies is that you can sequence literally thousands of genomes very quickly, but you can't do the same in proteomics. There's a push to try to increase the through-put of proteomics so that we are more compatible with genomics. One of the real exciting things in my opinion is the move of proteomics to single cell. People are finally making progress on cells that are biologically relevant, not just those that are packed with a few proteins such as red blood cells. That's going to be a great area.


One of the things that we are dependent upon in the MS field is for instrument manufacturers to keep advancing the technology. Some of the fundamental research in MS takes place in academia, but in order to make that technology useful it must be commercialized and advanced with the quality control and standards that commercialization brings to the instruments. It's always exciting when you go to ASMS (the annual American Society for Mass Spectrometry conference) to see what instruments or technologies are going to be introduced by the manufacturers."

– Professor John Yates

2. Mastering a technically demanding field

"For a long time, MS-based proteomic analyses were technically demanding at various levels, including sample processing, separation science, MS and the analysis of the spectra with respect to sequence, abundance and modification-states of peptides and proteins and false discovery rate (FDR) considerations.  I think we are in or approaching the exciting state where these challenges are reasonably well, if not completely, resolved. When we get there, we will be able to more strongly focus on creating interesting new biological or clinical research questions and experimental design, and to tackle the highly fascinating question discussed above, how we best generate new biological knowledge from the available data. Personally, I am convinced that we will be most successful in this regard if we generate high quality, highly reproducible data across large numbers of replicates and it seems that at this time proteomics is essentially at a point to achieve this."   

– Professor Ruedi Aebersold
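
Aebersold's mention of false discovery rate (FDR) considerations refers to how confidently peptide-spectrum matches can be reported. The standard approach in the field is target-decoy searching: spectra are matched against both the real (target) protein database and a reversed or shuffled (decoy) database, and the number of decoy hits passing a score threshold estimates the number of false target hits. Below is a minimal, self-contained sketch of that estimation; the score distributions are synthetic placeholders for illustration, not output from any real search engine.

```python
# Minimal sketch of target-decoy FDR estimation for peptide-spectrum matches.
# The score distributions below are synthetic illustrations, not real data.
import numpy as np

rng = np.random.default_rng(0)
target_scores = rng.normal(loc=3.0, scale=1.0, size=1000)  # true + false target hits
decoy_scores = rng.normal(loc=1.0, scale=1.0, size=1000)   # known-false decoy hits

def estimated_fdr(threshold):
    """Decoys passing the threshold estimate the false matches among targets."""
    n_decoy = int((decoy_scores >= threshold).sum())
    n_target = int((target_scores >= threshold).sum())
    return n_decoy / max(n_target, 1)

# Scan thresholds and report the lowest one with an estimated FDR <= 1%.
for t in np.arange(0.0, 6.0, 0.05):
    if estimated_fdr(t) <= 0.01:
        print(f"score threshold {t:.2f} gives estimated FDR {estimated_fdr(t):.4f}")
        break
```

In practice, search engines and post-processors apply more refined versions of this competition (for example, converting the estimates to q-values), but the counting logic above is the core of the idea.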

3. Finding a "moon-shot" project

"The field itself hasn't yet identified or grabbed onto a specific "moon-shot" project. For example, there will be no equivalent to the human genome project, the proteomics field just doesn't have that. The "human proteome" is a constantly fluctuating information archive. Every cell type has its own unique proteome – it depends on what the function of the cell is, and it depends on what point in time you're measuring the protein content of the cell. Projects such as the Human Genome Project attracted a lot of PR and money investments for genomics, and so it is a shame that proteomics will not have an equivalent "moon-shot" project."

– Professor Emanuel Petricoin

4. Democratizing MS

"As MS developed from a cottage industry of the 1980s and 1990s into a modern industry like aviation in 2000s-2010s, each new development required larger and larger research and development teams to match the increasing complexity of instruments and the skyrocketing importance of software at all levels, from firmware to application. All this extends the cycle time of each innovation and also forces us to concentrate on solutions that address the most pressing needs of the scientific community. 
In parallel, the increasing democratization of MS brings with it new requirements for instruments, such as far greater robustness and ease-of-use, which need to be balanced against some aspects of performance."  

– Professor Alexander Makarov

5. Dealing with sparse data

"The major two challenges are that the data is very sparse, and that we have troubles measuring low abundance proteins. So, every time we take a measurement, we sample different parts of the proteome or phosphoproteome and we are usually missing low abundance players that are often the most important ones, such as transcription factors. 
In my group, one approach to mitigate this issue is to map the identified peptides on protein interaction networks and diffuse the signal on this network. This reduces the noise from spuriously identified proteins and enhances the functional signal. It also allows us to observe regions of the network that are highlighted by the different datasets and compare and study these, instead of trying to compare the sparse datasets between each other.  

However, with the advances in MS technologies developed by many companies and groups around the world, including the Mann group at the University of Copenhagen and the Aebersold group, as well as other emerging technologies that promise to allow ‘sequencing’ of proteomes, analogous to genomes, developed by the Marcotte group and colleagues, I expect that these will not be issues for very long."

– Dr Evangelia Petsalaki
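
To make the network-diffusion idea Petsalaki describes more concrete, here is a minimal sketch using random walk with restart, one common formulation of network propagation. The toy interaction network, protein names, edge list and starting scores are all hypothetical illustrations, not her group's actual pipeline or data.

```python
# Minimal sketch of network propagation ("diffusion") of sparse proteomic
# measurements over a protein interaction network, via random walk with
# restart. The network, protein names and scores are hypothetical.
import numpy as np

proteins = ["EGFR", "GRB2", "SOS1", "KRAS", "TP53"]  # illustrative nodes
edges = [("EGFR", "GRB2"), ("GRB2", "SOS1"),
         ("SOS1", "KRAS"), ("KRAS", "TP53")]
idx = {p: i for i, p in enumerate(proteins)}

# Build the adjacency matrix and column-normalize it into a transition matrix.
A = np.zeros((len(proteins), len(proteins)))
for a, b in edges:
    A[idx[a], idx[b]] = A[idx[b], idx[a]] = 1.0
W = A / A.sum(axis=0, keepdims=True)

# Sparse observation: only two proteins were quantified in this "experiment".
p0 = np.zeros(len(proteins))
p0[idx["EGFR"]] = 1.0
p0[idx["KRAS"]] = 0.5
p0 /= p0.sum()

# Random walk with restart: the restart term keeps the diffused signal
# anchored to the observed proteins while spreading it to their neighbors.
restart = 0.3
p = p0.copy()
for _ in range(200):
    p_next = (1 - restart) * (W @ p) + restart * p0
    if np.abs(p_next - p).max() < 1e-10:
        break
    p = p_next

# Unobserved but well-connected proteins (e.g. GRB2, SOS1) now carry signal.
for prot, score in sorted(zip(proteins, p), key=lambda kv: -kv[1]):
    print(f"{prot}: {score:.3f}")
```

The restart probability controls the trade-off: higher values keep the diffused scores anchored to the measured proteins, while lower values smooth the signal more broadly across the network.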