Egg on Our Face: Another Asilomar Moment?

Credit: iStock.

The following article is an opinion piece written by Michael S Kinch. The views and opinions expressed in this article are those of the author and do not necessarily reflect the official position of Technology Networks.


Nearly two decades ago, I started teaching a course on biomedical communication at Johns Hopkins University. I began each semester with a simple question: is eating eggs good or bad for you? This innocuous question nearly always triggered a raucous debate, pitting an anti-cholesterol faction against those advocating a protein-rich diet. To fuel the dispute, I projected a slide full of contemporaneous headlines, including some from the American Heart Association, which supported one side, the other and sometimes both.

 

After a half-hour of robust conversation, the class would invariably conclude that the scientific community simply did not know what it was talking about. In a nod to Einstein’s Parable of Quantum Insanity, I have been asking the same question for 20 years and getting the same result each time.

 

I recently read, with concern, a story that has been circulating within the scientific community: many perceive the conduct it describes as reckless, while others find it equally worrying for the Draconian backlash it could provoke, one that might stifle open scientific investigation.

 

In a nutshell, investigators at Boston University engineered mutant forms of SARS-CoV-2 and tested their virulence in mice. These studies were apparently conducted safely, though it soon transpired that the investigators had failed to seek permission or guidance from the National Institute of Allergy and Infectious Diseases (NIAID) before launching their investigation. This behavior, which some regard as rogue, raised concerns about the potential for a lab leak (exacerbated by the ongoing debate about the origins of the virus) and the creation of a potential Andromeda strain.

 

While the finger-pointing continues, the subject is hardly new: similar questions and concerns have arisen from earlier studies of Spanish influenza, smallpox and a host of other infectious pathogens.

 

The usual arguments about the need to redraw the ethical and practical boundaries of scientific investigation have been renewed. These urgings draw adamant responses, such as that of the University of Saskatchewan’s Dr. Angela Rasmussen, who proclaimed, “I’m very tired of people suggesting that virologists and NIAID are reckless or don’t care about biosafety.”

 

Yet the public is arguably more prone to this perception than it has been in decades.

Positive perception does not hold sway in today’s public opinion

As a chronicler of the discovery of new medicines and vaccines over the past two centuries, I tried to place this incident in historical context. My research on these subjects has consistently revealed deep cycles in public perceptions of science and scientists.

 

As a reader of this publication, the odds are that you are positively inclined towards and well-versed in the promises of science and technology. You might even have been inspired by the possibilities of science as proclaimed during peak periods, such as the International Geophysical Year (IGY), which began in July 1957 and included the launch of Sputnik (and the resultant Space Race), or the dawning of the Age of the Internet as technology began to change everyday life.

 

Despite the many achievements made possible by science and technology, a positive perception largely does not hold sway in today’s public opinion. Looking to an earlier age, the enthusiasm surrounding the IGY likewise quickly faded. Within a decade, exuberance had given way to a primal fear of science and technology, as evidenced by the widespread popularity of dystopic predictions in Alvin Toffler’s Future Shock and in Michael Crichton’s work, which included the aforementioned Andromeda Strain and Jurassic Park (the latter of which ironically made the leap from print to mass entertainment in the 1990s as a consequence of technological advances).

 

The early-1970s nadir in the public perception of science was not merely esoteric. Fear of technology delayed and nearly scuppered recombinant DNA research. Long forgotten are the public pressures that led to a halt in gene-based research. Fortunately, the halt was only temporary: at the 1975 Asilomar Conference, scientists elected to police themselves rather than have regulation imposed upon them. The advances that followed propelled new improvements in medicines, including our ability to develop safe, effective and timely vaccines for COVID-19.

Responsible in conduct

A casual reading of the current headlines suggests a nadir in the popularity of science that resembles the early 1970s. Stories abound about skepticism towards climate change and vaccines, among other technological fears. While some concerns are certainly reasonable, today’s dystopian interpreters have technology on their side via forums that can amplify mis- and disinformation and are not subject to editorial scrutiny. Rumors can run rampant in nanoseconds and be rendered resistant to fact-based rebuttal just as quickly.

 

Compounding the situation, both responsible and reckless news outlets persistently troll for sensational headlines and are ready to pounce upon self-inflicted wounds; the recent revelation of experiments with engineered SARS-CoV-2 is grist for that mill.

 

Given these perils and the climate in which we find ourselves, it is incumbent upon the scientific community to be particularly responsible in its conduct. Scientists find themselves amidst a minefield of public opinion, seemingly ready to trigger an explosion whichever way they turn. As we did at Asilomar, we scientists need to identify ways to police ourselves and to responsibly manage both scientific investigation and the reporting thereof.

 

Science and reason will ultimately prove to be the losers if the public, like my students at Johns Hopkins, continues to conclude, “they simply don’t know what they are talking about.”