Promoting the Use of AI within the Life Sciences Industry
Article Jan 16, 2018 | by Laura Elizabeth Mason, Science Writer, Technology Networks
Awareness of Artificial Intelligence (AI), Machine Learning (ML) and Natural Language Processing (NLP) is ever-increasing, with AI being one of the most talked-about technologies across multiple industries. But exactly who is using it? And what are they using it for?
The Pistoia Alliance is a not-for-profit organization that aims to promote collaboration between pharmaceutical, technology, and life science professionals to lower barriers to R&D innovation by addressing challenges related to data aggregation and accessibility. One of its current projects focuses on promoting the use of AI in R&D. The Pistoia Alliance recently surveyed 374 life science professionals from Russia, China, the US, and Europe to determine the impact of AI, ML and NLP on the life sciences industry and to what extent these technologies are being adopted.
The survey results highlighted that:
• Almost half of the respondents (44%) were already using or were interested in using AI
• A lack of technical expertise was the most cited barrier to the adoption of AI, ML and NLP
• Current data access, quality, and lack of standardization were also considerable obstacles
“The Pistoia Alliance has launched an AI and Deep Learning community – bringing together interested parties to share their expertise, as the industry continues in its quest for new medicines that make patients’ lives better,” explained Nick Lynch, PhD, Consultant at The Pistoia Alliance, in a recent press release.
We spoke to Nick Lynch, to learn more about the survey results and Pistoia’s future initiatives to further promote the use of these technologies for pharmaceutical R&D.
LM: Survey results showed that a lack of technical expertise was the most cited barrier to the adoption of AI, ML and NLP. What can be done to overcome this barrier, and how can the Pistoia Alliance help?
Nick Lynch: The industry must collaborate to overcome the lack of technical expertise, by pooling knowledge and experience. This means collaborating with peers in science and research, but also in technology companies and academic institutions – who all have a vital role to play in realizing the benefits of AI and reducing the barriers to effective adoption. Our members represent many of the key stakeholders, but we will need to work with a wide group of organizations to support these efforts around AI.
To facilitate this, The Pistoia Alliance has launched an AI and Deep Learning Community. The community will bring those interested in AI together, to share their expertise and AI use-cases. The group will support awareness and adoption of AI by sharing best practice within life science, and will also develop and share best practice around data. This kind of collaborative initiative will help life science organizations to adopt AI more quickly, and avoid duplicating the efforts of other organizations.
LM: How important is it to standardize data format? What impact could this have on academic-industry collaboration and the accessibility and adoption of AI, ML and NLP in general?
NL: Without standardized data (and metadata), collaboration is held back because data in variable formats is not interoperable and cannot be shared between stakeholders. In addition, it takes considerable time and resources to clean, transform and convert data into a usable, sharable format. According to the Institute of Medicine, the use of common data standards in the pharmaceutical industry would reduce the United States’ healthcare administration expenditure alone by 20–30%.1
In R&D, interoperable data will be of far greater value to researchers and will facilitate easier, more productive collaborations. Because of this need for data standards, The Pistoia Alliance took over the stewardship of Elsevier’s Unified Data Model in 2017. The model will be developed and extended with the aim of publishing an open and freely available format for the storage and exchange of drug discovery data, much of which is classed as precompetitive and can be shared.
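The interoperability problem described above can be illustrated with a toy example. In the sketch below, two hypothetical labs report the same assay result in different units and field names; a small normalization step maps both onto one shared schema so the records become directly comparable. The field names and schema are illustrative assumptions, not the actual Unified Data Model.

```python
# Toy sketch: heterogeneous records normalized into a common schema.
# Field names ("ic50_nM", "ic50_uM", "compound") are hypothetical.

def normalize(record):
    """Map a lab-specific record onto one shared schema (IC50 in nM)."""
    if "ic50_nM" in record:            # Lab A already reports nanomolar
        value_nm = record["ic50_nM"]
    elif "ic50_uM" in record:          # Lab B reports micromolar
        value_nm = record["ic50_uM"] * 1000
    else:
        raise ValueError("unknown record format")
    return {"compound": record["compound"], "ic50_nM": float(value_nm)}

lab_a = {"compound": "CPD-001", "ic50_nM": 250}
lab_b = {"compound": "CPD-002", "ic50_uM": 0.5}

# After normalization, both records share one format and one unit.
merged = [normalize(r) for r in (lab_a, lab_b)]
```

Without an agreed standard, every pair of collaborating organizations ends up writing (and maintaining) converters like this for each other's formats; a shared model removes that duplicated effort.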
LM: A considerable number of respondents are not yet using AI, ML, or NLP at all. Why do you think this is, and what could be done to promote use?
NL: The use of these technologies is still in the early stages, in part due to the challenges that our research has found. More widespread collaboration will help to find the areas where the technology will be most useful, whether it’s for drug R&D, or patient stratification. What’s more, collaboration will increase access to precompetitive and high-quality data, and allow the sharing of best practice between different types of institutions and geographies. All of this will help to promote the adoption of AI, ML and NLP across the life science industry.
LM: When considering application, the survey indicated varied use of AI and associated technologies. Why is there such disparity in adoption across different areas?
NL: Currently, the research shows that the largest share (46%) of AI applications is in early discovery or preclinical research phases. This is likely because researchers in these phases typically have access to larger datasets, where AI and associated technologies can currently have the greatest impact. It may also reflect AI’s progression from Machine Learning, which has had a strong usage community in Computational Chemistry for over 25 years. For instance, in the compound/lead identification phase of research, AI can augment researchers when sifting through datasets that might include published scientific literature, internal and external clinical trial data, public datasets held by biobanks and governments, and proprietary experimental data. As adoption of AI grows, though, this will change: greater collaboration and sharing of pre-competitive knowledge will uncover new use cases and applications.
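The sifting step Lynch describes can be sketched, in a deliberately simplified form, as pooling evidence about candidate compounds from several sources and ranking them. The compound names, sources, and weights below are made-up for illustration; real lead-identification pipelines use trained models over far richer data, not a hand-written score.

```python
# Hypothetical sketch: rank candidate compounds by combining evidence
# from several sources. All names, values, and weights are illustrative.

literature_hits = {"CPD-17": 4, "CPD-23": 1}        # mentions in published papers
trial_signals   = {"CPD-17": 0.2, "CPD-42": 0.9}    # signal from trial data
assay_activity  = {"CPD-23": 0.8, "CPD-42": 0.7}    # in-house screening result

def score(cpd):
    # Weighted sum across sources; the weights are arbitrary here.
    return (0.5 * literature_hits.get(cpd, 0)
            + 2.0 * trial_signals.get(cpd, 0.0)
            + 1.0 * assay_activity.get(cpd, 0.0))

# Every compound seen in any source, highest combined score first.
ranked = sorted(
    set(literature_hits) | set(trial_signals) | set(assay_activity),
    key=score, reverse=True)
```

The point of the sketch is the data-pooling structure: a compound with modest evidence in each individual source can still rank highly once the sources are combined, which is exactly the kind of cross-dataset signal that is hard to see when the datasets stay siloed.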
Nick Lynch was speaking to Laura Elizabeth Mason, Science Writer for Technology Networks
1. McCourt, B., Harrington, R. A., Fox, K., Hamilton, C. D., Booher, K., Hammond, W. E., . . . Nahm, M. (2007). Data Standards: At the Intersection of Sites, Clinical Research Networks, and Standards Development Initiatives. Drug Information Journal, 41(3), 393-404. doi:10.1177/009286150704100313