The Evolution of Software as a Service (SaaS) in the Laboratory and Beyond
Over the last 10–15 years, software companies have overwhelmingly adopted the model of cloud-hosted, pay-as-you-go software as a service (SaaS). There are multiple reasons for this shift. Software companies needed to keep up with technological changes that made it practical to deliver software over the internet rather than on physical media. SaaS brings vendors reliable, recurring revenue from consumers who now pay monthly or yearly for something they used to purchase once or twice a decade, while consumers get the flexibility of a more customizable service. Simultaneously, the vendors’ cost of goods sold decreases because they no longer have to pay for physical disks, packaging or shipping.
The adoption of SaaS in laboratories brings benefits for the end users as well. What is lost in ownership may be compensated by gains in efficiency or by offsetting capital expenditures for IT, infrastructure and cybersecurity support. The bundling of systems in a cloud environment allows costs to plummet and enables smaller organizations to adopt them.
It’s important to note the distinction between cloud technology and SaaS. Although all SaaS is cloud-based, cloud environments also include infrastructure as a service (IaaS) and platform as a service (PaaS), among other more niche models. The cloud is there in the background, but SaaS is not the cloud.
A Brief History of SaaS
So, what is SaaS? SaaS applications are a natural progression from earlier centralized computing models. In the time-sharing systems of the late 1960s, and later on local area networks (LANs), a powerful central mainframe or server held independently packaged software that terminals throughout an organization could access. The organization purchased the software outright and ran it from the server until it was obsolete.
This model of software hosting quickly became inefficient at scale. Software programs simply outgrew on-premises servers as the programs grew more complex and data-hungry, even as processors became more powerful. In 1965, Gordon Moore posited that the number of transistors on microchips, and therefore processing power, would double roughly every two years. Centrally hosted software delivered over the internet, the model that matured into SaaS, became the answer to these escalating processing needs.
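As a back-of-the-envelope illustration of that doubling, the growth Moore described compounds quickly. This is a sketch of the arithmetic only, not a precise model of chip history:

```python
def transistor_growth(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor under Moore's observation: doubling every ~2 years."""
    return 2 ** (years / doubling_period)

# Five doublings in a decade is a ~32x increase in transistor count
print(transistor_growth(10))  # 32.0
print(transistor_growth(20))  # 1024.0
```

A decade of that pace yields roughly a 32-fold increase, and two decades roughly a thousand-fold, which is why centrally hosted software could keep absorbing ever-larger workloads.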
The late 1990s and early 2000s were a golden age for SaaS. Software, and the servers it ran on, became almost unimaginably powerful and generated increasingly complex data (that, of course, needed to be stored). The growth of processing power continued, albeit at a slower pace, and although advances in hardware design further extended the limits of computing power, Moore’s Law is, by most accounts, now dead.
Distributed computing with purpose-built processors has been a viable near-term solution. However, it won’t be able to keep up with the insatiable demand for data. For now, much like the Wizard of Oz, I must ask you to “pay no attention to that man behind the curtain!” The answer to what happens when computer hardware can no longer keep up with data demands is a story for another time. (Spoiler alert: it’s quantum computing.)
SaaS Applications in the Lab
Laboratories have not been immune to SaaS adoption. Laboratory budgets are increasingly lean; existing personnel must do more with less. The possibility of removing some infrastructure burdens is an attractive proposition. For that reason, the major laboratory information management systems (LIMS) vendors all offer SaaS options, and there are some LIMS that are offered solely as SaaS. Validation of cloud-based laboratory software enables these systems to be used in regulated environments.
LIMS is not the only laboratory software making the move to SaaS. Chromatography data systems (CDS), electronic lab notebooks (ELNs), and lab automation or connectivity software are also common SaaS offerings. Much of the reporting and data analysis can be done remotely when applications are hosted in the cloud. In the early days of the COVID-19 pandemic, such applications enabled organizations to reduce the number of staff on-site in the lab, for everyone’s safety.
Storing laboratory data in the cloud has opened the eyes of organizations to the possibilities of unlocking the business value in that data. Laboratory data is no longer stored in a dusty notebook in a warehouse. Anyone with the proper credentials can access the data and use it to solve problems and design new products.
However, access to more data increases the entropy of the system. Imagine that the second law of thermodynamics is a cat, and the possibility of unlimited cloud storage is a bag of catnip. You can begin to understand how the volume of laboratory data has exploded since SaaS became a common model. As the data expands, the capacity of SaaS solutions has, so far, expanded with it.
Labs today store data in LIMS, ELNs or CDS. If these systems are interfaced at all, they are connected in predefined, structured ways. To make the best use of laboratory data, a centralized environment such as a data lake (which holds raw, unstructured data) or a data warehouse (which holds curated, structured data) is preferable. In response to this big data expansion, organizations are developing advanced data and analytics capabilities to work with their data.
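A minimal sketch of the difference between point-to-point interfaces and a pooled, lake-style store is shown below. The system names and fields are illustrative assumptions, not any vendor’s actual schema:

```python
def to_lake_record(source: str, record: dict) -> dict:
    """Tag each record with its source system; keep its fields as-is."""
    return {"source": source, **record}

# Hypothetical records from three separate lab systems
lims_records = [{"sample_id": "S-001", "assay": "HPLC", "result": 12.4}]
eln_records = [{"experiment": "E-17", "author": "jdoe", "notes": "pH adjusted"}]
cds_records = [{"injection": 42, "peak_area": 10583.2}]

# One shared pool instead of pairwise LIMS<->ELN<->CDS interfaces
data_lake = (
    [to_lake_record("LIMS", r) for r in lims_records]
    + [to_lake_record("ELN", r) for r in eln_records]
    + [to_lake_record("CDS", r) for r in cds_records]
)

# Any credentialed consumer can now query across all systems at once
hplc = [r for r in data_lake if r.get("assay") == "HPLC"]
print(len(data_lake), len(hplc))  # 3 1
```

The design point is that each new system adds one ingestion step rather than a new interface to every existing system, which is what makes the pooled approach scale.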
The common tools for exploring the big data sets that SaaS can store and output are artificial intelligence (AI) and machine learning (ML). The life sciences, in particular, are turning to these tools to deal with genetic data sets and large-scale clinical trials. These tools are also becoming more prevalent in engineering design, environmental monitoring and oil and gas exploration, just to name a few.
Future Directions for SaaS in Labs and Across Organizations
More organizations are taking advantage of specific applications of SaaS. Backend as a service (BaaS) will continue to expand, providing building blocks for new applications. Containers as a service; desktop as a service; environment as a service – you get the idea; there’s an alphabet soup of potential cloud services with ample room for growth.
Cloud-based integration platform as a service (iPaaS) offerings will enable companies to connect all their various instruments, applications and informatics systems to a single data source in the cloud. These platforms can allow seamless data sharing in real time across multiple sites, enabling more efficient research. Machine learning as a service (MLaaS) will let more organizations uncover insights hidden in their big data sets.
Proliferating cloud services naturally demand better cybersecurity. You can find data to support the argument that SaaS is either more secure or less secure than on-premises solutions. To understand the debate, it may help to think in terms of the difference between privacy and security. Security is about protecting data from theft or tampering; privacy is about controlling who can access and use it. On-premises systems favor privacy because your organization retains control of the data, at least until the system is hacked. Cloud-based systems require ceding some of that control because your data now resides on someone else’s servers, but they may offer stronger security (although they are not immune to hacking). An organization’s tolerance for privacy and security trade-offs will vary, sometimes across business units.
What does the future of SaaS look like? It’s impossible to know with certainty. But what does seem certain is that SaaS is here to stay – in the lab and throughout organizations.
About the author
Becky Stewart is the scientific copywriter for CSols Inc.