Creating Space for Innovation in Lab Informatics

Read time: 3 minutes

A clear undercurrent at this year’s Smartlab Exchange conference was a pragmatic approach to lab informatics. The laboratory informatics space continues to evolve, and R&D companies and vendors alike are navigating it as best they can while balancing an ever-increasing set of competing priorities. Companies are seeing past the hype and marketing and want to get into the details. Informatics teams are pushing back when scientists want every shiny new tool on the market, and business analysts are attempting to separate the fluff from the value.

Technology is expected to make laboratories more efficient, but as Michael Elliot, CEO at Atrium Research and keynote speaker, pointed out, 80–95% of current budgets are spent simply maintaining existing infrastructure. When companies spend that much effort maintaining the status quo, there isn’t much room left for innovation. Yet the crowd at Smartlab was there to discuss just that. So where should they focus that small slice of precious time and budget to have the greatest impact on their business? Some key themes from the conference are discussed below.

Systems in the Cloud

Cloud, SaaS and the slew of other related acronyms are not new topics by any stretch. It was clear from the Allergan presentation and the Cloud ELN roundtable that cloud-based systems are delivering solid ROI and faster time to value compared to traditional on-premises systems. However, they come with their own set of nuances and challenges, which require special attention. Understanding the various certifications, technical aspects, logistics and even the lingo can seem daunting (what’s the difference between IaaS and PaaS… anyone?). There is still plenty of concern and confusion in the industry, but we’re progressing.

Integration into the Informatics Ecosystem

Integration continues to be a focus. Users increasingly expect their labs to work like a smart home. This isn’t just about playing with something cool; the lab Internet of Things has the potential to deliver serious value for R&D organizations. As instruments become more connected and enterprise systems are tied together, the possibilities are nearly endless. But a pragmatic approach is critical: integrations should be prioritized by the business value they deliver for the least effort and cost, otherwise it is easy to get distracted and end up with little to show.

Integrations can provide value by getting data into corporate systems, automating manual tasks, and bringing data together through different lenses for analysis. But the current landscape is a lot like a children’s shape sorter: many different pieces that only fit into their specific slots. The standards discussion highlighted the complexity in this space; there are as many standards as there are tasks to solve, and different problem areas have standards of differing maturity and market adoption. It’s hard to keep all of the standards straight at times, so companies must evaluate their business problems and the ability of existing standards to address them, then focus their attention on those select few while the others continue to evolve.

Big Data to Big Analysis

Nobody is talking about big data anymore. That seems to have been solved already, even though many companies admittedly aren’t doing it well yet. According to Eric Little of Osthus the topic now is big analysis, because all big data does is take up disk space (albeit likely distributed and in the cloud). Big analysis is all about what you can do with your big data. If the data is Brent crude, the real fun lies in turning that into high octane petrol. There are two approaches to this that were discussed: technology and people. The technology approach leverages master ontologies, semantics, machine learning and the like to make sense of the data programmatically. The people approach is all about, well… people! Namely, data scientists, to make sense of the data, using understanding and context.

This is another area, however, where there is no clear path to value. Companies see opportunity in it, they have business problems that may be solved by it, but putting that into practice is still a bit of trial and error. The real trick will be finding the right combination of tech and people, and what tasks each should be applied to. Just as in any other project, showing quick time to value and solving real business problems will pave the path to success. But with big analysis it can sometimes seem like the ocean (or lake in this case) needs to be boiled.

Smartlab Exchange was filled with brilliant people and lively discussions. The problems these teams are tackling are massive and woolly, but extremely valuable to the businesses they support. It’s an exciting time to be involved with R&D informatics, and Smartlab Exchange was right in the thick of it.

As an R&D informatics vendor, IDBS specializes in helping companies navigate this complex data landscape.