Many analytical laboratories are on a digital transformation journey. The promise is clear: overcome fragmented systems, disconnected workflows, and entrenched data silos, and the result will be a fully integrated digital lab. But what does that future state really look like – and how far are we from achieving it?
For many organizations, it is hard to see past the remaining obstacles. Instrument data is rarely standardized, even between models from the same vendor. Legacy systems, never designed for interoperability, are difficult to connect because of their technology stacks and software architectures. And beyond the technology, cultural barriers persist: senior leaders may be hesitant to abandon familiar processes, and scientists may be reluctant to share raw data they consider “theirs.” Together, these factors perpetuate silos.
Although some labs are struggling, others are demonstrating what’s possible. With advanced orchestration systems, labs are beginning to unify software, hardware, and automation into workflows that feel genuinely seamless. These examples offer a glimpse of the true potential of the digital lab.
What’s missing, however, is a conceptual framework to make sense of this journey – one that defines the end goal and shows how the pieces fit together. That framework is the Digital Laboratory Operating System (Digital Lab OS): a way to understand the integration of software, robotics, and informatics into a single cohesive ecosystem.
Digital Lab OS is analogous to a computer operating system, which abstracts away the hardware so that, as a user, you interact only with the applications that run on top of it. Do you care what processor is in your laptop? Probably not – as long as the processing power is sufficient for what you want to do. In the same way, analysts aren’t wedded to particular instruments and aren’t interested in managing different data formats, provided they can answer the questions they care about: How pure was my compound? What is the impurity level? This is the level of abstraction we should be aiming for.
Digital Lab OS is also about speed and repeatability. For example, scientists are often asked to deposit their HPLC data into a shared folder, which they then have to retrieve once the experiment is over. There will be many such folders – for different instruments and experiments – often going back many years. Digital Lab OS could help manage these back-end processes so that researchers get to the insight they need from their data more quickly, and in a way that is repeatable. This is a real value-add.
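To make the repeatability point concrete, here is a purely illustrative sketch of the kind of retrieval step an orchestration layer could take off scientists’ hands. The folder layout, experiment ID, and file naming are assumptions for the example – not a real Connect Enterprise interface:

```python
from pathlib import Path

def collect_hplc_results(root: Path, experiment_id: str) -> list[Path]:
    """Gather all result files for one experiment across instrument folders.

    Assumes a hypothetical shared-folder layout:
        <root>/<instrument>/<experiment_id>/*.csv
    A managed data fabric would replace this manual crawl entirely, but
    the repeatability benefit is the same: one step, run the same way
    every time, instead of ad-hoc digging through years of folders.
    """
    results: list[Path] = []
    for instrument_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        exp_dir = instrument_dir / experiment_id
        if exp_dir.is_dir():
            results.extend(sorted(exp_dir.glob("*.csv")))
    return results
```

Even a small helper like this, run consistently, is the difference between a reproducible data flow and one that depends on whoever happens to remember where the files live.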
It also means you don’t end up with individual labs – and even individual scientists – stuck maintaining older data workflows. For example, we’ve seen a case where an individual had to spend a meaningful percentage of their time maintaining a legacy data extraction and manipulation application they had written ten years earlier. Nobody else knew how it worked, so even though their responsibilities had shifted over time, they were still stuck supporting it! This example shows just how customized – and fragile – data flows can be.
Digital Lab OS in action
Digital Lab OS isn’t just an abstract idea. The building blocks already exist in today’s informatics and automation solutions – and are beginning to make this vision tangible. Thermo Fisher™ Connect Enterprise Platform, for example, is designed to surface data from multiple sources into what you might call a “data fabric” that ensures the right information is available to the right scientist at the right time.
To illustrate how this works in practice, let’s look at a typical day in a modern laboratory. A scientist might begin by setting up an experiment in their electronic lab notebook (ELN). From there, they may need to switch into a separate laboratory information management system (LIMS) to manage samples, or consult a learning management system (LMS) to confirm which processes they are authorized to run. At the same time, a backlog of tens of thousands of samples might sit in a repository, requiring orders for new plates to be prepared.
On top of that, execution scientists often interact with robotics platforms – which come with their own dependencies. Does the system have the reagents and consumables it needs? If not, how quickly can replacements be ordered and delivered? Additional platforms may be managing reagents on deck, tracking expiry dates and preparing conditions to ensure everything remains valid for use.
Then comes data generation. Ideally, much of this could be automatically reviewed, with results triaged into those that pass without issue and those that need human review. Some samples may even need to be rerun or processed differently, creating loops in the workflow. Meanwhile, environmental conditions – such as incubator temperature during the precise minutes a sample was stored – must be logged for compliance purposes. If an auditor revisits the experiment a year later, every step needs to be traceable and reproducible.
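The automated-review step above can be sketched in a few lines. This is an illustrative triage sketch only – the `SampleResult` fields and the 98% purity threshold are placeholders, not validated acceptance criteria, which a real lab would define per assay:

```python
from dataclasses import dataclass

@dataclass
class SampleResult:
    sample_id: str
    purity_pct: float
    signal_ok: bool  # e.g. detector signal within its expected range

def triage(results: list[SampleResult]):
    """Sort results into auto-pass, human-review, and rerun queues."""
    passed, review, rerun = [], [], []
    for r in results:
        if not r.signal_ok:
            rerun.append(r)      # bad acquisition: loops back into the workflow
        elif r.purity_pct >= 98.0:
            passed.append(r)     # releases without human intervention
        else:
            review.append(r)     # borderline: route to a scientist
    return passed, review, rerun
```

The rerun queue is what creates the loops described above: those samples re-enter the workflow rather than exiting it, and an orchestration layer has to track that provenance automatically.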
In other words, today’s labs are incredibly complex environments, with multiple processes, instruments, and data streams all happening simultaneously. Without orchestration, scientists are left to navigate this maze themselves – switching between applications, manually moving data, and piecing together provenance records.
The Connect Enterprise platform can orchestrate all of these disparate elements into a single, coordinated workflow that links third-party LIMS, ELNs, and LMS platforms. The platform also integrates hardware, which can be achieved using tools like Thermo Scientific™ Momentum™ Workflow Scheduling Software to coordinate a dozen or more automated work cells. Critically, this allows for data integration – harmonizing different data formats and ensuring information flows seamlessly across systems.
The result is a lab environment where scientists no longer have to think about which platform to log into, which work cell is available, or which data format they’ll need to wrestle with later. Instead, they can trust that the Digital Lab OS is managing these complexities in the background, allowing them to focus on what matters: running experiments, interpreting results, and driving science forward.
This way of operating a lab, using the Connect Enterprise platform, is genuinely new – orchestration of this breadth has never existed before. We believe this is the beginning of the true “lab of the future”: an automated digital lab where informatics and automation platforms – from robotic movers and automated incubators to advanced orchestration platforms and cloud-based tools – converge into a Digital Lab OS that unlocks the full potential of people, data, and processes to accelerate science at scale.