What do we mean by decommissioned systems and what are the challenges?
Decommissioned systems are a fact of life across all technology, whether laboratory instruments or software applications. As part of a product’s lifecycle, the manufacturer plans an end of life, after which the product and/or application is no longer supported. This happens for a variety of reasons, but primarily because of the rate at which technology evolves and the industry’s demand for new capabilities and use cases. It is true both of the instruments used to collect research data and of the applications that capture that data.
When it comes to research data for clinical trials, or indeed most scientific data, there is often a mandate on how long you must retain it. Taking a clinical trial as an example, you may need to keep the data for 30, 40 or 50 years after the trial period has ended. The instrument may already be 10 years old at the time of the trial, which means that by the end of the retention period it will be 50 or 60 years old and certainly out of support from the manufacturer. The application used to gather the data will very likely be running on an old operating system that is also out of support from Microsoft, Apple or other OS vendors.
This often results in the machine that the instrument feeds data into being disconnected from the organisation’s network by the IT department, because it no longer complies with organisational information security policies. The hard drives may even be removed and kept physically in a storage cupboard.
This leaves valuable data at risk of loss, corruption and inaccessibility. Often a lab will have an SDMS (Scientific Data Management System) and will assume that this solves the issue; however, it very rarely does. Here’s why.
What does an SDMS do?
Laboratories will often have an SDMS to capture data from research and clinical trials. Linked with the laboratory’s LIMS (Laboratory Information Management System), it is the cornerstone of a laboratory’s day-to-day operations.
It is essentially a piece of software that acts as a document management system, capturing, cataloguing, and archiving data generated by laboratory instruments and applications. An SDMS works as a data warehouse with a pre-defined metadata schema, capturing data in order to refer back to a specific result of a clinical trial or research project.
Why is it not good enough?
In a laboratory scenario you are bound to have many instruments and many types of data. Because of the pre-defined schema within an SDMS, it often struggles to deal with the heterogeneous data sets created by your different instruments (even if you only have two): it can only handle the set of metadata fields pre-defined by the SDMS manufacturer. Yet each instrument produces a different data type, is accessed through multiple systems, and is subject to numerous rules and standards.
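To make the limitation concrete, here is a minimal sketch of how a pre-defined schema discards instrument-specific metadata. All field names and values here are illustrative assumptions, not any vendor’s actual schema:

```python
# Hypothetical sketch: a rigid, pre-defined metadata schema (as an SDMS
# vendor might ship it) vs. the heterogeneous metadata two different
# instruments actually emit. Field names are illustrative assumptions.

PREDEFINED_SCHEMA = {"sample_id", "instrument", "run_date", "operator"}

def ingest(record: dict) -> dict:
    """Keep only fields the schema knows about; everything else is dropped."""
    kept = {k: v for k, v in record.items() if k in PREDEFINED_SCHEMA}
    lost = set(record) - PREDEFINED_SCHEMA
    if lost:
        print(f"Dropped instrument-specific metadata: {sorted(lost)}")
    return kept

# A chromatography run and a plate reader produce different metadata.
hplc_run = {"sample_id": "S-001", "instrument": "HPLC-7",
            "run_date": "2018-11-19", "column_type": "C18",
            "flow_rate_ml_min": 1.2}
plate_read = {"sample_id": "S-002", "instrument": "Reader-2",
              "run_date": "2018-11-20", "wavelength_nm": 450,
              "plate_format": "96-well"}

ingest(hplc_run)    # column_type and flow_rate_ml_min are lost
ingest(plate_read)  # wavelength_nm and plate_format are lost
```

Whatever the pre-defined fields happen to be, anything outside them, often the most scientifically meaningful detail, falls away at ingest.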
If the original instrument has been disconnected from the organisation’s network and you need to access the raw data, lab technicians often have to go back to the original instrument, sometimes writing down the information they need by hand. This causes a myriad of issues and challenges, from siloed data to compromised auditability for regulatory compliance.
How do I meet regulation and achieve innovation?
The life science industry is a highly regulated one, and access to usable data is a must when responding to regulatory audits or investigating results in a clinical trial. Not only is this essential from a compliance point of view, it is critical to an organisation’s ability to innovate and create a competitive edge in this highly challenging industry.
In order to comply with MHRA data integrity guidelines and GxP principles, you need to break down data silos and create a single view of your data, such as a heterogeneous data lake.
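As an illustrative contrast (a sketch of the general schema-on-read idea, not Arkivum’s or any vendor’s implementation), a data-lake approach keeps every instrument’s full metadata and builds the single view as a projection over whatever common fields exist:

```python
# Illustrative sketch of schema-on-read: store heterogeneous records
# whole, then derive a common "single view" without discarding anything.
# Field names are hypothetical examples, not a real system's schema.

records = [
    {"sample_id": "S-001", "instrument": "HPLC-7", "flow_rate_ml_min": 1.2},
    {"sample_id": "S-002", "instrument": "Reader-2", "wavelength_nm": 450},
]

def single_view(records, core_fields=("sample_id", "instrument")):
    """Project a common view; the original records remain intact and queryable."""
    return [{f: r.get(f) for f in core_fields} for r in records]

view = single_view(records)
# The core view supports cross-instrument search, while the full
# instrument-specific metadata is still there when an auditor asks for it.
```

The design point is that nothing is thrown away at ingest; the schema is applied at read time, so new instrument types never force a schema change.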
How does Arkivum remove this problem?
Using Arkivum Trust you can prevent data loss due to decommissioned applications and systems, extracting data from legacy systems so you can use it again and again as you need to. Arkivum solutions ingest data from any source and multiple file types, and their vendor-neutral technology means you can future-proof against end-of-life (EoL) issues.