What is digital data preservation and why do I need it?
Digital data preservation (also known as digital preservation) is an ongoing process to ensure your data remains accessible over time, regardless of changes in file formats and technologies.
There are two key aspects to this. The first is storing your data in a way that guarantees access far into the future, regardless of changes in technology or organisations ceasing to exist. The second aspect is much newer, because it has previously been very difficult to provide technically: the usability of your data far into the future. Read more about what digital preservation is here.
If you think of it in terms of meeting the ALCOA+ principles, how can you prove your data is Original and ensure it is Legible when it is in an old file format? Digital preservation is an active process that ensures your data, including audit trails and metadata, remains accessible and usable over the long term.
Why is digital preservation so important?
We’ve seen many organisations unsure of the most appropriate way to archive their electronic records, and many assume that simply ‘archiving’ them within live systems is sufficient.
This creates a huge problem: the sheer volume of data you are dealing with is increasing exponentially, and it is often held in different systems and sometimes in different organisations. Consider that a typical eTMF for a study containing scanned pages can grow to the region of one to two terabytes of data. Add retention timescales to this dynamic and you can see how the problem quickly gets out of hand. Now imagine that the PDF version in which you saved the notes has become obsolete by the time you need to access it 50 years down the line.
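To make the scale of this concrete, here is a rough back-of-envelope sketch. The only figure taken from the text above is the one-to-two terabyte eTMF size; the portfolio size and retention period are illustrative assumptions, not data from any real sponsor.

```python
# Back-of-envelope: how much data stays under retention when every
# completed study must be kept for decades.
# Assumptions (illustrative only):
STUDIES_PER_YEAR = 10    # hypothetical sponsor portfolio
TB_PER_ETMF = 1.5        # mid-point of the 1-2 TB range cited above
RETENTION_YEARS = 25     # a common GCP-style retention requirement

total_tb = 0.0
for year in range(1, RETENTION_YEARS + 1):
    # Nothing ages out within the retention window, so the
    # archive only ever grows.
    total_tb += STUDIES_PER_YEAR * TB_PER_ETMF
    print(f"Year {year:2d}: {total_tb:6.1f} TB under retention")
```

Even with these modest assumptions, the archive ends up at several hundred terabytes by the end of the retention window, all of which must remain findable, legible and provably original.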
This is something that Eldin Rammell, Director of Expert Solutions at Phlexglobal, discussed with us recently in his blog and the webinar he held on the topic. You can read and view them here.
Questions you need to consider…
Because digital preservation is often not something that life sciences organisations consider, there are a few questions you should think about:
- How would you find records that are 7/15/25 years old in your organisation today? Unfortunately, this is unlikely to be a simple search, especially when you consider that these records are likely to be spread across a mix of paper AND electronic documents.
- Can you access them? Is the system used to generate the data still in use?
- Do you know who or what the data relates to? Is there personally identifiable data in any of those files? Do they have a retention policy? Where did the data originate?
- Can you prove provenance? How can you guarantee that a file is the original copy of a piece of data?
For one piece of data, or a small set, these questions should be fairly easy to answer. But consider that your data is ever growing, then imagine a surprise inspection on the back of a 483.
GCP Inspections – Is your data inspection ready?
In the recent MHRA GCP Inspections report, some notable critical findings were raised, including under-reporting of SUSARs to the competent authority, inaccurate information being submitted in DSURs, incomplete TMFs being presented for inspection, and supporting documents maintained in different, non-compliant systems. An inability to find and access documents in the eTMF, incomplete documents and missing critical information all contributed to a commercial sponsor receiving a critical finding for not keeping its data in an inspection-ready state. A lack of appropriate system checks to ensure its processes were working as expected, including checks on automated report submissions, resulted in the same sponsor receiving a second critical finding for pharmacovigilance.
During one CRO’s inspection, the TMFs selected for inspection were found to be significantly incomplete, to such an extent that the trial conduct could not be reconstructed and the inspection had to be extended. This was found to be a systemic issue, with the eTMF being treated as a final document repository rather than a contemporaneous system used to manage the trial.
There is an answer
I promise that it is not all doom and gloom! Join George Waidell, Global VP of Life Sciences, for a webinar on December 11th, in which George will provide insights into some of the challenges and solutions to this compliance hurdle:
- What is digital data preservation and why do I need it?
- eTMF Archives and the EMA guidance
- GCP inspections – is your data inspection ready?