Just how difficult can it be to archive electronic data generated from a drug development project?
Surely it is enough to create an “Archive” folder on a secure network drive and save it there? So long as access is strictly controlled, that is OK, right?
These are questions I hear quite frequently from colleagues in the pharmaceutical and biotechnology industry. Unfortunately, there is still a lot of misunderstanding on this topic, even down to what the term “archive” actually means; in the IT world, it is often synonymous with data back-ups. When we are talking about data governed by GxP regulations, archiving is very different from IT back-ups. The regulations that our industry is obliged to follow set out some very specific requirements for archiving. Some of these apply only to the archiving of electronic data (such as the requirement to address software and hardware obsolescence), but most apply to all data, irrespective of format or storage media.
We see lots of problems when electronic records that should be archived are simply held within live systems.
The relevant regulations do permit archiving within live systems, such as an eTMF application, but only if certain conditions are met. However, there are several disadvantages to this approach.
One is the sheer volume of data. An eTMF for a study containing scanned pages can grow to a very significant size: somewhere in the region of 1 TB to 2 TB per study if storing original TIFF and JPG files. As each new study is initiated, this data store grows year on year. With retention times of up to 25 years, within only a few years you will have many terabytes, or even petabytes, of data stored online. Even though the studies may be tagged within the system as ‘archived’, the data is still held on your live servers, subject to the backup and other processes that all of your live data is subjected to. You can imagine the impact this might have on system performance. The software may be sophisticated enough to exclude archived data from certain search tools, but a system that holds only ongoing and recently closed trials will be more efficient than one that also holds trial content from the last 10-20 years.
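As a back-of-the-envelope illustration of how quickly this accumulates, consider the sketch below. The per-study size, number of new studies per year and retention period are assumptions chosen for the example, not figures from any particular eTMF:

```python
# Illustrative assumptions only (not figures from a specific system):
# each study's eTMF adds ~1.5 TB of scanned content, 40 new studies
# start each year, and records are retained for 25 years.
TB_PER_STUDY = 1.5
STUDIES_PER_YEAR = 40
RETENTION_YEARS = 25

def online_archive_tb(years_elapsed: int) -> float:
    """Cumulative volume held on live storage after a number of years,
    assuming no study has yet passed the end of its retention period."""
    retained_years = min(years_elapsed, RETENTION_YEARS)
    return retained_years * STUDIES_PER_YEAR * TB_PER_STUDY

for year in (1, 5, 10, 25):
    print(f"Year {year:2d}: {online_archive_tb(year):7.1f} TB")
```

Under these assumptions the live store passes half a petabyte within a decade and reaches about 1.5 PB by the time the first studies become eligible for disposal, which is the scale at which backups, migrations and searches all start to struggle.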
Now imagine you have a system upgrade…
This is an activity that may happen three or four times each year for some systems. Each time the system is upgraded, the underlying database content needs to be migrated to the new version. Often this will not be problematic, but sometimes the current data must be updated to be compatible with the new version. There is no problem with this per se, so long as the changes are documented and the migration follows a validated process. However, migrating several hundred terabytes of records can be far more problematic than migrating only your current data.
Another potential issue we see is the risk to your archived records if held on live systems.
They are typically ‘locked’ once in an archived status, but they can nonetheless still be accessed, if only by personnel nominated as archivists and by system administrators. Any system error, malfunction, or accidental or deliberate misuse of the system has the potential to affect archived data. Archived data held in a separate archive storage system is less exposed to the day-to-day risks that affect our line-of-business systems.
Finally, for organizations that are subject to GxP regulations, it is necessary for archived data to be under the control of a named archivist.
It is not impossible to achieve this when archived data is held in live systems, but it is more difficult. This is especially so when all users retain access to the content they could see before the data was tagged as ‘archived’, even if that access has been reduced to read-only. An archivist, according to the GxP regulations, is designated by management to be accountable for the day-to-day management of the archive, including the operations and procedures for archiving, and for ensuring the ongoing accessibility, preservation and integrity of archived records. This is easier to achieve, and easier to demonstrate compliance with, where archived electronic records are moved out of live systems at the time of archiving and stored securely in an electronic environment more suitable for long-term retention: in exactly the same way that we provide dedicated archive solutions for our paper records, under the control of the archivist.
So what is the recommended approach to archiving electronic regulated records?
View the webinar recording ‘Regulatory compliance for data archiving – what’s expected’, where we discussed specific solutions and the best way of complying with the different regulatory requirements.
About the author
Eldin is Director, Client Solutions at Phlexglobal Ltd. His role includes advising clients on trial master file strategy, processes and technology optimisation. In addition, he drives thought leadership to further advance Phlexglobal’s industry reputation and position. Prior to joining Phlexglobal, he was a freelance records management consultant for nearly 15 years, following a 17-year career as an archivist and records manager at Glaxo (now GSK) and Pfizer.
He has very broad records management experience within life sciences, from GxP-regulated functions (pre-clinical, clinical and manufacturing) through to non-GxP functions such as human resources, finance and facilities management. He is active in several industry groups, including the DIA TMF Reference Model, and is the current Chair of the Health Sciences Records & Archives Association (formerly called the Scientific Archivist Group). He enjoys presenting and is a regular speaker at industry conferences. He is originally from Birmingham and now lives with his family in Worcester, UK.
28 Aug, 2018