Research data may need to be retained and kept accessible for decades or longer. Stakeholders (archivists, librarians, repository managers and compliance officers) and funders frequently raise new data archiving requirements that commodity IT storage solutions and services often fail to meet, leaving a challenging gap for IT departments to bridge.
Data archiving can be an effective long-term way to manage growing volumes of research data, but careful planning and implementation are needed to ensure that an archiving strategy guards against long-term data loss while still allowing data to be accessed easily when needed.
In the webinar, we:
- Review the good practice that data stewards increasingly expect (the NDSA Levels of Digital Preservation, CoreTrustSeal, ISO 16363 and the new Digital Preservation Storage Criteria).
- Look at the basics of data safeguarding (geo-replicated copies of data, chain of custody, data integrity management, security and access control) and how this forms part of good practice.
- Discuss how safeguarding of research data can be delivered effectively by IT departments through hybrid solutions, data escrow, hosted and cloud-based archiving, and managed services in a way that is cost-effective, transparent and auditable.
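Data integrity management, one of the safeguarding basics above, is commonly implemented as fixity checking: recording a cryptographic checksum for each file at ingest and periodically re-computing it to detect silent corruption. As a minimal sketch (the function and manifest names here are illustrative, not a specific product's API):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large archive files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_manifest(manifest: dict, root: Path) -> list:
    """Return the relative paths whose current checksum no longer matches
    the checksum recorded at ingest (i.e. candidates for repair from a replica)."""
    return [rel for rel, expected in manifest.items()
            if sha256_of(root / rel) != expected]
```

In a geo-replicated setup, any file flagged by a check like this would be repaired from a known-good copy at another site, which is why multiple independent copies and regular fixity checks go hand in hand in the good-practice frameworks mentioned above.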
About the presenter:
Dr Matthew Addis is our Chief Technology Officer and co-founder of Arkivum, responsible for technical strategy.
Matthew previously worked at the University of Southampton IT Innovation Centre. Over the last fifteen years, he has worked with a wide range of organisations in the UK, Europe and the US on solving the challenges of long-term data retention and access.
For more information, feel free to contact our friendly data experts here.
28 Nov, 2018