
Best practice approaches / 28 Mar, 2022

An introduction to the FAIR data management principles

In a world of ever-evolving technology, it’s sometimes easy to forget the basics.  

As individuals and organisations chase the latest tech to realise (often) small and incremental gains, it’s important to ensure that the fundamental structure, processes and technology are in place. If not, then it’s likely the real value of that newer technology will never be fully realised. 

There are few business areas in which this is truer than in how you manage your data.

“Data is a precious thing and will last longer than the systems themselves.” 

Tim Berners-Lee

The case for best practice approaches 

It may not be the most exciting thing for organisations to focus on, but best practice guidance should form the basis of every business. While every sector and organisation will undoubtedly have its own differences and nuances, there are also many shared challenges to overcome.

Best practices are often developed by groups of experienced practitioners within a subject area or sector, written with the collective benefit of thousands of hours of experience, and they are usually updated regularly to stay relevant and adapt to new working practices and technology.

This is why at Arkivum we place great importance on data management best practice approaches, such as the FAIR data management principles. 

What are the FAIR data management principles? 

The FAIR data management principles were created by the scientific community to support discovery through good data management.

 The principles focus on four main areas for effective data management: 

  • Findable 
  • Accessible 
  • Interoperable
  • Reusable 

It’s also worth stating that the principles apply across three main entities: the data being stored, the metadata (covered below) and the infrastructure (or supporting technology).

The principles were first published in the journal Scientific Data in March 2016 and although they were primarily developed for scientific research data, I would argue that they can easily be applied to any organisation. 

Every organisation is generating data of some kind, but few truly consider its long-term management. The use of principles such as FAIR can help provide a strong foundation to ensure that long-term data remains usable in the future.

A detailed look at the FAIR principles 

On the face of it, the principles are fairly straightforward, but I think it is worth exploring each in a little more detail:


Findable

There’s no point storing data for any period of time if it cannot be found. A key component of ensuring that data is findable is the use of metadata, i.e. data about data.

Metadata can come in many forms such as the date a file was created, when it was last modified or even referencing elements of what is in it.  
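As a simple illustration, descriptive metadata for a stored file might be captured as a small structured record (the field names below are hypothetical, not a formal metadata standard such as Dublin Core, but the idea is the same):

```python
# A minimal, hypothetical metadata record for a stored file.
# Real archives typically use formal schemas, but the principle
# is the same: structured data about data.
file_metadata = {
    "filename": "survey_results_2021.csv",
    "created": "2021-06-14",
    "last_modified": "2022-01-03",
    "creator": "Research Data Team",
    "keywords": ["survey", "phase III", "results"],
    "description": "Summary results table for the 2021 phase III study.",
}

def matches(record, term):
    """True if a search term appears in the record's keywords or description."""
    term = term.lower()
    return (any(term in kw.lower() for kw in record["keywords"])
            or term in record["description"].lower())

# Findability in practice: search the metadata, not the file contents.
print(matches(file_metadata, "phase III"))  # True
```

The point of the sketch is that a search only ever touches the metadata record, which is exactly what makes well-described data findable without opening every file.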

We recently published another blog post on metadata if you are interested in reading more on the topic.  


Accessible

Once a user has found the right file, it’s important to ensure that it is also accessible to them. This means that those with the appropriate authorisation can access the file, or perhaps even download it to another system.

Within a larger repository or archive, this might see different users with different access levels – but it is crucial that the right people have access to the right data. 
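In code terms, tiered access in a repository often boils down to a check like the following (the role names and rules here are purely illustrative, not a description of any particular system):

```python
# Hypothetical access tiers, from least to most privileged.
ACCESS_LEVELS = {"viewer": 1, "editor": 2, "admin": 3}

def can_access(user_role, required_role):
    """True if the user's role meets or exceeds the level required by the file."""
    return ACCESS_LEVELS[user_role] >= ACCESS_LEVELS[required_role]

print(can_access("editor", "viewer"))  # True
print(can_access("viewer", "admin"))   # False
```

However the rules are expressed, the goal is the same: the right people get access to the right data, and nobody else does.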


Interoperable

As organisations adopt more and more digital technology, a growing issue is the number of different systems in which their data resides. The aim of interoperability is to ensure that data (where possible) can move easily between systems, while also ensuring that no information (including metadata) is lost.

This presents many challenges, including getting different systems to talk to each other in order to move data, and ensuring that systems recognise data generated in another system.
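One common way to keep data and metadata together as they move between systems is to export both into a neutral, widely supported format such as JSON. This is only a sketch; real preservation systems use richer packaging standards such as BagIt or METS, but the idea is the same:

```python
import json

# Bundle the data and its metadata into one portable package,
# so nothing is lost when it moves between systems.
# (Field names and values are illustrative.)
package = {
    "metadata": {
        "filename": "survey_responses.csv",
        "created": "2022-03-01",
        "format": "text/csv",
    },
    "data": [
        {"respondent": 1, "answer": "yes"},
        {"respondent": 2, "answer": "no"},
    ],
}

# JSON is readable by virtually any modern system.
exported = json.dumps(package, indent=2)

# The receiving system can reconstruct both data and metadata.
restored = json.loads(exported)
print(restored["metadata"]["format"])  # text/csv
```

The design choice that matters here is that the metadata travels inside the package rather than living only in the source system's database, so the receiving system never has to guess what it has been given.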


Reusable

There is little point in being able to find and access data if it cannot be opened, read and used by the person accessing it. If future use cannot be guaranteed, there is little value in keeping the data at all (optimising the reuse of data is ultimately the main goal of FAIR).

The principle also makes it clear that, ‘metadata and data should be well-described so that they can be replicated and/or combined in different settings’. 

In summary – FAIR data in the real world 

I hope that you’ve found this post useful for providing an introduction to the FAIR data management principles (and best practice approaches in general). Best practice approaches might not be the most exciting topic, but I promise that they will lay a strong foundation for long-term business success, both for your data management and beyond. 

If you are interested in finding out more about the work Arkivum do in applying the FAIR data management principles in the real world, you can read about our work within the ARCHIVER project.

The project is focused on developing digital preservation services for the European scientific research community, in line with the FAIR data management principles. It’s a fantastic example of FAIR data in action! 

Tom Lynam

Tom is the Marketing Director at Arkivum. He joined the business in January 2020 tasked with driving new business growth and building the brand into new sectors such as Pharmaceutical and Life Sciences. He has over 12 years’ experience in several diverse marketing leadership roles across technology and professional services organisations.
