GLP…three letters which have a big impact on how lab data is handled and retained.
These principles provide the industry’s guidelines for ensuring the quality and integrity of data.
We’ve taken 5 minutes with Rob Jones to explore this topic and reflect on his past experience in applying them.
It’s crucial to note that these guidelines apply to non-clinical studies and not clinical studies (which are covered under GCP).
Can you explain the purpose behind the GLP principles?
Ultimately, GLP data provides the means by which experiments and studies can be reconstructed and the results generated can be verified and validated. They provide a quality system against which laboratory studies are performed, recorded, reported and subsequently archived.
If lab data is reported and retained in line with these principles, then in theory a study could be repeated multiple times with the same conclusion being reached each time. This means that labs across the world would be able to re-create your experiment and get the same results as you, because the testing parameters are the same…because they’ve followed the quality measures of GLP.
Now, I would say that there are two crucial elements to point out here:
Firstly, GLP isn’t applicable just to pharmaceutical studies – it can relate to anything being experimented on within a laboratory setting.
And secondly, GLP is probably one of the broadest of the GxPs within life sciences and can be used on its own or alongside other GxP principles. For example, everything from bioanalytical experiments for a pivotal phase three study, to nitrogen levels in fertiliser, will deal with GLP.
How would you recommend that source data is stored so that it conforms to GLP?
To start with, the outlines and steps for following these principles should be published within your organisation’s Standard Operating Procedures (SOPs) so that all parties involved are aware of the steps necessary to enforce and ensure quality.
It’s also vital that good laboratory data management applies throughout the entire data lifecycle, from creation/capture through to final archiving. This also means that every step of your experiment must be recorded and that equipment calibrations are correctly noted so that the experiment can be repeated.
The main thing is to make sure the source data is available in an accessible format. If that means leaving it in a validated source system for the short term, then that is fine, but once you extract that data, make sure it is in a format that does not rely on the source system to be read.
Can you run us through examples of how you have applied these previously?
So, for example, simple things like keeping a record of calibrated pipettes fall under GLP.
Theoretically someone could come along and point out:
“It says here you used these 4 pipettes – show me the log where it says you used these on ‘this’ date.” When comparing against the log, they could identify:
- If the pipettes were within calibration, or at least had been tested within the allotted time period.
- If they were in date, where are the logs showing the corresponding calibration results, and confirming that the other equipment used to test this (such as the balances) was also within range?
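As an illustration (my own sketch, not part of the interview), a check like the one described above could be expressed in code. The pipette IDs, log structure and calibration interval here are all hypothetical:

```python
from datetime import date

# Hypothetical calibration log: pipette ID -> date of last successful calibration
calibration_log = {
    "PIP-001": date(2021, 8, 1),
    "PIP-002": date(2021, 2, 1),
}

# Assumed allotted period between calibrations
CALIBRATION_INTERVAL_DAYS = 180

def within_calibration(pipette_id: str, used_on: date) -> bool:
    """Check whether a pipette was within its calibration window on the date of use."""
    last_calibrated = calibration_log.get(pipette_id)
    if last_calibrated is None:
        return False  # no log entry: calibration cannot be demonstrated
    return 0 <= (used_on - last_calibrated).days <= CALIBRATION_INTERVAL_DAYS

print(within_calibration("PIP-001", date(2021, 9, 6)))  # True: calibrated 36 days ago
print(within_calibration("PIP-002", date(2021, 9, 6)))  # False: calibration lapsed
```

The point is not the code itself but that the log makes the answer reconstructable: given a pipette and a date of use, the record either demonstrates it was within calibration or it does not.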
This isn’t study data as it’s not attached to a single study, it’s in essence facility data for the lab you were working in at the time. It’s worth noting that you wouldn’t store this level of data in the TMF as it’s not attached to a study.
Just to expand upon this, the TMF refers to the Trial Master File which is a repository where all documentation and records relating to a clinical trial are stored.
The MHRA define it as “the collection of essential documents which allows the conduct of a clinical trial to be reconstructed and evaluated. It is basically the story of how the trial was conducted and managed.”
This is a mandatory requirement for all clinical trials.
Another example is the use of equipment logs, such as the use of a fridge.
Was the temperature in the fridge correct?
Tolerances play a big part in GLP – how long was it out of tolerance for? For example, the fridge should be at 4 degrees, but anywhere between 2 and 5 degrees is acceptable – you must track when it falls out of tolerance and for how long.
Environmental factors outside of your control also need to be recorded. Let’s say there was a power cut – this could have massive ramifications for the experiment. A couple of minutes of power loss is ok, but what if conditions fall outside tolerance for several hours? This has the potential to dramatically affect the results.
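A minimal sketch (my own illustration, assuming timestamped temperature readings and the 2–5 degree band from the fridge example) of how out-of-tolerance excursions and their durations could be derived from a log:

```python
from datetime import datetime

# Assumed tolerance band for the fridge example: 2-5 degrees C
LOW, HIGH = 2.0, 5.0

# Hypothetical log: (timestamp, temperature in degrees C)
readings = [
    (datetime(2021, 9, 6, 9, 0), 4.1),
    (datetime(2021, 9, 6, 10, 0), 5.6),  # out of tolerance
    (datetime(2021, 9, 6, 11, 0), 6.2),  # still out
    (datetime(2021, 9, 6, 12, 0), 4.0),  # back in range
]

def excursions(log):
    """Return (start, end) pairs for periods the temperature was out of tolerance."""
    out = []
    start = None
    for ts, temp in log:
        if not (LOW <= temp <= HIGH):
            if start is None:
                start = ts           # excursion begins
        elif start is not None:
            out.append((start, ts))  # excursion ends at first in-range reading
            start = None
    if start is not None:
        out.append((start, log[-1][0]))  # still out at end of log
    return out

for start, end in excursions(readings):
    print(f"Out of tolerance from {start} to {end} ({end - start})")
```

Recording when each excursion started and ended is exactly what lets a reviewer judge afterwards whether the deviation was within acceptable limits or compromised the study.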
What do you think an organisation needing to follow these guidelines needs to be aware of?
First and foremost, GLP helps to maintain repeatability – if this is impossible then it’s not a proper experiment.
If you follow GLP and design a process around it, then you can repeat the experiment anywhere in the world and theoretically get the same results, as all testing conditions, equipment and so on are the same. The primary challenge is correctly recording every detail.
When followed, these principles help prevent erroneous results.
06 Sep, 2021