Back in May 2017, my colleague and I wrote an article, Data Integrity Control Strategies – A Matter of Gaining Control. It outlined: a) the importance of data integrity as one of the hottest topics in GMP compliance, and b) how a Data Integrity Control Strategy is a valuable tool for tangibly documenting the controls in place for your GMP systems.
The focus of this article is to reiterate that data integrity is not just a temporary focus for regulatory bodies. I will also share personal experiences of the challenges of identifying the right level of compliance and how they were managed – through understanding the importance of a risk-based approach.
The Drug GMP Report, Issue No. 306, reports that “quality experts and FDA officials predict 2018 will bring more streamlined FDA inspections and a greater focus on digital records and data integrity.”
Paper Records vs Electronic Systems
To put this into perspective, the only way regulatory bodies can tell whether a batch in the field has been appropriately released is from the batch record(s) or data. If there is something questionable about the data, that raises all kinds of questions about product quality and safety.
Today, I would postulate that 90 percent of pharmaceutical companies with a history of approximately 15 years (or more) have a combination of legacy paper-only systems and sophisticated electronic systems. In my experience, most systems actually fall into the category of hybrid systems, whereby electronic data are generated, processed and stored on the system, while paper printouts are defined as the raw data to be shown during regulatory inspections.
The catch-22 in this discussion is that companies feel compelled to move away from paper-based systems to electronic systems in order to increase the quality, reliability and, thanks to technological advancements, in some cases the quantity of data. However, doing so raises the issue of maintaining an appropriate level of data integrity for electronic records.
Understanding Guidance vs Requirements
Data integrity has been on the agenda since 21 CFR Part 11 was issued in 1997. Since then, the literature has grown, covering both the authorities’ interpretation of cGMP requirements and their own dedicated guidance.
The following is a list of the most relevant regulatory literature:
- FDA 21 CFR Part 11
- FDA Guidance for Industry Data Integrity Compliance with cGMP, April 2016
- MHRA GxP Data Integrity Guide, March 2018
- WHO Guidance on Good Data and Record Management Practices, Annex 5, Sept. 2015
- EC Good Manufacturing Practice, Volume 4, Annex 11
With so much literature, it is no doubt a daunting task to make heads or tails of the requirements – which are mandatory and which are guidance. It is important to highlight that all the literature, albeit at times using different terminology, is actually saying the same thing.
For instance, the MHRA’s Guidance for Industry on Data Integrity (2015) defines data integrity as: The extent to which all data are complete, consistent and accurate throughout the data lifecycle.
Whereas the FDA Guidance for Industry Data Integrity Compliance with cGMP has a similar but slightly different definition: Data integrity refers to the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).
It is commonly agreed and accepted that when we talk about data integrity, it’s essential to ensure compliance with the ALCOA principles throughout the data lifecycle, as seen in figures 1 and 2 below:
Even with an understanding of the essence of the data integrity principles, a number of common questions arise when actually talking to system owners, data owners, operators and quality advisors. Below is just a handful of the types of questions I receive:
- Just how much control do I need? Is it the same for all GMP systems? Which systems should we address first?
- We know that a new version of software will give the system more electronic data compliance, but who can do that for us?
- If I have defined my raw data as being paper, does that mean I can ignore the electronic data?
- My system is so old or we cannot find an adequate replacement system to improve the quality of our electronic data. What should I do?
- Do we need to move from a standalone system to a network connection?
- What is the difference between Static and Dynamic data, and does the level of control differ?
- How much ‘metadata’ is required on a paper-based report in order to approve it?
- How do you conduct a review of the audit trail?
Each question in itself is a whole topic of discussion, dependent very much on the type of work the equipment is conducting, the complexity of the system, and the maturity level of the software/hardware/infrastructure in use. Therefore, I will touch on the first question in this article:
Just How Much Control do I Need? Is it the same for all GMP Systems? Which Systems Should we address first?
In order to address the question, we need to discuss the importance of a risk-based approach when we talk about control measures. This means we need to address four important elements:
- The complexity of the system/computerization level
- The risk to patient safety/product quality
- The capability/maturity of the software
- The process flow of the system
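To make the prioritization concrete, the four elements can be combined into a simple scoring model. The following is a minimal, illustrative sketch only – not a validated tool; the element names, the 1–5 scale, the weights, and the example systems are all assumptions for the purpose of illustration:

```python
from dataclasses import dataclass

@dataclass
class SystemAssessment:
    """One GMP system scored 1 (low) to 5 (high) on each risk element."""
    name: str
    complexity: int    # complexity of the system / computerization level
    patient_risk: int  # risk to patient safety / product quality
    software_gap: int  # immaturity of the software (5 = least capable)
    data_points: int   # number of interfaces in the process flow

    def priority_score(self) -> int:
        # Simple additive model; patient risk is weighted double because
        # it drives GMP criticality. Weights are illustrative only.
        return (self.complexity
                + 2 * self.patient_risk
                + self.software_gap
                + self.data_points)

# Hypothetical examples of a simple and a complex system
systems = [
    SystemAssessment("Barcode scanner", complexity=1, patient_risk=2,
                     software_gap=2, data_points=1),
    SystemAssessment("Chromatography data system", complexity=5,
                     patient_risk=5, software_gap=3, data_points=4),
]

# Highest score first: address these systems first.
for s in sorted(systems, key=lambda s: s.priority_score(), reverse=True):
    print(f"{s.name}: {s.priority_score()}")
```

The point of such a sketch is not the particular numbers, but that the ranking is explicit and repeatable – exactly the documented decision-making process an inspector will ask for.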
First, we identify the system’s complexity level. By knowing and understanding its criticality, we can then apply proportionate control measures to secure our systems. Figure 3 below, from the MHRA GMP Data Integrity Definitions and Guidance for Industry, illustrates the spectrum from simple machines on the left to complex computerized systems on the right.
Second, we need to evaluate each system’s risk to patient safety and product quality. Some systems are more directly involved than others in producing data that can affect the patient and the quality of the product. By evaluating this, we can prioritize which systems’ compliance to improve first.
Third, even for a complex system where proportionate control measures are needed, the capability or maturity of that system’s software plays a huge part in what level of control can be achieved in its current state. The maturity level is evaluated against a number of critical parameters for which data integrity requirements are important:
- Presence of an audit trail and the ability to configure the audit trail for desired purpose
- Security – user access control, notification(s), electronic signatures
- Reporting configuration – how much data can we choose to see?
- Integrated functionality for conducting data backup
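The parameters above can be captured as a simple capability checklist, so the maturity evaluation is consistent from system to system. Again, this is an illustrative sketch; the check names and the example legacy system are assumptions, not a standard:

```python
# Hypothetical capability checklist for evaluating software maturity
# against data integrity requirements; check names are illustrative only.
MATURITY_CHECKS = [
    "audit_trail_present",
    "audit_trail_configurable",
    "user_access_control",
    "electronic_signatures",
    "configurable_reporting",
    "integrated_backup",
]

def maturity_level(capabilities: dict) -> float:
    """Fraction of the data integrity capabilities the system supports."""
    passed = sum(bool(capabilities.get(c, False)) for c in MATURITY_CHECKS)
    return passed / len(MATURITY_CHECKS)

# Example: an imagined legacy lab instrument
legacy_instrument = {
    "audit_trail_present": True,
    "audit_trail_configurable": False,
    "user_access_control": True,
    "electronic_signatures": False,
    "configurable_reporting": True,
    "integrated_backup": False,
}
print(f"Maturity: {maturity_level(legacy_instrument):.0%}")  # 3 of 6 checks pass
```

A low score does not automatically mean the system must be replaced; it signals where procedural controls may be needed to compensate for missing technical ones.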
Lastly, the process flow of the system will give an indication to how many ‘data points’ (interfaces in and out of different equipment) there are. The following system is an example of a ‘simple’ data flow:
In the scanner example, you need to ensure your data can be trusted and that mechanisms are in place at each data point to ensure this. The example of a complex data flow below, by contrast, demonstrates that you need to do more to prove your data is reliable and under control, whether through technical or procedural controls.
The following (table 1) shows a simplified sample risk assessment for two different pieces of equipment, using the four elements described above:
Conclusions and Take Away
This article aimed to give an overview of how a risk-based approach using four elements can help your company prioritize which systems to focus on first, and help evaluate the level of control needed for a particular system.
A risk-based approach will always be subjective; however, it will provide your organization with a strong foundation in order to take appropriate decisions.
We all strive to bring our equipment to the right level of compliance. However, in the real world there are many challenges and obstacles that prevent us from creating the perfect solution.
My message is that accepting ‘good’ rather than perfect is perfectly fine, provided you can justify and demonstrate, using the tools above, that a better solution is not technically feasible, not economically practical, or not critical to impact.
During an audit, you need to demonstrate to the regulatory bodies that the quality of your data can be trusted and that, should someone have malicious intent to falsify records, a technical or procedural control is in place to prevent or detect it.
Demonstrating that due diligence, in the form of a formalized risk-based approach, has guided your data compliance decisions is far better than trying to fix everything while showing no formal decision-making process.