Data integrity is the cornerstone of successful clinical research. Without solid integrity, the data collected in clinical trials is worthless. Ensuring data integrity is as much for the research as it is for the patients who will receive treatment based on that research. The Code of Federal Regulations (CFR), parts 210, 211, and 212, addresses the role of data integrity in clinical research. The FDA also released, in April 2016, a “Data Integrity Guidance” to clarify the FDA’s expectations on this topic.
At a minimum, defined standards used during data collection will help to ensure the integrity of the data. The data needs to be complete (free from missing information), reliable, and processed in a consistent manner. Use of best practices throughout the data collection cycle will ensure quality data. The risk is low when best practices are initiated at the start of a clinical trial.
The data generated should be reliable; that is, it should serve the purpose(s) for which it is being collected. For example, with oncology data, where you are tracking progressive disease, your data is worthless if you don’t capture the best response (BR) data.
A key part of managing the quality of the data is ensuring that the CRFs/eCRFs are well designed. A poorly designed CRF/eCRF will lead to inadequate data collection, and hence poor data integrity, because the required information may be missing or inadequate. Also, as part of managing CRF data, it is important to ensure that the data entry is accurate. This is achievable by source data verification at the site(s) where the data is being entered.
Additionally, having solid edit checks programmed into an EDC or proprietary system is essential to having clean, quality data. The edit checks will flag inconsistencies in the data from many different perspectives, such as illogical data, data out of range, missing data, discrepant data, and so forth. Manual checks, in addition to the programmed checks, will also assist the Data Manager in reconciling the data when programmed checks are not feasible. Overall, the edit checks are a warning to the Data Manager that something isn’t right.
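The kinds of programmed edit checks described above can be illustrated with a minimal sketch. The field names, value ranges, and query wording here are assumptions for illustration only, not the conventions of any particular EDC system:

```python
from datetime import date

def check_record(record):
    """Run hypothetical edit checks on one subject record and
    return a list of query messages for the Data Manager."""
    queries = []

    # Missing-data check: required fields must be present
    for field in ("subject_id", "visit_date", "systolic_bp"):
        if record.get(field) is None:
            queries.append(f"{field}: value is missing")

    # Range check: flag physiologically implausible values
    sbp = record.get("systolic_bp")
    if sbp is not None and not (60 <= sbp <= 250):
        queries.append(f"systolic_bp: {sbp} is out of range (60-250)")

    # Logic check: an adverse event cannot resolve before it starts
    start, end = record.get("ae_start"), record.get("ae_end")
    if start and end and end < start:
        queries.append("ae_end: resolution date precedes onset date")

    return queries

record = {
    "subject_id": "S-001",
    "visit_date": date(2023, 5, 1),
    "systolic_bp": 300,                # triggers the range check
    "ae_start": date(2023, 5, 2),
    "ae_end": date(2023, 4, 30),       # triggers the logic check
}
for query in check_record(record):
    print(query)
```

In a real system these checks fire at data entry so the site can correct the value while the source document is at hand, which is far cheaper than reconciling queries months later.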
Proper setup and validation of the EDC system plays a key part in ensuring your data will have the integrity required for acceptance by the FDA. The EDC should only be set up using a final version of the protocol. Using an incomplete or near-final version of the protocol will probably require rework, which may adversely affect how data is captured and may lead to poor quality and compromised integrity of the data. Per the FDA, validation of systems should be done “to ensure accuracy, reliability, consistent intended performance, and the ability to discern invalid or altered records….”
Data integrity can also be checked by reviewing the data in parts and then as a whole. For example, review the Adverse Event data in a spreadsheet for all subjects across a clinical trial. This allows you to see outliers (e.g., protocol violations), trends, etc. Review of all the data for a single subject as a whole, namely the patient profile, is an excellent way to maintain the integrity of the data. It allows you to see inconsistencies across a subject and to correct that data before it becomes final.
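A cross-subject review like the Adverse Event listing above can be sketched as a simple scan over the exported rows. The column names and the specific findings flagged (a grade outside the CTCAE 1-5 scale, a duplicate entry) are assumptions chosen for illustration:

```python
# Hypothetical cross-subject AE review: scan exported rows for
# out-of-scale severity grades and duplicate entries.
rows = [
    {"subject": "S-001", "ae_term": "Nausea",  "grade": 2},
    {"subject": "S-002", "ae_term": "Fatigue", "grade": 7},  # out of scale
    {"subject": "S-003", "ae_term": "Nausea",  "grade": 1},
    {"subject": "S-003", "ae_term": "Nausea",  "grade": 1},  # duplicate
]

seen = set()
findings = []
for row in rows:
    key = (row["subject"], row["ae_term"], row["grade"])
    if key in seen:
        findings.append(f"{row['subject']}: duplicate AE entry '{row['ae_term']}'")
    seen.add(key)
    if not 1 <= row["grade"] <= 5:  # CTCAE grades run 1-5
        findings.append(f"{row['subject']}: grade {row['grade']} outside 1-5")

for finding in findings:
    print(finding)
```

Seeing all subjects side by side is what makes this kind of trend and outlier spotting possible; the same rows reviewed one subject at a time would not reveal the pattern.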
One major area that may affect the integrity of your data is mid-study changes. Hence it is extremely important to avoid making system-wide changes mid-study unless absolutely necessary (e.g., protocol amendments). It is better to think about the possibility of changes when initially designing the study, so that mid-study changes can be avoided.
An excerpt from the Food and Drug Administration (FDA) document, Electronic Source Data in Clinical Investigations and Regulatory Expectations, states: “Electronic data must meet the same fundamental elements of data quality (e.g. attributable, legible, contemporaneous, original, and accurate) and integrity (complete and consistent) expected of paper records. Acceptance of data from clinical trials for decision-making purposes depends on FDA’s ability to verify the quality and integrity of the data.”
In conclusion, there are many ways to maintain the integrity of your data. It is your obligation to ensure that the right steps are taken and documented to ensure data integrity. The key is to think about how you will manage this before study startup begins. You must ensure that you have well-documented processes and procedures. Integrity is easier to maintain when the risk is low, which is usually at the start of the study. Once the study has started, backpedaling will be more tedious and a lot more challenging.