There are new regulations and data repository platforms (e.g. the EU Portal) on the horizon that will affect the data management landscape and how we operate in the future. The newly revised ICH E6 (R2) GCP Addendum and other proposed regulatory changes will significantly impact clinical trial decision making, planning, conduct and monitoring. As data managers, we will need to maximize the use of this new technology by expanding our knowledge and awareness of how these changes will affect building the systems and collaborative relationships needed across organizations.
Establishing and enhancing the training matrices used within and across organizations is essential to the rollout and implementation of these changes. We will need to promote cross-training in the data management techniques used to identify risks, as well as in statistical considerations and concerns. The new regulation will encourage the implementation of improved and more efficient approaches to clinical trial design, conduct, oversight, recording and reporting.
New Regulation Gives Data Managers More Control
Data managers play a pivotal role in successful clinical research programs, ensuring those programs are flexible and adaptable to business needs. Part of this requires us to be collaborative and proactive in identifying new processes and technology that improve efficiency for all clinical trials going forward.
The new EU Portal, currently targeted for 2018, will delineate how we should identify, track and plan for the upcoming regulation, as well as define new technical requirements. It is critical that all key stakeholders communicate frequently and effectively to target the best collaborative plans and approaches for conducting clinical trials.
The revised ICH E6 (R2) GCP requirements will provide a unified standard that will facilitate mutual acceptance of clinical data by regulatory authorities (e.g. EU, Japan, U.S., Canada and Switzerland). It increases efficiency by focusing on what's important and relevant in the application of new technologies. It also guides sponsors on implementing quality management systems (QMS) (as seen in Section 5.0).
The new regulations require us as data managers to help ensure the investigators have control of and access to reported CRF data. This access is also important for clinical operations and monitoring personnel. We will need to think outside the box, and come up with more results-driven technology. One way to include these concepts as we implement a QMS is to manage quality throughout the design, conduct, recording, evaluation, reporting, and archiving of clinical trials.
As data managers, we need to define how we plan to monitor, on an ongoing basis, the delivery of quality at every step of trial activities and data collection/generation. This requires us to look closely at the methods being used to assure the quality of clinical trials, and to make sure that the controls identified are proportionate to the risks. What's clear for sponsors is the need to avoid unnecessary complexity in procedures and in data collection. Our aim should be to manage the risks to the integrity and outcomes of our data. In a previous article, I stated that "without integrity, your data could be worthless." This statement will always be true when designing trials for clinical research. The value of the data is paramount, as the aim is to use it to save lives; ensuring its correctness and accuracy is therefore our most important job.
Focusing on the Issues That Matter
As the head of the data management group at Advaxis, it is my responsibility to ensure that established written procedures are in place for the management of our trials, incorporating the new ICH E6 (R2) regulations. One way to initiate this process would be to establish greater oversight of the CROs' and vendors' electronic systems that we use in our trials.
We need to monitor performance and quality over time and ensure the continued use of validated computer systems, re-reviewing the validation documentation each time a system is updated or changed, along with its performance to date. Change without improvement in outcomes isn't a good use of time or resources, so change control is very important to mitigating risks. We must track and monitor performance, look for trends, assess the reliability of our assumptions (based on data), and implement effective changes in processes and procedures when needed.
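The change-control idea above can be sketched in a few lines of Python. This is a minimal illustration, not any specific EDC platform's API: the `SystemChange` record, its fields and the sample log entries are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical change-control record: every system change should carry
# reviewed validation documentation before release. Field names are
# illustrative assumptions, not tied to a real EDC platform.
@dataclass
class SystemChange:
    system: str
    version: str
    description: str
    validated: bool  # True once validation documentation is reviewed and signed off

def unvalidated_changes(change_log):
    """Return the changes that still lack reviewed validation documentation."""
    return [c for c in change_log if not c.validated]

log = [
    SystemChange("EDC", "4.2", "Edit-check update", True),
    SystemChange("EDC", "4.3", "New CRF module", False),
]
print([c.version for c in unvalidated_changes(log)])  # ['4.3']
```

A report like this, run before every release, makes it easy to block deployment of any change whose validation evidence has not yet been reviewed.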
We need to focus on the issues that matter. This is a process that needs to start early in the development of studies. Many will have different ideas about what these issues are, but from my perspective there are four major areas to explore. The first is most definitely data quality and integrity (A.L.C.O.A.), as now defined in the new GCPs. Focusing on this area includes managing the timeliness of data entry, late entries and errors, as well as QC and source data verification (SDV) of the data. Second is the management of queries. The type, volume and aging of queries will need to be closely managed to ensure timely cleaning and delivery of the data.
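As an illustration of tracking query type, volume and aging, here is a minimal sketch; the query fields, sample dates and age buckets are assumptions for the example, not a real EDC export format.

```python
from datetime import date

# Illustrative open/closed query records; field names are assumptions.
queries = [
    {"kind": "system", "opened": date(2017, 1, 5), "closed": date(2017, 1, 9)},
    {"kind": "manual", "opened": date(2017, 1, 2), "closed": None},
    {"kind": "manual", "opened": date(2017, 1, 20), "closed": None},
]

def aging_report(queries, as_of):
    """Count open queries by age bucket (days since opened)."""
    buckets = {"0-7": 0, "8-30": 0, ">30": 0}
    for q in queries:
        if q["closed"] is not None:
            continue  # closed queries no longer age
        age = (as_of - q["opened"]).days
        if age <= 7:
            buckets["0-7"] += 1
        elif age <= 30:
            buckets["8-30"] += 1
        else:
            buckets[">30"] += 1
    return buckets

print(aging_report(queries, as_of=date(2017, 1, 25)))
# {'0-7': 1, '8-30': 1, '>30': 0}
```

Reviewing a report like this at a regular cadence makes aging queries visible before they threaten database-lock timelines.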
System-generated queries are a good way to see whether the protocol, the data collection design and the assumptions made in the design phase hold up. Our third focus must be the assessment and analysis of the data. We need to identify what can be programmed and what needs to be managed manually. Ideally, having everything programmed would be the best option, but we know that in clinical research a lot of data needs to be confirmed and cross-checked. Because of this need for cross-checking, our role also involves ensuring that only data that are "analyzable" are captured in the database. Finally, we are an important part of the process when the clinical team is defining risk triggers and thresholds, as we will need to implement those triggers and thresholds in the database. By doing this upfront and monitoring performance over time, we have a better chance of filtering out noise and raising the issues that really matter.
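Defining triggers and thresholds upfront, then checking observed metrics against them, can be sketched as follows. The metric names and threshold values here are purely illustrative assumptions; in practice they would come from the clinical team's risk assessment.

```python
# Hypothetical risk triggers agreed upfront with the clinical team.
# Metric names and threshold values are illustrative only.
TRIGGERS = {
    "query_rate_per_100_datapoints": 5.0,
    "late_entry_days_median": 10.0,
    "missing_pages_pct": 2.0,
}

def breached_triggers(observed, triggers=TRIGGERS):
    """Return the observed metrics that exceed their agreed threshold."""
    return {m: v for m, v in observed.items()
            if m in triggers and v > triggers[m]}

observed = {
    "query_rate_per_100_datapoints": 6.2,
    "late_entry_days_median": 4.0,
    "missing_pages_pct": 2.5,
}
print(breached_triggers(observed))
# {'query_rate_per_100_datapoints': 6.2, 'missing_pages_pct': 2.5}
```

Because the thresholds live in one place, revising them as the trial's risk profile evolves is a documented configuration change rather than a rewrite of the monitoring logic.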
No Need to Re-invent the Wheel
The implementation of this new regulation and the GCP requirements has led to new systems-based approaches and the development of best practices that can be applied across the industry (e.g. TransCelerate, AVOCA), making new tools and insights available. Why re-invent the wheel when the information and processes are a keystroke away? These organizations offer tools to help us measure high-risk data management targets, gather metrics and identify threshold triggers. TransCelerate and AVOCA have different but compatible approaches and some great tools available for gathering quantitative metrics. The TransCelerate RACT tool is a great starting point for identifying levels of risk in each area of clinical research, not only in data management.
As the new European regulations, platforms and revised ICH E6 (R2) GCPs are rolled out, I believe that data management and the clinical team will become more integrated, sharing responsibilities and roles for ensuring the following:
- Original source documentation and any edits are auditable in the future (A.L.C.O.A.)
- Requirements are established for the validity, longevity and fidelity of trial data in digital records
- Smooth transitioning from paper systems to digital records
- Documented change controls of updates in digital systems and/or changes in technology
- Avoiding situations where real-time data aggregation and visualization may inadvertently and inappropriately influence trial outcomes early in the trial process
- Mitigating concerns that digital trial databases can obscure unauthorized changes to primary data
Ultimately, we should not be afraid of the new regulations and requirements. Instead, we should embrace them as a step in the right direction toward ensuring that all data, processes and procedures are validated in such a way that the integrity of our data will never be in question. This can offer us all a smoother path to getting data submitted and approved by Regulatory Authorities, and life-saving drugs to patients.
Senior Director, Data Management