Standardizing data collection and transformation in a way that complies with the relevant regulatory requirements is challenging. In light of ever-changing regulations, maintaining SDTM (Study Data Tabulation Model) standards that are flexible and harmonized with industry standards has become vital.

Although technological advancements may ease the burden of supporting internal infrastructure, the constant need for monitoring and oversight remains complicated. When it comes to the everyday struggles of data management, the implications of a minor mistake in your standards can be far-reaching. The time has come to confront the issue at hand.

Noticing even a minor mistake too late may lead to the rejection of your drug application, potentially causing a loss of several million dollars. It is therefore not surprising that every company is trying to figure out how to deal with standards efficiently. While there are several websites offering online tutoring and mentoring, no mentoring can thoroughly prepare you for the hours-long customisation of Seraph, for instance. With this in mind, the real question we should be asking is: how can you ensure the quality of your data in the midst of a trial?

We shall approach this from two perspectives. While the technological side is hugely dependent on the budget the sponsor has available, the human perspective (i.e. your staff) is very much in your hands. After 40 research calls with professional data managers, it has become clear to me that many mistakes are made simply out of ignorance: a lack of understanding of how to apply these standards in real-life situations. This discussion will therefore also look at the different functions involved. And while technologies can be a huge part of the solution, without proper training such technologies are useless.

Even the simplest tools, like a library of standards, may become difficult to work with without a proper understanding of the requirements of your study. Although every ‘book’ in this ‘library’ has its own specifications, each one is chosen to meet the needs of the study. So once the ‘library’ is designed to be compliant with all the standards, one doesn’t need to worry about compliance at the study level. However, one must be aware that the ever-evolving, dynamic nature of data standards may require further review of processes to ensure full compliance with the latest guidelines.
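To make the ‘library’ idea concrete, here is a minimal sketch of how such a library might be modelled in code: each ‘book’ is a domain specification, and a study checks its datasets against only the domains it has chosen. The domain and variable names below are simplified illustrations, not a complete SDTM specification, and `check_compliance` is a hypothetical helper, not part of any real tool.

```python
# Illustrative "library of standards": each 'book' is a domain with its
# own specification. Variable lists are simplified examples only.
STANDARDS_LIBRARY = {
    "DM": {"required": ["STUDYID", "DOMAIN", "USUBJID", "AGE", "SEX"]},
    "AE": {"required": ["STUDYID", "DOMAIN", "USUBJID", "AETERM", "AESTDTC"]},
    "VS": {"required": ["STUDYID", "DOMAIN", "USUBJID", "VSTESTCD", "VSORRES"]},
}

def check_compliance(domain, columns):
    """Return the required variables missing from a dataset's columns."""
    spec = STANDARDS_LIBRARY.get(domain)
    if spec is None:
        raise KeyError(f"No specification in the library for domain {domain!r}")
    return [var for var in spec["required"] if var not in columns]

# A study team picks only the 'books' (domains) its study needs,
# then checks each dataset against the library before submission.
missing = check_compliance("DM", ["STUDYID", "DOMAIN", "USUBJID", "SEX"])
print(missing)  # ['AGE']
```

Because compliance is encoded once in the library, study-level checks reduce to a lookup; keeping the library current as standards evolve is then the single point of maintenance.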

Therefore, it is of the utmost importance to create an internal, systematic structure covering every part of your data handling, from data capture to actual submission. Once a systematic approach is applied to your processes and these are customised to your company's needs, the process can be reapplied as a standard set of processes to achieve efficiency in data handling. As a result, this should also improve the internal communication channels within your company.
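The capture-to-submission chain described above can be sketched as a fixed sequence of stages that, once customised, is reapplied unchanged to every study. The stage names and field mappings below are assumptions for illustration only, not a prescribed process.

```python
# Hypothetical sketch of a reusable capture-to-submission process chain.
# Stage names and the example field mapping are illustrative assumptions.

def capture(raw):
    """Collect raw records as entered (placeholder: pass through)."""
    return list(raw)

def transform(records):
    """Map captured fields to standardized variable names."""
    return [{"USUBJID": r["subject"], "AESEV": r["severity"].upper()}
            for r in records]

def validate(records):
    """Keep only records carrying all required fields before submission."""
    return [r for r in records if "USUBJID" in r and "AESEV" in r]

def pipeline(raw):
    """One company-standard sequence, reapplied study after study."""
    return validate(transform(capture(raw)))

result = pipeline([{"subject": "S-001", "severity": "mild"}])
print(result)  # [{'USUBJID': 'S-001', 'AESEV': 'MILD'}]
```

The value is less in any single stage than in the fixed ordering: because every study flows through the same chain, teams share one vocabulary for where a record is and what has already been checked.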

Since data standards are required for all submissions, it is vital to create consistency and efficiency in your data collection. Industry standards can be vague, so it is critical to seek guidance and implement the right interpretation of them. There are several training courses on industry standards out there, so the only advice here is to ask around. And while understanding the standards is vital, the sustainability of your system of processes and standards will also be questioned several times in the long run. The nature of the industry is simply too complex and ever-changing: once you’ve mastered your current standards and created your system of organizational standards, you must stay involved and remain engaged with standards groups such as CDISC (the Clinical Data Interchange Standards Consortium). And while it may seem like too much of an effort, bear in mind the reason behind it: creating new drugs that help people recover from their illnesses sooner and with fewer side effects.