It’s late. Do you know where your data is?  Are you really sure?  Would you be able to assure a regulatory agency that you fully understand and control the flow of data from data generation through processing, reporting, data review, archival, and retrieval?  If the answer to all of the above questions is yes, congratulations!  Unfortunately, many laboratory, clinical, and manufacturing functions are likely not in such a comfortable state of compliance with current regulatory expectations for data accuracy, reliability, and security.

In the wake of highly publicized FDA 483s and Warning Letters, most companies are making efforts to enhance their compliance with regulatory agency data integrity expectations.  Fortunately, those expectations have been communicated through several guidance documents (in draft or final form) from the FDA, MHRA, EMA, WHO, and PIC/S.

With all the industry effort and available guidance, why do so many companies still struggle to meet data integrity expectations?  As the great basketball coach John Wooden put it, “It’s the little details that are vital. Little things make big things happen.”  From the corner office, the details are not always obvious, and how to address them is even less obvious.  When it comes to data integrity, a few new SOPs and pieces of equipment do little more to ensure success than new team uniforms would.  What is essential is a Data Governance system.

A Data Governance system is defined as “The sum total of arrangements to ensure that data, irrespective of the format in which it is generated, is recorded, processed, retained and used to ensure a complete, consistent and accurate record throughout the data lifecycle.” (MHRA, “GMP Data Integrity Definitions and Guidance for Industry”, March 2015).  The key to data integrity compliance is the “sum total” of the systems and procedures that address the details.

On this point, it is essential that the data flow path for all manufacturing equipment, laboratory instruments, and clinical studies is fully understood and documented in a detailed process flow map.  The data supporting the manufacture and testing (including clinical testing) of pharmaceuticals originates from a vast range of sources within a firm: organoleptic observations, basic gauges, PLC devices, balances, meters, medical instruments, and standalone computerized equipment, on up to server-based networked applications.  Recently, several equipment vendors have even announced “Cloud” applications.  It is unfathomable to think that a few SOPs alone can address the details this diversity presents.
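As an illustration only, the sketch below shows one way a single instrument’s data flow path might be captured in a structured form before it is drawn up as a formal process flow map.  The system names, transfer mechanisms, and record formats are hypothetical and are not drawn from any specific guidance or product.

```python
from dataclasses import dataclass

@dataclass
class DataFlowStep:
    """One hop in a data flow path: where data moves, how, and in what form."""
    source: str           # system or person that holds the data before the hop
    destination: str      # where the data goes next
    mechanism: str        # e.g., automated acquisition, manual transcription
    record_format: str    # e.g., electronic with audit trail, paper printout

# Hypothetical flow path for a standalone chromatography data system (CDS)
hplc_flow_path = [
    DataFlowStep("HPLC instrument", "CDS workstation",
                 "automated acquisition", "electronic, audit-trailed"),
    DataFlowStep("CDS workstation", "network share",
                 "scheduled backup", "electronic"),
    DataFlowStep("CDS workstation", "batch record",
                 "manual transcription", "paper"),
    DataFlowStep("network share", "long-term archive",
                 "manual copy", "electronic"),
]
```

Listing each hop this way, for every data source in the facility, is what makes the diversity described above visible and manageable.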

The PIC/S guidance states “Risk assessments should focus on a business process (e.g. production, QC, clinical studies), evaluate data flows and the methods of generating data, and not just consider IT system functionality or complexity.”

Only by documenting all sources of data and preparing detailed data flow path maps for every data source will data integrity gaps become apparent.  It is almost certain that flow mapping a data path will reveal additional gaps that pose potential risks to the integrity of the data, and until those risks are identified and remediated, they remain.  Once the remediation steps are complete, having detailed data flow maps readily available greatly facilitates assuring regulatory agencies that your procedures and systems ensure the accuracy, reliability, and security of your GxP data.
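Once flow paths are documented in a structured form, even simple screening rules can help surface the kinds of gaps described above.  The sketch below is illustrative only: the rules (flagging manual steps, electronic records without an audit trail, and flow paths with no archival endpoint) are examples of the sort of checks a firm might define for itself, not requirements from any guidance, and the DataFlowStep record mirrors the earlier hypothetical sketch.

```python
from dataclasses import dataclass

@dataclass
class DataFlowStep:
    # Same record shape as the earlier sketch
    source: str
    destination: str
    mechanism: str        # e.g., automated transfer, manual transcription
    record_format: str    # e.g., electronic with audit trail, paper printout

def flag_potential_gaps(flow_path):
    """Apply simple, illustrative screening rules to a documented flow path."""
    findings = []
    for step in flow_path:
        if "manual" in step.mechanism.lower():
            findings.append(f"{step.source} -> {step.destination}: manual step; "
                            "confirm a second-person check or technical control")
        if ("electronic" in step.record_format.lower()
                and "audit" not in step.record_format.lower()):
            findings.append(f"{step.source} -> {step.destination}: electronic "
                            "record with no documented audit trail")
    if not any("archive" in step.destination.lower() for step in flow_path):
        findings.append("Flow path has no documented archival endpoint")
    return findings

# Example: screen the hypothetical HPLC flow path from the earlier sketch
# print(flag_potential_gaps(hplc_flow_path))
```

Findings like these then feed directly into the risk assessment and remediation activities described above.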

For further information and assistance on the topic of Data Flow Mapping and the importance of taking a proactive approach, please contact either R. George, Ph.D. at r.george@LachmanConsultants.com or J. Davidson, Ph.D. at j.davidson@LachmanConsultants.com.


References:

  • PIC/S Draft Guidance, “Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments”, August 2016