Since the initial April 2016 draft was published, the pharmaceutical industry has been working toward compliance with the FDA guidance on Data Integrity and Compliance With Drug CGMP, which was finalized in December 2018.  One passage in the introduction of the guidance has resonated with me: “Management’s involvement in and influence on these strategies is essential in preventing and correcting conditions that can lead to data integrity problems.  It is the role of management with executive responsibility to create a quality culture where employees understand that data integrity is an organizational core value and employees are encouraged to identify and promptly report data integrity issues. In the absence of management’s support of a quality culture, quality systems can break down and lead to CGMP noncompliance.”  This passage directs company management to take responsibility for creating a quality culture, which, in turn, should result in robust laboratory documentation processes and practices that assure the integrity of laboratory data.  This is not a new concept.  For decades, hardcopy laboratory data documented in bound notebooks was reviewed, and any discrepancy in the data was easily detectable and could be questioned.  However, even with hardcopy data, the validity of the data was directly impacted by the integrity of the bench chemist and by laboratory management’s commitment to ensuring that complete and accurate data were recorded.

With the introduction of computerized chromatographic data acquisition systems in the 1980s and 1990s, a conundrum was created: was the printout of the data the raw data, or was the electronic file?  If the raw data was the electronic file, was that file secure from being easily altered?  In some systems, files could be overwritten, renamed, etc., without any audit trail documenting the action.  Thus, assurance of the integrity of the laboratory data was weakened unless laboratories added configurational or procedural controls over the data acquisition systems.

Fast forward to the present day: most commercial off-the-shelf software associated with computerized laboratory systems can be configured to address the majority of data integrity gaps present in earlier software.  However, even with the improved data security of current systems, “Part 11 compliant” software is not the panacea that laboratory management would like it to be.  While up-to-date systems aid in the prevention and detection of data loss and common record manipulation, a vigilant, thorough, and comprehensive review of audit trails for unaccounted-for extra data, unapproved reprocessing, and other causes of data integrity breaches is still needed.

If we think about it, nothing related to the expectation for data to be complete and accurate has actually changed over the past decades.  If company management does not fully commit to a quality culture in which data integrity is a paramount core value, and reflect that commitment in its actions, the resulting data integrity compliance practices will likely not meet regulatory expectations.  The commitment to a quality culture from the top down is key.  Compliant instrumentation, procedures, and practices will also give laboratory analysts pride in knowing that their data is beyond reproach and will pass the most rigorous of audits.

For further information and assistance on the topic of laboratory data integrity, please contact Tim Rhines, Ph.D. at t.rhines@lachmanconsultants.com.