Processing of laboratory instrumentation output, such as the processing of chromatographic raw data, continues to be a focus of global health agency investigators. This is not surprising, as investigators are aware of the Data Integrity (DI) risks inherent in such a manual operation: the potential to selectively adjust integration settings, to “preview” results, to integrate manually, and so on, creates the opportunity to process raw data to achieve a desired result rather than an accurate finding. So, what are the best practices for the QC lab? Raw data must, of course, be processed in order to generate the reported result; the key is to know and understand the risks as they relate to raw data processing for that specific method / system, to implement appropriate Data Governance (DG) controls (preventive and detective) to address those risks, and to continually monitor their effectiveness via Quality Oversight.

A firm should develop and validate robust analytical test procedures that include a specific processing method with defined raw data processing settings. As part of approving the newly validated processing method, the firm should approve and “lock” that processing method on the system so that any adjustment from what was validated requires Quality Unit approval with documented rationale (via change management) before the system administrator may modify the electronic processing method.

It is recommended that the analytical method’s processing conditions be considered at each stage of the analytical lifecycle, so that when defining a method’s Analytical Target Profile (ATP) there is consideration of the potential requirements and challenges for the processing method (to achieve the ATP). Firms need to understand that, when demonstrating the suitability of the analytical test procedure, equal importance must be given to the instrument settings associated with processing the raw data and to those that acquire the raw data. Typically, the focus during method development and validation is on the latter, resulting in validated data acquisition settings being defined in the test procedure while the processing method is not always developed, validated, and defined in the test procedure, and is instead set by the analyst based on the needs of that analysis (hence the DI risk). However, it is clear (from Agency citations and the guidance documents) that both need to be considered with equal importance and are fundamental to analytical lifecycle management. Therefore, during method development and validation, the firm needs to define the processing method with an understanding of the impact that adjustments to the associated processing settings have on the accuracy of the reported results, so that ranges can be set where appropriate.

The analytical method’s control strategy should demonstrate the suitability of the established processing method based upon the results of the method’s system and sample suitability. It is therefore paramount that the design of the method’s system and sample suitability is sufficiently robust to confirm acceptability of the processing method, so that sample release test data (generated with the method’s processing method) can be reported with confidence. Method development should therefore define the requirements of the reference standards; for example, a highly purified standard may be appropriate to establish the sequence response factor (for sample quantification), but it may not be appropriate to confirm suitability of the processing method, where a more complex reference profile is needed, such as that associated with a force-degraded sample.

The firm’s quality system should require that when a system / sample suitability requirement has been exceeded, an investigation is initiated to determine the cause. The temptation (given the aforementioned DI risks with raw data processing) is for the analyst to simply adjust the processing method to afford a passing result; however, such failures of system/sample suitability can arise from various scenarios, such as system performance, set-up errors, or degradation of the reference standard, and, for sample suitability, the failure may reflect a change in the material attributes arising from manufacturing process change or drift. Hence the need to investigate and determine the cause; based upon the root cause, the outcome of the investigation may be that the processing method needs to be modified via the firm’s change control process.

If you have any questions relating to your site’s Analytical Method Lifecycle Management and Data Governance Programs, please contact Paul Mason at p.mason@lachmanconsultants.com.