Trust, but Verify (Continuously)

Sept. 25, 2012
FDA’s Process Validation guidance calls for continuous process verification. Here’s how to do it, and how automation can help.

In January 2011, the FDA issued its Guidance for Industry, Process Validation: General Principles and Practices. The document will affect how pharmaceutical manufacturers operate and presents them with both challenges and opportunities.

The Guidance document, in its own words, “aligns process validation activities with a product lifecycle concept and with existing FDA guidance, including the FDA/International Conference on Harmonization (ICH) guidances for industry, Q8(R2) Pharmaceutical Development, Q9 Quality Risk Management, and Q10 Pharmaceutical Quality System. Although this guidance does not repeat the concepts and principles explained in those guidances, FDA encourages the use of modern pharmaceutical development concepts, quality risk management, and quality systems at all stages of the manufacturing process lifecycle.”

The document defines process validation as “the collection and evaluation of data, from the process design stage through commercial production, which establishes scientific evidence that a process is capable of consistently delivering quality product.” Validation activities are broken into three stages: Process Design, Process Qualification and Continued Process Verification.

This article will discuss the third stage, Continued Process Verification (CPV), the challenges and opportunities that this concept creates for pharmaceutical companies, and what this means to pharmaceutical manufacturers on a practical, day-to-day basis.

The guidelines identify three key program elements for CPV:

1. A system or systems for detecting unplanned departures from normal operation of the process, designed to correct, anticipate and prevent problems.
2. An ongoing program to collect and analyze product and process data that relate to product quality, including evaluation of intra-batch and inter-batch variation. This data “should include relevant process trends and quality of incoming materials or components, in-process material and finished products. The data should be statistically trended and reviewed by trained personnel. The information collected should verify that the quality attributes are being appropriately controlled throughout the process.” (One way to trend intra- and inter-batch variation statistically is sketched after this list.)
3. Maintenance of the facility, utility and equipment qualification status: “Once established,” say the Guidelines, “qualification status must be maintained through routine monitoring, maintenance, and calibration procedures and schedules.”
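
As a minimal sketch of the trending called for in item 2, assume per-batch in-process measurements of a single quality attribute have already been exported from the historian or batch record (the batch IDs and values below are hypothetical). The two sources of variation can then be separated like this:

```python
# Minimal sketch: trending intra-batch vs. inter-batch variation for one
# quality attribute. Batch IDs and measurement values are hypothetical.
import statistics

# Each batch contributes several in-process measurements of the attribute.
batches = {
    "B-1001": [7.01, 7.03, 6.98, 7.02],
    "B-1002": [7.10, 7.08, 7.12, 7.09],
    "B-1003": [6.95, 6.97, 6.96, 6.99],
}

# Intra-batch variation: average of the within-batch standard deviations.
intra = statistics.mean(statistics.stdev(values) for values in batches.values())

# Inter-batch variation: standard deviation of the batch means.
batch_means = [statistics.mean(values) for values in batches.values()]
inter = statistics.stdev(batch_means)

print(f"Pooled intra-batch std dev: {intra:.4f}")
print(f"Inter-batch std dev of batch means: {inter:.4f}")
```

In practice these figures would be computed automatically from historian or EBR data, plotted over time, and reviewed by trained personnel as part of the periodic trend report.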

Alignment with ICH Q10
ICH Q10, Pharmaceutical Quality System (June 2008) is a tripartite guideline that describes “a model for an effective quality management system for the pharmaceutical industry, referred to as the Pharmaceutical Quality System.” One of its major objectives is to establish and maintain a state of control: “To develop and use effective monitoring and control systems for process performance and product quality, thereby providing assurance of continued suitability and capability of processes.” To a large extent ICH Q10 embodies existing regional GMP requirements. Its coverage extends from development to manufacturing to product discontinuation.
Several key elements of ICH Q10 align with Continued Process Verification as it relates to the “Manufacturing” stage. Two of the most important are:

  • Knowledge Management, which ICH Q10 considers one of its two key enablers (the other being Risk Management) and is defined as a “systematic approach to acquiring, analyzing, storing and disseminating information related to products, processes and components.” The data historians found in modern distributed control systems (DCSs), often in combination with an electronic batch record system, perform this function.
  • Continual improvement of process performance and product quality, described in Section 3 of ICH Q10. Regarding the commercial manufacturing stage of the product lifecycle, ICH Q10 states “the pharmaceutical quality system should assure that the desired product quality is routinely met, suitable process performance is achieved, the set of controls are appropriate, improvement opportunities are identified and evaluated, and the body of knowledge is continually expanded.”

To meet many, if not most, of the goals of these guidelines, data must be collected consistently and be easily accessible to users. These goals can be achieved using current automation and business technologies, including distributed control systems (DCSs) and manufacturing execution systems (MES). For example:

  • Handling unplanned departures — A modern DCS has an alarm strategy and can be configured with preplanned actions to respond to unwanted changes in continuous data (e.g. temperature of a bioreactor). The system can provide exception reporting through its event/history log. Use of the DCS in combination with electronic batch records (EBR) would provide more granularity to exception reporting through viewing exceptions within context of the batch. Use of EBR in an integrated MES would provide the ability to reconcile and/or integrate data with separate deviation management systems. And finally, Process Analytical Technology (PAT) in-line instrumentation and modeling can be incorporated into the automation strategy to respond to and correct process deviations in real time.
  • Process monitoring — A DCS collects a considerable amount of process data on an ongoing basis, including critical process parameters from moment-to-moment loop control operations, which can be reported as trends. Additional process data may be generated outside of Manufacturing through Quality Control testing.

Effective monitoring of the process requires both sets of data. Integration of an EBR with Laboratory Information Management Systems (LIMS) would provide a central repository for data when test results are provided back to the batch record. Potential delays in obtaining laboratory data should be considered as they affect the ability to respond to results in real time. Identification of PAT with online testing in key areas would contribute to the success of continued process verification where a response can occur in real time.

A final data set is required for raw material information, from sources such as supplier certificates of analysis (COAs) and/or site release testing. If this data is collected in an ERP and/or LIMS system, it can also be integrated into a central repository to provide ease of use for process analysis.
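
As a minimal sketch of such a central repository, assume the batch record system, LIMS and ERP can each export results keyed by the same batch identifier (all field names and values below are hypothetical):

```python
# Minimal sketch: merging batch-record process summaries, LIMS test results and
# raw material COA data into a single per-batch view for continued process
# verification. Batch IDs, field names and values are all hypothetical.

process_summary = {  # e.g., exported from the historian / electronic batch record
    "B-1001": {"peak_temp_C": 37.2, "agitation_rpm": 120},
    "B-1002": {"peak_temp_C": 37.6, "agitation_rpm": 118},
}

lims_results = {  # e.g., exported from LIMS, keyed by the same batch ID
    "B-1001": {"assay_pct": 99.1},
    "B-1002": {"assay_pct": 98.7},
}

raw_material_coa = {  # e.g., key raw material attribute captured in ERP/LIMS
    "B-1001": {"api_lot_purity_pct": 99.8},
    "B-1002": {"api_lot_purity_pct": 99.6},
}

# Central repository view: one record per batch, combining all three sources.
cpv_view = {
    batch: {
        **process_summary.get(batch, {}),
        **lims_results.get(batch, {}),
        **raw_material_coa.get(batch, {}),
    }
    for batch in sorted(set(process_summary) | set(lims_results) | set(raw_material_coa))
}

for batch, record in cpv_view.items():
    print(batch, record)
```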

  • Maintenance of facility, utility and equipment — The data handled by a DCS can be used to monitor the health of the equipment and process. Alarm records and key utility and equipment performance parameters, as well as data from intelligent field devices, can be collected and analyzed. All of this data can be correlated with asset management and MES systems to aid in predictive/proactive maintenance.
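
As a minimal sketch of the predictive-maintenance idea, assume a baseline value for an equipment performance parameter was established at qualification (the parameter, readings and threshold below are hypothetical). Drift can then be flagged for the asset management system like this:

```python
# Minimal sketch: flagging drift in an equipment performance parameter so a
# maintenance work order can be raised proactively. Values are hypothetical.
import statistics

baseline_pressure = 2.50   # bar, established at equipment qualification
drift_limit = 0.10         # bar of allowable drift before intervention

recent_readings = [2.58, 2.61, 2.63, 2.66, 2.68]  # latest historian samples

recent_mean = statistics.mean(recent_readings)
if abs(recent_mean - baseline_pressure) > drift_limit:
    print(f"Drift detected: recent mean {recent_mean:.2f} bar vs baseline "
          f"{baseline_pressure:.2f} bar; notify the asset management system")
```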

As explained above, facilities that use modern distributed control systems already have access to the data required for Continued Process Verification. But in many cases these systems collect an enormous amount of data that is useful in some respects but superfluous or irrelevant in others. How can management decide what is important?

Getting the control strategy right
The first step is to make sure that the process control strategy for the facility is appropriate, and that process, reporting and quality parameters are identified and assigned. These include input parameters, such as equipment settings (e.g., agitation rate), and output parameters (e.g., the pH achieved as a result of that agitation rate). How critical is each of these? Not all parameters are critical. The assignment of Critical Quality Attributes should be based on a careful risk assessment, with risk to the patient as the focus.

With the key parameters identified and the critical ones called out, it’s time to define the statistical process control ranges based on live data (enough batches to provide a sample size that gives assurance of control). Control ranges should be tighter than acceptance criteria.
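
As a minimal sketch, assuming a set of historical batch results for one quality attribute and its registered acceptance criteria (all values below are hypothetical), the control range can be derived and checked like this:

```python
# Minimal sketch: deriving a statistical control range for a quality attribute
# from historical batch results and confirming it sits inside the acceptance
# criteria. Attribute, results and specification limits are hypothetical.
import statistics

batch_results = [99.1, 98.7, 99.4, 99.0, 98.9, 99.2, 99.3, 98.8, 99.1, 99.0]
acceptance_low, acceptance_high = 95.0, 105.0  # registered specification

mean = statistics.mean(batch_results)
sd = statistics.stdev(batch_results)
control_low, control_high = mean - 3 * sd, mean + 3 * sd

print(f"Control range: {control_low:.2f} to {control_high:.2f}")

# Control ranges should be tighter than (sit inside) the acceptance criteria.
assert acceptance_low < control_low and control_high < acceptance_high
```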

In those key areas where analytical methodologies already exist, the next step is to implement PAT methods incrementally. Note that much more data will be generated from PAT methods than from manual laboratory methods: where there may have been only one laboratory data point for a particular attribute, there might be hundreds of data points with a PAT method. What is the appropriate response to that, and what is the appropriate sample frequency?
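
One hedged approach, assuming the PAT signal arrives as a per-batch time series (the readings and control limits below are hypothetical), is to condense the hundreds of in-line measurements into a per-batch summary while still flagging every individual excursion:

```python
# Minimal sketch: condensing a high-frequency PAT signal (hundreds of points
# per batch) into a per-batch summary and flagging individual excursions.
# Signal values and control limits are hypothetical.
import statistics

pat_signal = [7.00 + 0.01 * ((i * 7) % 5) for i in range(300)]  # in-line readings
control_low, control_high = 6.95, 7.10

summary = {
    "n": len(pat_signal),
    "mean": round(statistics.mean(pat_signal), 3),
    "stdev": round(statistics.stdev(pat_signal), 3),
    "min": min(pat_signal),
    "max": max(pat_signal),
}
excursions = [x for x in pat_signal if not (control_low <= x <= control_high)]

print(summary)
print(f"{len(excursions)} readings outside the control range")
```

The summary feeds the periodic trend report, while the excursion check can drive a real-time response.
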
The quality of the raw materials also has an effect on product quality. What are the key attributes, whether received from vendor testing on a certificate of analysis or tested on-site as part of the release process? It’s important to evaluate current data collection methods to ensure these key attributes are funneled to the appropriate place for analysis.

Deviation monitoring
Earlier we mentioned deviation monitoring systems. It’s important to make sure that deviation monitoring is kept in a single system, rather than maintaining bits of data scattered across systems. Keeping all the data together can help to ensure that all excursions are identified and that the associated data can be trended, so that reports can be generated from a single source. This requires a system with the flexibility to report issues and investigate. The decision to launch an investigation should be predicated on potential impact.

The deviation monitoring system should include the appropriate data fields and reporting mechanisms to reduce the effort expended on data mining. The system should be made available to the plant’s MES to allow for real-time documentation and reconciliation of issues as manufacturing occurs. To make this possible, it’s important to evaluate the deviation process workflow. Does the system require a minimum number of fields to be populated before a record can be saved (and therefore receive an identification number)? If so, are those fields appropriate for receiving data from an MES so that reconciliation can succeed?
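
As a minimal sketch of what that might look like, the record below shows the kind of fields that could be required before a deviation is saved and issued an identification number, so that the MES can reconcile it against the batch record (all field names and values are hypothetical):

```python
# Minimal sketch: the fields a deviation record might require before it can be
# saved and receive an ID, so the MES can reconcile it against the batch
# record. All field names and values are hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DeviationRecord:
    batch_id: str          # links the deviation to the batch record in the MES
    equipment_id: str      # where the excursion occurred
    parameter: str         # e.g., "bioreactor temperature"
    observed_value: float
    detected_at: datetime
    description: str

record = DeviationRecord(
    batch_id="B-1002",
    equipment_id="BR-01",
    parameter="bioreactor temperature",
    observed_value=38.4,
    detected_at=datetime(2012, 9, 25, 14, 30),
    description="Temperature excursion above the control range during fed-batch phase",
)
print(record)
```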

Program definition and procedures
Many companies already have strong process monitoring initiatives, but not all of them tie back to the validation program and have the appropriate quality oversight; the goal is to combine process monitoring data with the validation program to comply with CPV requirements and derive business and process benefits. The CPV program should be maintained in alignment with the master validation plans for each site and/or process. Note again, this is recommended for products after the initial validation has been done, so it can be implemented for a company’s current commercial products once the appropriate reporting strategy is developed. In addition, if the output of the monitoring program is currently provided only to the technical staff, the audience list should be updated and a formal review program instituted to include the Quality Unit and other validation roles. The CPV program then becomes a robust mechanism for identification of improvements and obtaining collective agreement on implementation of changes.

The formal program should identify the frequency for reporting and identify the mechanisms for investigation and follow-up. This should include the definition of the audience for reporting, as well as oversight by the Quality Unit. Typically, the process, roles and responsibilities would be defined in a Standard Operating Procedure under GMP requirements to ensure consistent process and training.

Where real-time data from DCS, MES and/or PAT cannot be employed, or a company is not ready to make that leap, it is still possible to achieve CPV by incorporating laboratory data into the program. The frequency of reporting may differ based on the lag time to receive results, but analysis and improvement opportunities can still be identified on an ongoing basis rather than through traditional batch review at release, with additional benefit derived when data is evaluated batch to batch instead of only for a single batch.

In short, the drivers behind FDA’s most recent process validation guidance represent good science and facilitate continuous improvement on the part of suppliers; both are critically important to the needs of patients and to the reputation of the industry. The guidance challenges pharmaceutical manufacturers to achieve Continued Process Verification.

About the Author

Heather Schwalje | Emerson Process Management