In January 2011, the FDA issued its Guidance for Industry, Process Validation: General Principles and Practices. This document will affect how pharmaceutical manufacturers operate, presenting them with both challenges and opportunities.
The Guidance document, in its own words, “aligns process validation activities with a product lifecycle concept and with existing FDA guidance, including the FDA/International Conference on Harmonization (ICH) guidances for industry, Q8(R2) Pharmaceutical Development, Q9 Quality Risk Management, and Q10 Pharmaceutical Quality System. Although this guidance does not repeat the concepts and principles explained in those guidances, FDA encourages the use of modern pharmaceutical development concepts, quality risk management, and quality systems at all stages of the manufacturing process lifecycle.”
The document defines process validation as “the collection and evaluation of data, from the process design stage through commercial production, which establishes scientific evidence that a process is capable of consistently delivering quality product.” Validation activities are broken into three stages: Process Design, Process Qualification and Continued Process Verification.
This article will discuss the third stage, Continued Process Verification (CPV), the challenges and opportunities that this concept creates for pharmaceutical companies, and what this means to pharmaceutical manufacturers on a practical, day-to-day basis.
The guidelines identify three key program elements for CPV:
1. A system or systems for detecting unplanned departures from the process as designed, in order to correct, anticipate and prevent problems.
2. An ongoing program to collect and analyze product and process data that relate to product quality, including evaluation of intra-batch and inter-batch variation. This data “should include relevant process trends and quality of incoming materials or components, in-process material and finished products. The data should be statistically trended and reviewed by trained personnel. The information collected should verify that the quality attributes are being appropriately controlled throughout the process.”
3. Maintenance of the facility, utility and equipment qualification status: “Once established,” say the Guidelines, “qualification status must be maintained through routine monitoring, maintenance, and calibration procedures and schedules.”
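The statistical trending called for in the second element is commonly implemented with control charts over batch release data. As a minimal sketch only (the guidance does not prescribe a specific method), the following computes 3-sigma limits from historical batch results and flags new batches that fall outside them; the batch IDs, assay values, and function names are hypothetical.

```python
# Minimal inter-batch trending sketch: a Shewhart individuals chart.
# All batch data and limits below are hypothetical illustrations.

def control_limits(history):
    """Return (mean, lower, upper) 3-sigma limits from historical batches."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / (n - 1)  # sample variance
    sigma = variance ** 0.5
    return mean, mean - 3 * sigma, mean + 3 * sigma

def flag_excursions(batches, lower, upper):
    """Return IDs of batches whose result falls outside the control limits."""
    return [bid for bid, value in batches if not (lower <= value <= upper)]

# Historical assay results (% label claim) from qualified batches.
history = [99.8, 100.1, 99.9, 100.3, 99.7, 100.0, 100.2, 99.9]
mean, lcl, ucl = control_limits(history)

# New batches trended against the established limits.
new_batches = [("B-101", 100.1), ("B-102", 101.2), ("B-103", 99.8)]
print(flag_excursions(new_batches, lcl, ucl))  # → ['B-102']
```

In practice the same idea extends to intra-batch variation (trending points within a run) and to more sensitive rules than a single 3-sigma test, but the principle of comparing current data against limits derived from historical performance is the same.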
Alignment with ICH Q10
ICH Q10, Pharmaceutical Quality System (June 2008) is a tripartite guideline that describes “a model for an effective quality management system for the pharmaceutical industry, referred to as the Pharmaceutical Quality System.” One of its major objectives is to establish and maintain a state of control: “To develop and use effective monitoring and control systems for process performance and product quality, thereby providing assurance of continued suitability and capability of processes.” To a large extent ICH Q10 embodies existing regional GMP requirements. Its coverage extends from development to manufacturing to product discontinuation.
Several key elements of ICH Q10 align with Continued Process Verification as it relates to the “Manufacturing” stage. Two of the most important are:
- Knowledge Management, which ICH Q10 considers one of its two key enablers (the other being Risk Management) and defines as a “systematic approach to acquiring, analyzing, storing and disseminating information related to products, processes and components.” The data historians found in modern distributed control systems (DCSs), often in combination with an electronic batch record system, perform this function.
- Continual improvement of process performance and product quality, described in Section 3 of ICH Q10. Regarding the commercial manufacturing stage of the product lifecycle, ICH Q10 states “the pharmaceutical quality system should assure that the desired product quality is routinely met, suitable process performance is achieved, the set of controls are appropriate, improvement opportunities are identified and evaluated, and the body of knowledge is continually expanded.”
To meet many, if not most, of the goals of these guidelines, data must be collected consistently and made easily accessible to users. These goals can be achieved with current automation and business technologies, including Distributed Control Systems (DCS) and Manufacturing Execution Systems (MES). For example:
- Handling unplanned departures — A modern DCS has an alarm strategy and can be configured with preplanned actions that respond to unwanted changes in continuous data (e.g., the temperature of a bioreactor). The system can provide exception reporting through its event/history log. Using the DCS in combination with electronic batch records (EBR) adds granularity to exception reporting by presenting exceptions within the context of the batch. An EBR in an integrated MES provides the ability to reconcile and/or integrate data with separate deviation management systems. Finally, Process Analytical Technology (PAT) in-line instrumentation and modeling can be incorporated into the automation strategy to respond to and correct process deviations in real time.
- Process monitoring — A DCS continuously collects a considerable amount of process data, including critical process parameters from moment-to-moment loop control operations, which it reports as trends. Additional process data may be generated outside of Manufacturing through Quality Control testing.
Effective monitoring of the process requires both sets of data. Integrating an EBR with a Laboratory Information Management System (LIMS) provides a central repository when test results are returned to the batch record. Potential delays in obtaining laboratory data should be considered, since they affect the ability to respond to results in real time. Identifying key areas where PAT can provide online testing would contribute to the success of continued process verification by enabling a real-time response.
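The DCS alarm behavior described above — comparing a continuous signal against a configured operating band and logging excursions as events — can be sketched in a few lines. This is an illustration only, not a description of any particular DCS; the tag, band limits, and sample values are hypothetical.

```python
# Sketch of a high/low alarm scan over continuous data (e.g., bioreactor
# temperature), mimicking exception reporting from a DCS event log.
# The operating band and sample values below are hypothetical.

from datetime import datetime, timedelta

def scan_for_alarms(samples, low, high):
    """Return (timestamp, value, state) events where the value leaves
    the configured operating band."""
    events = []
    for ts, value in samples:
        if value > high:
            events.append((ts, value, "HIGH"))
        elif value < low:
            events.append((ts, value, "LOW"))
    return events

# Temperature readings sampled every 10 minutes.
start = datetime(2011, 1, 24, 8, 0)
samples = [(start + timedelta(minutes=10 * i), t)
           for i, t in enumerate([36.9, 37.0, 37.8, 37.1, 36.4])]

# Hypothetical operating band: 36.5-37.5 degrees C.
for ts, value, state in scan_for_alarms(samples, low=36.5, high=37.5):
    print(f"{ts:%H:%M}  {value:.1f} C  {state}")
```

A real DCS would act on such events through its preplanned responses and record them in the batch context; this sketch shows only the detection and logging step.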