Abbott's Bridge to ASTM Risk-Based Commissioning and Qualification

May 12, 2010
A gradual transition to better C&Q practices is warranted, as is "getting rid of wasteful, bureaucratic practices that have developed over the years," says Abbott's David Dolgin.

Science- and risk-based concepts can and should be applied to commissioning and qualification, but it’s not realistic for manufacturers to make the leap to a completely revamped process, said David Dolgin, Abbott’s senior quality program manager, speaking at Interphex 2010 in New York City in April. In the 30-plus years that Dolgin has been in the industry, the last two or three represent a sea-change in terms of the global regulatory environment, he said, which has enabled Abbott to pursue C&Q differently than in the past.

Namely, Dolgin espouses a hybrid approach to C&Q, which he defines as the verification of systems and facilities that combines the principles of ICH Q9 and ASTM E-2500, “implemented with selected, traditional Pharmaceutical Quality System mechanisms.” In other words, it is a marriage of the old and new.

ASTM E-2500, Dolgin reminded the audience, was the first consensus standard to apply a science- and risk-based approach to verification. “Verification” is an umbrella term taken from regulation, and is used to describe both commissioning and qualification, he noted. “ASTM does away with IQ and OQ as mechanisms and goes straight from commissioning to PQ,” he said. Most importantly, it redefines Quality and Engineering roles—that is, “QA is assurance not control . . . an audit role, not a direct management or execution role.”

ASTM streamlines C&Q and puts the onus on engineers to do their jobs, Dolgin noted: “Bad engineers cannot be made better by adding more QA personnel that are not qualified engineers.”

ICH Q9’s role is to clearly lay out the expectations that FDA and other bodies have today regarding risk assessment and related control strategies. “Some inspectors are actually interested more in the risk assessment and control strategies than in the actual validation package itself,” he said, “which is certainly different than 10 or 15 years ago.”

Transitioning to Science and Risk

There’s a clear need in the industry for change, Dolgin said, to be efficient and lean, and thus to adhere to science- and risk-based approaches. This new approach allows “greater facilitation and fostering of innovation.” That attitude of “we can’t change something because it’s validated . . . is a very anti-progress kind of approach.”

On the other hand, Dolgin said, it’s not reasonable for manufacturers to adopt a pure ASTM-oriented approach, or an ASTM/ICH one. Why? The simple answer, Dolgin says, is that it costs money to change. Abbott, for example, “has had its quality systems and structures in place for a very long time, and there are differences in corporate cultures and capabilities.” For smaller and less experienced organizations especially, “it may be necessary to use the quality-dominated process that we have used traditionally.”

Dolgin is co-leading the development of an ISPE guide for “transitional activity” in moving to new C&Q approaches. “No one can justify costs of change just to change,” he said, and thus a gradual transition to better practices is warranted. “It’s not getting rid of IQ or OQ as terms,” he said, “it’s getting rid of wasteful, bureaucratic practices that have developed over the years in your company.”

Within industry “there is a growing expectation for risk-based approaches—and these approaches can have a positive impact on lifecycle costs!” he said. “It’s 2010, not 2001. Traditional approaches need a risk-based update.” Thus the ISPE’s vision is not an overhaul of existing C&Q practices, but a “bridge document” to science- and risk-based practices.

C&Q Case Study

The specific case that Dolgin discussed relates to an Uhlmann UPS 1020 blister pack machine in Abbott’s Barceloneta, Puerto Rico plant. The project was designed to add two automated visual inspection systems (for blister forming and cartoning) as part of a CAPA, and to replace an obsolete controller.

The project, not surprisingly, had to be completed within a constrained timeline. For internal reasons, the automated controller installation and debugging needed to be done on site within that tight window, Dolgin said. The debugging time was an unknown and could have had a significant impact on the project timeline. Because of information gaps, qualification protocols could not be generated until the debugging was complete.

Finally, resources were limited—key personnel could not be pulled off their projects to conduct lengthy risk assessments, he said. What’s more, the project had to work around both formal procedural requirements, as well as cultural norms, already established in Barceloneta.

The basic steps for the project included:

  • A simplified risk assessment used to determine the critical components or functions that could affect patient safety or product quality. “At a plant level, we own risk to specification,” Dolgin noted. “This ultimately links to risk to patient, but we can’t sit around as a review board and make clinical decisions as to what’s safe for the patient.”
  • Qualification requirements were determined and planned based on the risk assessment information. “What aspects of the Uhlmann process were GMP critical or product critical? We focused on those things.”
  • Commissioning data was then generated following Good Documentation Practices so that it could be leveraged as part of the Qualification package.

There were three primary elements to this C&Q approach: User Requirements Specifications and Functional Specifications; Risk Assessment; and the C&Q Plan.

Dolgin spent time elaborating on the Focused Risk Assessment. First, it was used to establish and categorize the quality impact of systems, based upon ISPE Baseline Guide 5 categories (direct impact, indirect impact, and no impact). “It’s a process-based impact strategy, rather than a yes/no one,” he said.

Once the system-level evaluations had been made, the functional aspects of Direct Impact systems were assessed. The project team defined risk levels (high, medium, or low) for each function, using GAMP 5 and BG 5 definitions. For the most part, however, these level determinations are subjective, he said, “based on the professional judgment of the team involved.” Print verification and blister sealing functions were both rated high-risk, for example. The team documented its rationale for how it arrived at each risk level.
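The two-tier assessment described above (system-level impact first, then function-level risk for Direct Impact systems only) can be sketched as a simple lookup. The system names, category assignments, and risk levels below are illustrative assumptions, not details from the Abbott project:

```python
# Illustrative sketch of the two-tier assessment: systems are first
# categorized by quality impact (ISPE Baseline Guide 5: direct,
# indirect, or no impact), and only Direct Impact systems proceed to
# function-level risk review. All names and assignments here are
# hypothetical examples, not Abbott's actual evaluations.

SYSTEM_IMPACT = {
    "blister forming": "direct",    # touches product quality directly
    "cartoning":       "direct",
    "compressed air":  "indirect",  # supports a direct-impact system
    "room lighting":   "no impact",
}

FUNCTION_RISK = {
    # Function-level risk, assigned by team judgment (GAMP 5 / BG 5
    # definitions); rationale would be documented alongside each level.
    "print verification": "high",
    "blister sealing":    "high",
    "carton transport":   "low",
}

def needs_function_assessment(system: str) -> bool:
    """Only Direct Impact systems get function-level risk review."""
    return SYSTEM_IMPACT.get(system) == "direct"

for system in SYSTEM_IMPACT:
    print(f"{system}: function-level review = {needs_function_assessment(system)}")
```

The point of the process-based strategy Dolgin describes is captured by the two tiers: the yes/no question is asked only at the system level, while the graded high/medium/low judgment is reserved for the functions that can actually affect product quality.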

The Risk Assessment also relied upon an FMEA-style tool, a “more efficient and less resource intensive effort than a full-blown Process FMEA.” The approach was possible, Dolgin noted, because it was not based on new technology. The team understood the Uhlmann, and the process, well.
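An FMEA-style scoring of the kind Dolgin mentions can be illustrated with a short sketch. The 1–5 severity/occurrence/detection scales, the thresholds, and the scores below are hypothetical; the Abbott team’s actual tool and ratings were not published:

```python
# Minimal FMEA-style risk scoring sketch. Each function is scored for
# severity, occurrence, and detection (1 = best, 5 = worst); their
# product is the Risk Priority Number (RPN), which maps to a risk
# level. Scales, thresholds, and scores are illustrative assumptions.

RISK_THRESHOLDS = (("high", 50), ("medium", 20), ("low", 0))

def risk_level(severity: int, occurrence: int, detection: int):
    """Classify a function by its Risk Priority Number (S x O x D)."""
    rpn = severity * occurrence * detection
    for level, floor in RISK_THRESHOLDS:
        if rpn >= floor:
            return level, rpn

functions = {
    # name: (severity, occurrence, detection) -- hypothetical scores
    "print verification": (5, 3, 4),
    "blister sealing":    (5, 4, 3),
    "carton transport":   (2, 2, 2),
}

for name, scores in functions.items():
    level, rpn = risk_level(*scores)
    print(f"{name}: RPN={rpn} -> {level} risk")
```

A full Process FMEA would additionally enumerate failure modes, effects, and causes for each function; the streamlined version keeps only the scoring step, which is what makes it less resource-intensive for well-understood equipment.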

The eventual C&Q plan was a 45-page document that included change information, C&Q roles and responsibilities, impact and risk assessment explanations, a critical elements list, and more.

Hybrid Protocols

Following the plan, all systems were commissioned, with GDP followed for the items planned for leveraging. Test results and data were evaluated by subject matter experts to determine whether the data could be used as part of qualification. SME approval of test data was documented using a form. Once the data was evaluated and accepted by SMEs for qualification, it became final and could not be modified.

Data that was not accepted by the SMEs could be re-executed in commissioning or tested through a qualification protocol. These qualification protocols were not pre-approved until the commissioning data had been reviewed.

Adhering to a hybrid mentality for the project, IQ and OQ protocols were used in traditional fashion. What was not traditional, Dolgin said, “is that we did not make an attempt in these protocols to describe the test procedure to be followed in great detail.” He added, “It’s a routine activity for your organization. It’s not necessary for a non-engineer like me to approve the testing procedures developed by engineers. You can turn those test procedures in with the results rather than in a preapproval document.”

Dolgin joked that he knew there were those in the audience thinking, “You can’t do that! How can you let the engineers run off without Quality-approved acceptance criteria?” In fact, there are acceptance criteria, Dolgin said. It’s just that they’re inherent in the process. “That target is determined by the physics and chemistry of the design itself and is not subject to engineers fudging around,” he said. “Many, many of these engineering characteristics are fixed and there is not the opportunity to ‘move the target’ to match the actual result.”

Project Outcomes

An evaluation of the project upon completion showed that it met the aggressive timeline as well as plant productivity goals. Dolgin estimated that it shaved two to three weeks off the site’s traditional project timelines. The protocol was greatly simplified, and it eliminated the need for repeated testing “based solely on the date a test was originally performed.”

Among the lessons learned: The approach relies heavily upon a plant’s engineers, Dolgin said. There are engineering prerequisites, in fact: good Engineering Practices, good Engineering Change Management, and good Vendor Management. “There are guidance documents available around GEP, but good engineering practices are just the way that good engineers do their work.”

A final lesson learned, according to Dolgin, is that whatever hybrid approach to C&Q is developed, it may need to change next time. “One approach doesn’t fit any and every project,” Dolgin said. “You have to be flexible.”

About the Author

Paul Thomas | Senior Editor