QbD: Redefining Time to Market

For manufacturers who lack formulation and process knowledge, the benefits of getting to market fast can be outstripped by the costs of unexpected failures.

By J. Paul Catania, Tunnell Consulting, Inc.


We’ve all seen articles speculating on pharmaceutical Quality by Design’s impact on drug development and speed to market. However, speed to market is simply part of a larger strategy to maximize profitability. The sooner the product is on the market, the sooner research and development costs can be recouped.

There are costs associated with accelerating late-stage development, scale-up and tech transfer to manufacturing. Isn’t what we’re really talking about cost to market versus the competitive advantage of getting there sooner?

Where does QbD fit in? It isn't about getting to market fast, at least not directly. QbD is about getting to market reliably: knowing enough about the limitations and risks of a formulation and its production methods to establish appropriate mitigation and contingency plans.

Organizations that go to market fast with limited formulation and process knowledge risk disruptions whose cost and time losses will quickly outstrip the advantage of being there early.

The case study that follows shows both the cost of disruption and provides a good estimate of what it would have cost to apply QbD methodologies earlier in the product's life cycle. It also presents manufacturers with options for choosing the right point in the life cycle to apply QbD, and with solutions for a problematic product already on the market. Finally, I will discuss how the application of QbD tools and tactics is not just a vehicle for reducing cost but also a significant opportunity for organizational development, one that improves cross-functional coordination in product development and lays the foundation for applying QbD earlier in the development process.

Case Study: A Closing Window
A major pharmaceutical manufacturer had an opportunity for six months of patent exclusivity against generic competition if it could launch a controlled release product extension by a given date. Annual sales were projected to be in the neighborhood of $100 million—not a blockbuster but still a significant opportunity, since the six months would enable $50 million in sales.

Because this product was controlled release, its production process would be complicated, but the developer's project plan to scale up, validate and transfer the process for the line extension to manufacturing put it comfortably within the launch window. Accordingly, Sales and Marketing obtained purchasing commitments from their distribution channels based on Operations' commitments to fill the pipeline and maintain stocking levels.

The product was in commercial production and launch quantities were being produced to fill the pipeline when its dissolution rate began to trend out of specification. While the origin of the problem was unknown, a stopgap measure in production mitigated the effect to some extent; the downside was significant yield loss. Batches lost to outright failure, together with the yield losses from this measure, cost $250,000 a month as the organization scrambled to produce launch quantities while replacing lost production time and materials. Not only was the product costing more to produce, the launch window itself was now in jeopardy. That threatened the $50 million in exclusive sales, and agreements with distribution channel partners exposed the organization to potential penalties of $400,000 a day if it failed to meet its stocking commitments.

As the company's Operations department poured more money and resources into expediting materials, rescheduling production and working overtime, QA and Development had to divert resources to deviation reporting and troubleshooting. Meanwhile, the regulatory department wrestled with whether to file a CBE-0 to institutionalize the stopgap measure, which would lock in much lower yields and, therefore, much higher production costs and lower profits.

So here is where QbD reenters the conversation. Quality by Design is, at bottom, risk mitigation through process understanding. The earlier in development and commercialization that understanding is gained, the sooner risk is quantified and, if not mitigated, at least understood well enough that appropriate management and cost-allocation contingency plans can be put in place. This case illustrates the painful, multiple collateral effects across the organization when risk is instead identified through unplanned failure.

Calculating Costs
To estimate what QbD would have cost in this case, consider the activities, resources and costs of the emergency application of QbD methodologies to troubleshoot and correct the problem. Four full-time employees were deployed for fifteen weeks: a formulation scientist, a process engineer and a statistician, led by a senior project manager with experience in all three areas. Apart from its full-time dedication to this single project, the team employed tools and tactics much like those of a team pursuing proactive QbD.
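
The figures above allow a rough back-of-the-envelope comparison between the cost of the emergency QbD team and the disruption it was racing to contain. The following sketch uses the numbers stated in the text; the loaded labor rate per FTE-week is a hypothetical assumption, since the article does not give one.

```python
# Rough cost comparison: the reactive QbD effort vs. the ongoing disruption.
# The loaded weekly labor rate is a hypothetical assumption; the FTE count,
# duration, yield-loss rate and penalty figure come from the case study.

FTE_COUNT = 4                   # formulation scientist, process engineer,
                                # statistician, senior project manager
WEEKS = 15                      # duration of the emergency QbD effort
LOADED_RATE_PER_WEEK = 5_000    # assumed fully loaded cost per FTE-week (USD)

team_cost = FTE_COUNT * WEEKS * LOADED_RATE_PER_WEEK

MONTHLY_YIELD_LOSS = 250_000    # stated cost of failed batches and yield loss
months_of_disruption = WEEKS / 4.33   # approximate weeks-to-months conversion

disruption_cost = MONTHLY_YIELD_LOSS * months_of_disruption

print(f"QbD team cost:       ${team_cost:,.0f}")
print(f"Yield-loss exposure: ${disruption_cost:,.0f}")
print("Penalty exposure:    $400,000 per day if stocking commitments slip")
```

Even under this assumed labor rate, the team's cost is on the order of a single month of yield losses, and less than a day of the potential channel penalties.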

The first twelve weeks were consumed by data collection, database construction, control charting of input and output variables, hypothesis generation, multivariate analysis and the design of experiments for hypothesis screening and confirmation. Carrying out the Design of Experiments (DOE) work consumed the last three weeks. In this case, as in many others, the team spent a disproportionate amount of time on data collection because much of the data had to be manually transcribed, and the data that were in electronic format were spread among multiple databases.
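
The screening step above can be sketched as a two-level full factorial design, the simplest form of DOE for hypothesis screening. In this sketch the factor names and levels are hypothetical illustrations, not the actual variables investigated in this case.

```python
from itertools import product

# Two-level full factorial screening design: every combination of each
# factor's low and high setting. Factor names and levels are hypothetical
# examples, not the actual variables from the case study.
factors = {
    "api_particle_size_um": (10, 50),
    "blend_time_min":       (5, 15),
    "compression_force_kn": (8, 12),
}

names = list(factors)
# 2^3 = 8 experimental runs.
design = [dict(zip(names, levels)) for levels in product(*factors.values())]

for run_no, run in enumerate(design, start=1):
    print(run_no, run)

def main_effect(design, responses, factor, low, high):
    """Mean response at the factor's high level minus the mean at its low
    level -- the basic screening statistic once dissolution (or another
    response) has been measured for each run."""
    hi = [r for run, r in zip(design, responses) if run[factor] == high]
    lo = [r for run, r in zip(design, responses) if run[factor] == low]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

Control charts and multivariate analysis narrow the candidate factors first; a design like this then confirms or rejects each hypothesis with a manageable number of runs.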

The resulting process understanding yielded a solution that required neither continuation of the stopgap method, with its associated yield losses, nor a supplemental filing. The QbD methodology identified and confirmed that the dissolution problem was tied to an API characteristic not previously thought to be critical. A change in suppliers had subtly altered this characteristic, which, while still within specifications, shifted dissolution performance and caused all the costs and problems outlined above.
