Process Optimization Reality Checks

May 18, 2006
What works in the lab may not always work after scaleup, says contributing editor Angelo De Palma.

Process optimization encompasses widely varying activities whose principal goal is to reduce costs by eliminating process steps, improving yields, shortening cycle times and producing higher-quality product. Companies with limited manufacturing capacity must look inside their manufacturing and logistics operations to squeeze more product from existing assets.

Process improvements usually range from the difficult to the next-to-impossible, so it is essential to address them as early as possible in the product’s life cycle. Dowpharma (Midland, Mich.) engages chemists and manufacturing engineers once the molecule enters the development pipeline, says Jeffrey Dudley, global business manufacturing director. “If we don’t, we risk getting stuck with very slick chemistry that’s a bear to run at production levels, simply because no one has thought enough about turning a reaction into a process,” he says. “You can’t scrape the sides of a reactor like you can a beaker.”

20-liter bioreactors for fermentation process development. Courtesy of Dowpharma.

As a first step, Dow likes to keep unit operations and chemical transformations as simple as possible by avoiding “university” reactions and obscure reagents. Next, downstream purification offers the most accessible improvements, according to Dudley. A team at Dow introduces what he calls “enabling” purification methods that provide meaningful yield improvements while helping to streamline the overall process. In addition, Dowpharma avoids purchasing equipment for one-off projects, or machinery that cannot be used by other processes. As for unit operations, Dudley avoids those that would restrict the use of facilities for other products.

Dow makes extensive use of modeling and high-throughput, parallel experimentation for all manufacturing-related activities. Everything, from reaction kinetics to separations, plus specialized operations like drying, solvent recovery and solids handling, goes into a model generated by the company’s Engineering Science division.
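
The kind of kinetic input such a model consumes can be sketched in a few lines. The following is a generic illustration, not Dow's model; the rate parameters are assumed round numbers.

```python
# A generic sketch of a kinetic model feeding a process simulation:
# first-order conversion A -> B with an Arrhenius rate constant,
# integrated by simple Euler steps. Parameters are illustrative only.
import math

Ea = 60_000.0        # activation energy, J/mol (assumed)
A_pre = 1.0e5        # pre-exponential factor, 1/s (assumed)
R = 8.314            # gas constant, J/(mol*K)

def k(T):
    """Arrhenius rate constant at temperature T (kelvin)."""
    return A_pre * math.exp(-Ea / (R * T))

conc, T, dt = 1.0, 350.0, 1.0    # mol/L, K, s
for _ in range(3600):            # one hour of reaction time
    conc -= k(T) * conc * dt     # dC/dt = -kC
print(f"conversion after 1 h at {T:.0f} K: {1 - conc:.1%}")
```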

A contract manufacturing organization (CMO) typically faces the full gamut of process optimization projects, from back-of-the-envelope chemical schemes to fully developed processes. Many processes are fully functional when they arrive, but some “are so bad they make you weep,” Dudley quips. “We had one customer come in with a 17-step process that we had to cut to seven steps.” When those situations arise, his group will often invest its own resources to demonstrate a more efficient process to the client. Convincing customers to change, Dudley says, takes strong science and not a little persuasion. Demonstrating that the new process will require a more modest capital investment or, in the case of outsourced production, lower contracting fees generally works.
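
The arithmetic behind such step-count cuts is stark. Assuming, purely for illustration, a 90% yield at every step:

```python
# Cumulative yield decays geometrically with step count.
# The 90% per-step yield is an assumed round number.
per_step_yield = 0.90
for steps in (17, 7):
    print(f"{steps} steps: overall yield = {per_step_yield ** steps:.1%}")
# 17 steps -> 16.7%; 7 steps -> 47.8%
```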

Optimizing a multistep chemical/pharmaceutical process requires balancing what was learned during discovery and chemical development against the desire for a robust process. Unfortunately, development scientists too often fall back on what worked in the lab. As a result, they may not have collected the right information, or examined it in a way that applies to the scaled-up process. “Then they’re disappointed when they don’t achieve the same performance,” says Lionel O’Young, Ph.D., CEO of process development firm ClearWaterBay Technology (Walnut, Calif.).

Solvent selection is a case in point. Bench chemists typically focus on reaction solvents, ignoring the impact of extraction and crystallization solvents on the process and on product purity. Removing trace solvents is routine at the bench, but much more difficult in a finished product. “Whatever you put into the process must be removed, so one must consider its contribution,” O’Young observes. “Sometimes it’s better to use the second-best solvent when scaling up.”

Consider driving forces

Another scaleup mistake related to optimization is failure to reckon with, or combine, what O’Young calls “driving forces” — factors that favor product over raw materials or impurities. The driving force in distillation is the vapor-liquid equilibrium; for crystallization, it is the solid-liquid equilibrium.

Chiral purifications provide a good example of exploiting multiple driving forces to advantage. Chiral chromatography is expensive, time-consuming and solvent-intensive if the goal is 99.9% optical purity, while chiral resolution is inefficient and may introduce impurities at an inopportune time. But by combining these two operations, development scientists can often reach the desired purity objective quickly: A quick pass through a chiral column that yields 95% enantiomeric excess, followed by complexing the unwanted isomer with a chiral reagent, might do the trick.
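
A rough calculation shows how the two operations combine; the 98% removal efficiency of the resolution step is an assumed figure, not one from the article.

```python
def ee(major, minor):
    """Enantiomeric excess, as a fraction of the total."""
    return (major - minor) / (major + minor)

major, minor = 97.5, 2.5       # composition after the column pass (95% ee)
removal = 0.98                 # assumed fraction of minor isomer complexed away
minor_left = minor * (1 - removal)

print(f"after column:     ee = {ee(major, minor):.2%}")        # 95.00%
print(f"after resolution: ee = {ee(major, minor_left):.2%}")   # ~99.90%
```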

One complaint often heard from scaleup specialists is that molecules tend to be thrown “over a wall” between research and development, development and pilot plant, pilot and commercial manufacturing. This is in part a result of the competition, and lack of communication, between chemists and chemical engineers. “Although the objective of both groups is the best process for the best product, they don’t always communicate that idea,” O’Young observes.

Symyx (Santa Clara, Calif.), which is evolving into a drug discovery and process development company, offers products and software for multi-dimensional process optimization that lessen the segmentation of specialties across the development lifecycle.

Symyx uses robotics and parallel microreactors to test solubility, optimize catalysts and screen reaction conditions with very small amounts of material. Tying the systems together is software that presents all relevant data on reaction conditions and yields, effectively bridging the knowledge gap between discovery, process development, pilot scale and, one hopes, manufacturing.

The company’s PPR (parallel pressure reactor) holds 16 to 96 parallel, semi-batch, continuously stirred pressure reactors suitable for testing difficult chemistries such as hydrogenations. A high-pressure model handles even smaller reactions and a broader range of chemical and catalyst conditions. Symyx also offers a software suite for collaborative reaction optimization comprising modules for electronic notebooks, automation, and data mining. Through it, development scientists can draw on hundreds or thousands of optimization experiments covering combinations of solvents, reagents, and reaction conditions.

Inside the Cambrex pilot plant in Charles City, Iowa. Courtesy of Cambrex.

Symyx is looking into ways to adapt its products to downstream development, linking lab through pilot scale, perhaps by incorporating intermediate-scale reactors or by wrapping its software around third-party reactor products. The ability to run hundreds of reactions fosters creativity in process design by greatly reducing the cost per experiment, says Eric Carlson, Ph.D., director of product development. “Chemists traditionally try their three best ideas, and if none works, throw up their hands,” he says. “Scientists who have 40 or a hundred experiment ideas to reflect on will often find something that was not on their original list.”

Post-approval modifications

For years, manufacturers dreaded the idea of changing a process after product approval. Now, through its science- and risk-based initiatives, FDA has been increasingly open to post-approval process modifications. “They’re pretty easy to work with,” says Ron Carroll, Ph.D., VP for pharmaceutical technologies at Cambrex (East Rutherford, N.J.), “as long as you can present a rationale and have the data. When you don’t, or if they sense you’re trying to slip something by them, is when you get into trouble.”

A minor change requiring only mention in the annual update might be increasing the concentration of a reactant or the final product to reduce batch size. Provided that QA documents the impact as minimal or undetectable, the agency will go along. A more complex situation arises when switching from one brand of equipment to another to achieve, say, more rapid drying. Since drying affects final product composition, FDA expects the sponsor to file an engineering supplement. Approval may take months, but is required before the change may be implemented.
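
The volume arithmetic behind such a change is simple; the figures below are hypothetical.

```python
# Doubling the working concentration halves the batch volume
# needed for the same product output. Hypothetical numbers.
output_kg = 100.0
for conc_g_per_L in (50.0, 100.0):
    volume_L = output_kg * 1_000 / conc_g_per_L
    print(f"{conc_g_per_L:.0f} g/L -> {volume_L:,.0f} L batch")
```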

An intermediate scenario, applicable mostly to small-molecule manufacture, involves substantial but well-understood changes, such as moving the process to a different location. The new site must be validated, but as long as the chemistry and equipment are the same, this can be handled through a CBE-30 (changes being effected in 30 days) filing.

Is biotech different?

Sponsors have an easier time incorporating some process wiggle room into small-molecule regulatory applications than into applications for biologics. As a result, post-registration process optimization is less common in biotech than in small-molecule drug manufacture. The aphorism “the process is the product” did not come about for nothing.

Processes for monoclonal antibodies (MAbs) are usually set by the end of Phase II, effectively limiting the nature and scope of subsequent optimization efforts. “After Phase II, it is difficult, although not impossible, to improve a process for a monoclonal,” says Keith Dixon, Ph.D., director of bioprocess development at Pfizer (Sandwich, U.K.). “Companies considering it will generally hold off until post-launch, then introduce a completely new second-generation process.” Demonstrating comparability between products made by radically different processes is hardly straightforward (see "A Wave, Suspended: Follow-on Biologics," Pharmaceutical Manufacturing, March 2006) due to the complexity of therapeutic proteins.

With small molecules, a few spectroscopic and chromatographic tests tell a molecule’s entire story. It is not uncommon for manufacturers of even complex non-protein drugs such as antibiotics to launch a product and make multiple, substantial process changes over the next decade. By contrast, biomanufacturers can measure every conceivable protein property and still be uncertain about immunogenicity or efficacy. Even when everything appears to check out, regulators may ask for additional clinical studies to confirm safety and efficacy. Obviously, no one would undertake a substantial bioprocess redesign without very solid financial justification.

Some companies, mindful of FDA’s initiatives to inject risk management and science-based decision-making into manufacturing, are seeking to challenge the “process is product” mantra, says Dixon. “There is talk about adding design space in regulatory submissions, but even then one must define the limits of that space when registering the process.” Regardless, biomanufacturers will still need to demonstrate process capabilities and limitations, which is expensive and time-consuming.

Biopharmaceutical manufacturers enjoy a nearly endless list of optimization targets, from their cellular or microbial expression systems to fill/finish operations. Many, like Dowpharma's bioprocessing unit with its Pfenex expression technology, try to increase protein expression while simultaneously improving a protein's biological activity. According to Henry Talbot, Ph.D., senior R&D leader for bioprocessing, having protein in highly soluble form can help eliminate purification steps, thus intensifying cumbersome processes.

Dowpharma's product recovery scaleup lab.

Dowpharma, which produces vaccine antigens and antibody fragments in Pseudomonas fluorescens bacteria, also uses high-throughput, small-scale fermentation methods and off-the-shelf design of experiment software to optimize fermentations. At the end of the day, customers have a 20-liter process that can be transferred and scaled up rapidly. The entire process, from gene to pilot-scale batch, takes 10 to 12 weeks, with up to three additional weeks for fermentation optimization, says Talbot. Customers also receive a full technology transfer package, which they can apply at their site or at a CMO's.
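
A minimal sketch of the kind of factorial design such off-the-shelf DOE software generates, with hypothetical factors and levels (not Dowpharma's actual parameters):

```python
# Full-factorial design: every combination of factor levels becomes
# one small-scale fermentation run. Factors and levels are hypothetical.
from itertools import product

factors = {
    "temp_C":       [28, 30, 32],
    "pH":           [6.8, 7.0, 7.2],
    "feed_g_per_h": [5, 10],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} runs")       # 3 x 3 x 2 = 18
for i, run in enumerate(runs, 1):
    print(i, run)
```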

Despite, or perhaps because of, its greater complexity, biotech will probably always be more limited than small-molecule pharma in which operations it can optimize, and to what degree, without incurring substantial regulatory review. Expediency often works against process excellence. In its rush to file a BLA (biologics license application), for example, a biopharmaceutical company may perform limited testing of protein A chromatography capture resins, which cost $10,000 per liter. If the BLA states that the resin will be recycled 10 times, and the company subsequently discovers that the $200,000 column can safely be used 30 times, that change will probably trigger a CBE-30 supplement.
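
The economics are easy to reconstruct from the figures in the article (a 20-liter column at $10,000 per liter); the per-batch framing is an illustration.

```python
# Resin cost per batch falls in proportion to the number of
# validated reuse cycles.
column_cost = 20 * 10_000        # $200,000, per the article
for cycles in (10, 30):
    print(f"{cycles} cycles: ${column_cost / cycles:,.0f} resin cost per batch")
# 10 cycles -> $20,000/batch; 30 cycles -> $6,667/batch
```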

No magic bullet

For a given level of product quality, the impetus for process improvement is pure economics. Optimization, especially post-registration, must be justified through a thorough cost analysis that weighs allocation of necessary resources against a more attractive bottom line.
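
As a rough illustration of the shape of that analysis (all figures hypothetical):

```python
# Simple payback-period calculation for a post-registration
# optimization project. All figures are hypothetical.
optimization_cost = 2_000_000      # development + revalidation + filing, $
savings_per_batch = 25_000         # yield and cycle-time savings, $
batches_per_year = 40

annual_savings = savings_per_batch * batches_per_year
print(f"payback: {optimization_cost / annual_savings:.1f} years")   # 2.0
```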

One would think that such return-on-investment decisions emerge only after prolonged contemplation and consultation with complex financial models. Not so, says Howard Levine, Ph.D., president of BioProcess Technology Consultants (Acton, Mass.). “The assessment often involves intuition and educated guessing, although companies are getting more sophisticated about it,” he says. Where potential savings are fuzzy or not easily quantified, companies may decide to come out with an improved second-generation product.

Process optimization should be viewed as one tool among many to achieve more profitable pharmaceutical manufacturing. “There’s no magic bullet,” says Ron Carroll of Cambrex. “Process optimization takes place over many years, but once you register the process with the FDA you’re kind of locked in.”

Modeling and simulation

Modeling and simulation software holds a special place among process development engineers. The simplest modeling software, the spreadsheet, can handle some aspects of discrete unit operations but becomes unwieldy when multiple operations must function together.

An engineer’s idea of process modeling centers on overall process improvement: how operations are run or fit together, or chemistry changes (e.g., in raw materials) that boost efficiency. Academics focus on a more mathematical approach that seeks a process’s maximum profitability point, says Charles Siletti, director of planning and scheduling applications for Intelligen (Scotch Plains, N.J.). For the engineering model, inputs include real-world, equipment-related parameters such as vessel size or flow rate. “The more detail you put in, the more reliable your predictions become,” Siletti says.

Scaleup is a great reason to invest in a modeling package. Leading software products predict heat capacities, mass transfer, and energy-related parameters that can cause serious mishaps when transferring a process between lab and pilot, or pilot and manufacturing. But ultimately the model is only as good as the operator, and the quality of its inputs.
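
One textbook relation behind many of those mishaps, included here as a standard result rather than anything from the article: for geometrically similar vessels, volume outgrows heat-transfer area.

```latex
V \propto D^{3}, \qquad A \propto D^{2}
\quad\Longrightarrow\quad
\frac{A}{V} \propto \frac{1}{D}
```

A thousandfold volume scaleup raises the diameter tenfold, so jacket area per unit volume falls by a factor of ten; heat generated in the bulk outpaces the surface available to remove it, which is exactly the sort of mismatch a good model flags before the transfer.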

Intelligen’s SuperPro Designer, an engineering design simulation package, evolved from an earlier product, BioPro Designer, which modeled bioprocesses. SuperPro sizes equipment, predicts materials and energy requirements, creates preliminary cost estimates and analyzes cycle times. The company’s SchedulePro fills the gap between cycle-time analysis and process scheduling. Where SuperPro works with a single process, SchedulePro metes out resources for facilities that house multiple processes sharing resources.

Scheduling is often overlooked as a formal exercise, but can reap serious benefits. Most scheduling today is accomplished through whiteboards and Excel spreadsheets. Siletti would not venture a guess as to how badly the “average” process schedules and allots resources, but it is safe to say that he is not impressed by prevailing practice. “My feeling is that scheduling is done in a way simply to get a plant up and running, with no emphasis on improvement.”
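
A toy sketch of what even rudimentary shared-resource scheduling adds over a whiteboard: assign each batch to the first vessel that frees up. Job names and durations are hypothetical.

```python
# Greedy scheduler: each job goes to the earliest-available vessel.
import heapq

def schedule(jobs, n_vessels):
    """jobs: list of (name, duration_h) -> list of (name, start_h, end_h)."""
    free = [0.0] * n_vessels           # earliest free time of each vessel
    heapq.heapify(free)
    plan = []
    for name, duration in jobs:
        start = heapq.heappop(free)    # first vessel to become available
        end = start + duration
        heapq.heappush(free, end)
        plan.append((name, start, end))
    return plan

jobs = [("batch A", 12), ("batch B", 8), ("batch C", 12), ("batch D", 8)]
for name, start, end in schedule(jobs, n_vessels=2):
    print(f"{name}: {start:4.0f} h -> {end:4.0f} h")
```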

One pharma company that used Aspen Technology’s (Lawrenceville, N.J.) Plant Scheduler for a bioprocess increased the number of batches per week by 30% to 40% and cut the time for creating a manufacturing schedule by 50%, to one to two hours, says Peter Clark, applications consultant with Aspen Technology. Aspen also offers the Batch Plus recipe-based modeling environment, modules for individual unit operations and the Aspen Custom Modeler for more specialized user-written models.

While modeling can benefit many processes, it is important to recognize the tradeoff between the cost and time for building the model against the value of the software’s decision-making capabilities.

Six Sigma, PAT and Process Excellence

Not every approach to optimizing pharmaceutical manufacturing involves hard-core engineering. Lean-Six Sigma and related quality methods, which typically focus on high-level, value-driven activities, boast a proven record for improving productivity in pharmaceutical manufacturing without touching a single valve. Although most top drug makers have jumped on the Lean-Six Sigma bandwagon, a notable few have not.

Increasingly, manufacturers are applying Lean-Six Sigma to repetitive tasks, shared resources and logistics, with great results. Utilization of analytical services, scheduling of cleaning and downtime for capital equipment, and in-process analytics are examples of plant activities whose optimization can reap huge benefits.

Another potential source of process excellence — depending on its implementation — is process analytical technology (PAT). As with operational excellence strategies, the top firms have embraced PAT while others appear to be going through the motions.

In a 2002 FDA advisory committee meeting, Ajaz Hussain, Ph.D., former deputy director at FDA’s Office of Pharmaceutical Science, noted the role of PAT in process optimization. PAT, Hussain said, improves efficiency by reducing product cycle and development times, enhancing capacity utilization and helping to deliver quality by design.

In practice, PAT can only help optimize a process if that was its original goal. A breathtakingly elegant PAT deployment on a clunky, wasteful process offers little solace to those who value mean, lean manufacturing. At the moment, PAT seems to work best for small-molecule manufacture, where the raw materials and products are extremely well-characterized throughout. Several years ago, Pfizer, a recognized leader in PAT, adopted PAT for biologics but today primarily applies it to small-molecule drugs.

An often-overlooked source of process improvement is plant and facility design. Any process-related activity can be modeled and planned during plant engineering, says Ahmad Shahidi, a principal consulting engineer at CH2M Hill Lockwood Greene (Spartanburg, S.C.). Floor space, ceiling heights, utility installation and a thousand other design elements can help eliminate what Shahidi calls “common shortfalls” that range from minor annoyances to plant-closers. Unfortunately, plant construction occurs long before development chemists have a chance to “kick the tires” of a new process, and once it is over, it is over. Re-designs and expansions are not uncommon, but they are costly, highly disruptive and hardly ever undertaken with the sole idea of, for example, reducing the distance between a reactor and a chromatography suite.

About the Author

Angelo De Palma, Ph.D.