Advanced Analytics Improve Biopharma Operations

Companies that use advanced analytics to improve operations have the potential to transform the biopharmaceutical manufacturing industry

By Eric Auschitzky, Alberto Santagostino and Ralf Otto, Pharmaceutical Operations, McKinsey & Company


Success in biopharma has historically depended more on research and product innovation than just about anything else. Both will remain critical in the future, but operational performance will become increasingly important as competitive and cost pressures mount. To achieve operational excellence, biopharma manufacturers must develop capabilities in advanced analytics.

McKinsey’s experience working with top biopharma manufacturers shows there is a significant opportunity to improve biopharma operations. Many manufacturers are overwhelmed by the complexity of their operations, and complexity often causes significant process variability.

Consider, for example, the production of biopharmaceuticals. Manufacturing biopharmaceuticals requires sophisticated operations due to the use of live, genetically engineered cells as well as a highly technical manufacturing process with multiple steps. As a result, manufacturers often monitor hundreds of upstream and downstream parameters to ensure the quality and purity of the ingredients as well as of the substances being made.

Two batches of a particular substance, produced using an identical process, can still exhibit a titer and yield variation between 50 and 100 percent. This huge, unexplained variability can negatively affect productivity and quality as well as increase regulatory scrutiny.

SOPHISTICATED STATISTICS AND OTHER TOOLS
Advanced analytics is the application of sophisticated statistical and mathematical tools to business data in order to assess and improve business practices (see Figure 1). We see global manufacturers across a range of industries and geographies using these tools to improve yield, underscoring the opportunity for biopharma.

Manufacturers, for example, have an abundance of real-time, shop-floor data and historical process data. They are beginning to take deeper dives into this data, aggregating previously isolated datasets and using complex statistical tools to identify patterns and relationships among discrete process steps and inputs. They are using the resulting insights to optimize the factors that have the greatest effect on yield.
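As a minimal illustration of aggregating previously isolated datasets, the sketch below joins hypothetical shop-floor readings with historical batch records on a shared batch identifier. The column names and values are invented for illustration and are not from the article.

```python
# Hypothetical sketch: merging isolated datasets on a common batch ID.
import pandas as pd

# Real-time shop-floor measurements (illustrative).
shop_floor = pd.DataFrame({
    "batch_id": ["B001", "B002", "B003"],
    "mean_pH": [7.1, 6.9, 7.0],
})

# Historical batch outcomes (illustrative).
batch_history = pd.DataFrame({
    "batch_id": ["B001", "B002", "B003"],
    "titer_g_L": [3.2, 1.8, 2.9],
})

# Join the two sources into one analyzable table.
merged = shop_floor.merge(batch_history, on="batch_id", how="inner")
print(merged)
```

Once disparate sources share a key like a batch ID, the combined table can feed the pattern-finding and modeling steps described below.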

Robust datasets support predictive modeling of yield levels — which is a significant differentiator (see the sidebar “Predictive modeling using neural networks”). If the process and environment dataset is exhaustive enough (the dataset is broad enough), statistical significance is high enough (the dataset is deep enough), and the noise level is low enough (the dataset is clean enough), then advanced mathematical tools such as artificial neural networks can be designed for this purpose.
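To make the idea concrete, here is a small, self-contained sketch of yield prediction with a feed-forward neural network, using scikit-learn's MLPRegressor on synthetic data. The parameter names, data, and network size are assumptions chosen for illustration; a real model would be trained on the broad, deep, clean process dataset described above.

```python
# Hypothetical sketch: predicting batch yield from process parameters
# with a small feed-forward neural network. Data is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_batches = 500

# Four illustrative process parameters per batch
# (e.g., pH, temperature, inoculation time, conductivity).
X = rng.normal(size=(n_batches, 4))

# Synthetic yield: nonlinear in two parameters, plus mild noise.
y = 70 + 5 * X[:, 2] - 3 * X[:, 3] ** 2 + rng.normal(scale=1.0, size=n_batches)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale inputs, then fit a small neural network.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Held-out R^2 indicates how much yield variability the model explains.
r2 = model.score(X_test, y_test)
print(f"Held-out R^2: {r2:.2f}")
```

The held-out score is the practical check on the article's three conditions: a broad, deep, clean dataset is what allows a model like this to explain most of the observed yield variability rather than fitting noise.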

A REALISTIC TARGET
Process variability is often misperceived as intrinsic to biopharma processes. “We use live cells, which makes our processes highly variable” is a common refrain. Despite the strength of this myth, it has been repeatedly demonstrated that advanced analytics can significantly improve processes and decision making for biopharma manufacturers. Regardless of high process complexity, companies using these tools are reducing variability in product quality while also lowering costs and increasing sales.

Here’s a case in point. One top-five vaccine maker used advanced analytics to significantly increase its yield in vaccine production while incurring no additional capital expenditures. The company segmented its entire process into clusters of closely related production activities; for each cluster, it gathered far-flung data about process steps and the materials used into a central database.

One team “on the ground” ran workshops and focus groups with local experts to draw an initial issue tree hypothesizing which parameters had the greatest influence on performance. A second advanced analytics team then tested that hypothesis using the company’s centralized and cleansed data.

In parallel, the advanced analytics team applied various forms of statistical analysis to determine interdependencies among the different process parameters (upstream and downstream) as well as their impact on yield. The team on the ground explored the details of the process (e.g., physics, chemistry, biotechnology) to validate the outcomes of the advanced analytics (in other words, to explain the “why”). This step proved critical in helping the initiative avoid wrong conclusions.
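One simple form such a screening analysis can take is ranking parameters by their correlation with yield. The sketch below does this on synthetic data; all parameter names, units, and effect sizes are hypothetical, and a real analysis would also examine nonlinear effects and parameter interactions.

```python
# Illustrative sketch: screening process parameters for impact on yield
# via pairwise correlations. Names, units, and data are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 300

df = pd.DataFrame({
    "inoculation_time_h": rng.normal(24, 2, n),
    "conductivity_mS_cm": rng.normal(15, 1, n),
    "temperature_C": rng.normal(37, 0.3, n),
})

# Synthetic yield driven mainly by inoculation time and conductivity.
df["yield_g_L"] = (
    2.0 * df["inoculation_time_h"]
    - 3.0 * df["conductivity_mS_cm"]
    + rng.normal(0, 2, n)
)

# Rank parameters by absolute correlation with yield.
corr = df.corr()["yield_g_L"].drop("yield_g_L").abs().sort_values(ascending=False)
print(corr)
```

A ranking like this is only a starting hypothesis: as the article stresses, the team on the ground must still validate the "why" behind each statistical relationship before acting on it.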

Through iterative loops between the two teams, nine parameters were identified as the most influential. The time to inoculate cells and the conductivity measures associated with one of the chromatography steps proved particularly important. Targeted process changes to better control these nine parameters produced quick results. The manufacturer was able to increase its vaccine yield by more than 50 percent, worth between $5 million and $10 million in yearly savings for a single substance, one of hundreds it produces globally.

To achieve a similar boost, most biopharma companies still need to lay the foundation for a strong analytical capability. They currently lack the systems and capabilities to identify the addressable causes of variability in manufacturing. For example, they collect vast troves of historical process data — but typically use it only for tracking purposes, not as a basis for improving operations. Moreover, the collected data is not comprehensive and is stored in multiple databases that are not compatible with one another.
