Success in biopharma has historically depended more on research and product innovation than just about anything else. Both will remain critical in the future, but operational performance will become increasingly important as competitive and cost pressures mount. To achieve operational excellence, biopharma manufacturers must develop capabilities in advanced analytics.
McKinsey’s experience working with top biopharma manufacturers shows there is a significant opportunity to improve biopharma operations. Many manufacturers are overwhelmed by the complexity of their operations, and complexity often causes significant process variability.
Consider, for example, the production of biopharmaceuticals. Manufacturing biopharmaceuticals requires sophisticated operations due to the use of live, genetically engineered cells as well as a highly technical manufacturing process with multiple steps. As a result, manufacturers often monitor hundreds of upstream and downstream parameters to ensure the quality and purity of the ingredients as well as of the substances being made.
Two batches of a particular substance, produced using an identical process, can still exhibit titer and yield variations of 50 to 100 percent. This large, unexplained variability can reduce productivity and quality and invite greater regulatory scrutiny.
ELABORATE STATISTICS AND OTHER TOOLS
Advanced analytics is the application of elaborate statistics and mathematical tools to business data in order to assess and improve business practices (see Figure 1). We see global manufacturers in a range of industries and geographies using these tools to improve their yield, thereby underscoring the opportunity for biopharma.
Manufacturers, for example, have an abundance of real-time, shop-floor data and historical process data. They are beginning to take deeper dives into this data, aggregating previously isolated datasets and using complex statistical tools to identify patterns and relationships among discrete process steps and inputs. They are using the resulting insights to optimize the factors that have the greatest effect on yield.
Robust datasets support predictive modeling of yield levels — which is a significant differentiator (see the sidebar “Predictive modeling using neural networks”). If the process and environment dataset is exhaustive enough (the dataset is broad enough), statistical significance is high enough (the dataset is deep enough), and the noise level is low enough (the dataset is clean enough), then advanced mathematical tools such as artificial neural networks can be designed for this purpose.
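As a minimal, self-contained sketch of this idea, the toy example below trains a one-hidden-layer neural network to predict yield from two process parameters. The data is synthetic and the parameter names are illustrative assumptions, not drawn from any actual manufacturer's dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a process dataset: two upstream parameters
# (e.g., inoculation time, conductivity) driving an observed yield.
X = rng.uniform(-1, 1, size=(200, 2))
y = 0.6 * X[:, 0] - 0.4 * X[:, 1] ** 2 + 0.05 * rng.normal(size=200)

# One hidden layer of 8 tanh units, trained with plain gradient descent.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = (h @ W2 + b2).ravel()      # predicted yield
    err = pred - y
    # Backpropagate the mean-squared-error gradient through both layers.
    g2 = h.T @ err[:, None] / len(y)
    gb2 = err.mean()
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)
    g1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)
    W2 -= lr * g2; b2 -= lr * gb2
    W1 -= lr * g1; b1 -= lr * gb1

final = (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
mse = np.mean((final - y) ** 2)
print(f"training MSE: {mse:.4f}")
```

In practice a model like this is only as good as the breadth, depth, and cleanliness of the dataset behind it, which is why the data conditions listed above matter more than the choice of architecture.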
A REALISTIC TARGET
Process variability is often misperceived as intrinsic to biopharma processes. “We use live cells, which makes our processes highly variable” is a common refrain. Despite the strength of this myth, it has been repeatedly demonstrated that advanced analytics can significantly improve processes and decision making for biopharma manufacturers. Regardless of high process complexity, companies using these tools are reducing variability in product quality while also lowering costs and increasing sales.
Here’s a case in point. One top-five vaccine maker used advanced analytics to significantly increase its yield in vaccine production while incurring no additional capital expenditures. The company segmented its entire process into clusters of closely related production activities; for each cluster, it took far-flung data about process steps and the materials used and gathered them in a central database.
One team “on the ground” ran workshops and focus groups with local experts to draw an initial issue tree hypothesizing which parameters had the greatest influence on performance. A second advanced analytics team then tested that hypothesis using the company’s centralized and cleansed data.
In parallel, the advanced analytics team applied various forms of statistical analysis to determine interdependencies among the different process parameters (upstream and downstream) as well as their impact on yield. The team on the ground explored the details of the process (e.g., physics, chemistry, biotechnology) to validate the outcomes of the advanced analytics (in other words, to explain the “why”). This step proved critical in helping the initiative avoid wrong conclusions.
Through iterative loops between the two teams, nine parameters were identified as the most influential. Time to inoculate cells and conductivity measures associated with one of the chromatography steps were proven to be particularly important. Targeted process changes to better control for these nine parameters produced quick results. The manufacturer was able to increase its vaccine yield by more than 50 percent — worth between $5 million and $10 million in yearly savings for a single substance, one of hundreds it produces globally.
To achieve a similar boost, most biopharma companies still need to lay the foundation for a strong analytical capability. They currently lack the systems and capabilities to identify the addressable causes of variability in manufacturing. For example, they collect vast troves of historical process data — but typically use it only for tracking purposes, not as a basis for improving operations. Moreover, the collected data is not comprehensive and is stored in multiple databases that are not compatible with one another.
Beyond these data limitations, many biopharma companies lack the skills and tools necessary to develop actionable insights from the data they have. Even if some statistical analysis of processes and tentative correlation of parameters are occurring, the tools used are typically not up to the task at hand. ANOVA, single-variable correlation, and other Six Sigma tools are not sophisticated enough to handle the multidimensional and highly complex manufacturing processes of biopharma.
With gaps in the data, skills and tools required for advanced analytics, biopharma companies often have an incomplete understanding of their performance. Furthermore, they may not be able to fully seize the opportunities that they do identify.
Biopharma companies that do build proper advanced analytics capabilities could forge an advantage in manufacturing that will differentiate them from their competitors. To help companies understand that analytics journey, we outline a standard and granular approach here.
1. Create the conditions for analysis.
Gauge the potential for improvement by using historical data to estimate the size of the gaps between average and best-case performance. To start, aggregate data from every available source across the organization into a single, exhaustive database. Map the organization’s operations from end to end to account for every aspect of each production process. It is critical to ensure that the data is high quality and that enough data is collected to generate relevant insights. Then segment the data into clusters (e.g., fermentation) of closely related activities that can be analyzed as coherent units. For each cluster, list every process parameter and material characteristic.
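The aggregation and segmentation steps above can be sketched in a few lines of plain Python. The source names, batch IDs, parameter values, and cluster assignments below are all hypothetical, chosen only to show the shape of the exercise.

```python
from collections import defaultdict

# Hypothetical per-batch records from two previously isolated sources.
shop_floor = {"B001": {"inoculation_time_h": 18.5, "ph": 7.1},
              "B002": {"inoculation_time_h": 22.0, "ph": 6.9}}
lab_results = {"B001": {"titer_g_per_l": 3.2},
               "B002": {"titer_g_per_l": 2.4}}

# Step 1: aggregate every source into a single database keyed by batch.
database = defaultdict(dict)
for source in (shop_floor, lab_results):
    for batch_id, record in source.items():
        database[batch_id].update(record)

# Step 2: segment parameters into clusters of closely related activities
# that can then be analyzed as coherent units.
clusters = {"fermentation": ["inoculation_time_h", "ph"],
            "quality": ["titer_g_per_l"]}

for name, params in clusters.items():
    rows = {b: {p: r[p] for p in params} for b, r in database.items()}
    print(name, rows)
```

Real implementations would of course use a proper data warehouse rather than in-memory dictionaries, but the logical steps (join on a common batch key, then group parameters by process cluster) are the same.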
2. Analyze data and develop insights.
Once a threshold of data is gathered and segmented, use a variety of advanced statistical tools to identify improvement opportunities and spot trends (see Figure 2).
Correlation analysis can be used to identify relationships and linkages among process parameters. Standard statistical analyses — including moving averages, distribution histograms, standard deviation, and clustering analyses — can be used to identify patterns and prioritize data that has the most predictive power. Statistical significance analysis can be used to test initial hypotheses about the root cause of titer and yield variability and identify relationships among parameters that were not surfaced by correlation analysis. And artificial neural network analysis, which seeks to emulate the structure and functional aspects of biological neural networks, can be used to model complex processes and determine with greater precision how particular parameters affect productivity.
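To make the first two of these tools concrete, the sketch below computes Pearson correlations between two hypothetical process parameters and yield, along with the t-statistic commonly used to judge whether a correlation is statistically significant. The parameter names and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

# Synthetic process data: yield driven mainly by conductivity,
# while temperature is an unrelated parameter.
conductivity = rng.normal(size=n)
temperature = rng.normal(size=n)
yield_ = 0.8 * conductivity + 0.3 * rng.normal(size=n)

def pearson_with_t(x, y):
    """Pearson correlation r and the t-statistic for testing r != 0."""
    r = np.corrcoef(x, y)[0, 1]
    t = r * np.sqrt((len(x) - 2) / (1 - r ** 2))
    return r, t

for name, param in [("conductivity", conductivity),
                    ("temperature", temperature)]:
    r, t = pearson_with_t(param, yield_)
    print(f"{name}: r={r:+.2f}, t={t:+.1f}")
```

On this synthetic data, conductivity shows a strong, significant correlation with yield while temperature does not; in a real analysis, parameters that clear a significance threshold would feed the hypothesis-testing loop described above.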
Accelerate progress by conducting workshops with biopharma experts to investigate the trends, correlations, and other phenomena identified. This will foster a deeper understanding of the underlying biopharma parameters along the value stream.
3. Build an action plan.
In many cases, opportunities will be identified that are worth pursuing. Rank these opportunities based on potential impact and the effort required to implement. Then prioritize opportunities with the highest impact that are easy to implement. Develop clear initiatives to capture these priority opportunities. These initiatives should include provisions for training and coaching employees at all levels in the organization to ensure they develop the necessary capabilities and mindsets. Also include provisions for monitoring performance to ensure that the initiatives are implemented properly and on time, and provide mechanisms to help teams correct course when they encounter challenges. It is critical to assign clear lines of accountability for every aspect of each initiative.
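The ranking step can be sketched as a simple impact-to-effort prioritization. The opportunities, impact figures, and effort scores below are hypothetical placeholders, not estimates from any actual engagement.

```python
# Hypothetical opportunities with estimated impact ($M/yr)
# and implementation effort on a 1-5 scale.
opportunities = [
    {"name": "tighten inoculation timing", "impact": 6.0, "effort": 2},
    {"name": "retrofit chromatography skid", "impact": 8.0, "effort": 5},
    {"name": "revise media discard standard", "impact": 3.0, "effort": 1},
]

# Prioritize the highest-impact, lowest-effort initiatives first.
ranked = sorted(opportunities, key=lambda o: -o["impact"] / o["effort"])
for o in ranked:
    print(o["name"], round(o["impact"] / o["effort"], 2))
```

A simple ratio is only a starting point; in practice, teams usually plot impact against effort and debate the borderline cases in a workshop before committing to a roadmap.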
Executing in waves is recommended: this approach allows teams to learn on the fly and refine their approaches as they roll initiatives out in full.
Here is a second case in point. A biopharma firm that used this approach increased its yield by 30 percent by stabilizing cell growth. This improvement, triggered by insights gained from analytics, derived largely from improvements in cell storage and adopting more effective standards for discarding expired media used to grow cells.
Manufacturers that use advanced analytics to improve their operations have the potential to transform the industry, establishing new standards for efficiency while reducing costs and increasing sales. The resources they free up can then be poured back into research and product development, fueling their growth far into the future.
ABOUT THE AUTHORS
Eric Auschitzky (eric_auschitzky@mckinsey.com), Alberto Santagostino (alberto_santagostino@mckinsey.com) and Ralf Otto (email@example.com) are leaders in McKinsey & Company’s Pharmaceutical Operations practice.