The evolution of analytics

May 5, 2020
How advanced analytics is being used to drive desired business outcomes in pharma

Developing quality medicines requires an agile business strategy that effectively utilizes technology to position a company for success. A proper tech-focused strategy can help shorten development times; establish robust quality within a well-defined design space of operation; provide predictive capabilities; optimize facility size and operation; and provide batch size flexibility to match production with market demand.

Pharma and other industries have adopted many enabling technologies over the years. These technologies have a proven track record of facilitating the production of high-quality products at ever lower cost using less manufacturing space. But other industries, such as chemicals and food & beverage, have reaped additional benefits by using innovative continuous manufacturing strategies that utilize targeted process analytical technology (PAT) coupled with data analytics applications.

Pharma needs to match or exceed the gains realized in other process industries. A significant step in this direction is utilizing available tools to make the right process and product measurements, storing this valuable data and then accessing it for near real-time decision making by subject matter experts (SMEs).

This article focuses on several real-world examples where this was done using advanced analytics to closely coordinate SME expertise with computing power for driving desired business outcomes.

Evolution of data analytics for process industries

The eras of analytics

There was a time in the not-so-distant past when pen and paper, clipboards and slide rules were the best tools available for data analytics. Looking back, these tools appear simplistic, but they were used to send a man to the moon, and by drugmakers to scale the manufacturing of penicillin and many vaccines.

Then came technology in the form of calculators and computers, which offered unheard-of and broadly available computing power. These tools were less tangible to the masses than pen and paper, and many required cooperation with IT professionals. But these innovations opened new doors for data analytics.

These innovations ushered in the era of spreadsheets and modeling programs for data analytics, opening new avenues for improving manufacturing processes as well, albeit with increased complexity. These new calculation capabilities and manufacturing innovations enabled significant achievements, such as scaling production of biologics from roller bottle technology to the 12,000 L scale.

It wasn’t long before the broader adoption of these computing technologies, coupled with rapid innovation for making process measurements using PAT, led to the “big data” era, which created so much data and so much opportunity that there was also rising angst about what to do with it. Complexity grew, with an increasing need to manage data storage, security and data access — not to mention creating business value from the data. Meanwhile, the pressure to provide even safer medicines, more quickly and at lower cost, had risen rapidly.

During this era of computing advancements, one unfortunate side effect was the disruption of the direct connection SMEs once had with the data. Whether process data, quality metrics, shift timing, environmental data or any other time-series data, a crucial requirement is having data directly accessible by SMEs.

In the big data era, the extent to which SMEs could operate at the speed of thought was greatly reduced or eliminated. This was due to the need for others to constantly resupply data to the SME, and for the SME to then spend hours wrangling it in spreadsheets. Companies became bottlenecked trying to create data analytics super-users with a rare combination of process and IT skills.

With the recent advent of advanced analytics software applications, data is once again directly at the fingertips of SMEs. This has created an era of hands-on, near real-time and broadly useful analytics by coupling subject matter expertise with machine learning innovations. This new era could not have come at a better time as innovations in manufacturing technology, drug shortages and new regulations have introduced daunting challenges. Connecting human expertise and machine learning addresses these issues by creating an environment for finding insights to improve manufacturing processes, and to easily share these insights with regulators.

Advanced analytics applications

Why is SME connectivity to data and the corresponding use of advanced analytics crucial? Guidance from the U.S. Food and Drug Administration on process validation, for example, requires “…continual assurance that the process remains in a state of control (the validated state) during commercial manufacture.” Consider a single continuous manufacturing campaign making several million tablets: it may have as many as 30 parameters being tracked, resulting in over 50,000 data points generated within a relatively short time period. This volume of data cannot be handled by waiting for someone else to manually supply data to the SME, much less by relying on paper, when near real-time decision making is required.

The process validation life cycle can be viewed in three phases: process design, process qualification and continued process verification (CPV). CPV in the context of this magnitude of continuous data sources is both resource- and time-intensive. Data historians can record and manage this data, but advanced analytics applications must be utilized to visualize data, and to perform statistical and predictive analysis. These applications enable more efficient identification of data trends and understanding of process variability, thus allowing the necessary modifications to ensure process control during commercial manufacture.
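As an illustration of what this looks like in practice, the short Python sketch below summarizes one monitored parameter from a historian export so an SME can review its level, spread and drift for a CPV report. The file name, column names and time windows are illustrative assumptions, not a reference to any particular historian or product.

import pandas as pd

# Hypothetical historian export: timestamped rows, one column per tracked parameter.
df = pd.read_csv("historian_export.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp").sort_index()

# Hourly summary of a single monitored parameter: level, spread, sample count.
summary = df["granule_moisture"].resample("1h").agg(["mean", "std", "count"])

# Rolling 24-hour mean to make longer-term drift visible for the CPV review.
summary["rolling_mean_24h"] = summary["mean"].rolling(24, min_periods=12).mean()
print(summary.tail(10))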

Related to CPV, the key elements for demonstrating process understanding and monitoring trends tied to long-term reliable operation are statistical process control (SPC) and statistical process monitoring (SPM). SPC implementation must have a strong foundation of science- and risk-based analysis. This is a key part of utilizing human expertise in the organization, which can then be leveraged to identify critical aspects of the process, implement reliable process metrics and establish a relationship between a robust process and quality products.
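To make the SPC piece concrete, here is a minimal sketch, assuming simple Western Electric-style run rules, of checking a monitored signal against its center line and sigma. The rule set and the synthetic data are illustrative only; an actual implementation would use limits justified by the science- and risk-based analysis described above.

import numpy as np

def spc_violations(values, center, sigma, run_length=8):
    """Flag points beyond 3-sigma, and runs of `run_length` points on one side of center."""
    values = np.asarray(values, dtype=float)
    beyond_3sigma = np.abs(values - center) > 3 * sigma

    # Run rule: run_length consecutive points on the same side of the center line.
    side = np.sign(values - center)
    run_flags = np.zeros(len(values), dtype=bool)
    streak = 1
    for i in range(1, len(values)):
        streak = streak + 1 if side[i] == side[i - 1] and side[i] != 0 else 1
        if streak >= run_length:
            run_flags[i] = True
    return beyond_3sigma, run_flags

# Synthetic example: a stable period followed by a small upward shift.
rng = np.random.default_rng(seed=1)
signal = np.concatenate([rng.normal(100, 2, 60), rng.normal(104, 2, 20)])
rule1, rule2 = spc_violations(signal, center=100.0, sigma=2.0)
print(f"Points beyond 3-sigma: {rule1.sum()}; run-rule flags: {rule2.sum()}")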

The following case studies illustrate the value created by this new era of advanced analytics.

Process monitoring and verification

ISA-88, the International Society of Automation standard for batch control, is important when implementing procedure monitoring as part of an SPM strategy. Within a batch process, there are recipes, procedures, unit procedures, operations and phases. In this case study, an SME summarized batch processes by aggregating metrics around process duration. Analysis steps included the following (a minimal code sketch follows the list):

  • Identifying downtime within and between batch procedures
  • Creating key process indicators (KPIs) for duration of operations and downtime
  • Creating KPIs for each batch ID
  • Aggregating downtime to show daily summary of procedure performance
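A minimal sketch of that aggregation, assuming batch event data has already been exported with one row per procedure execution (the file and column names are hypothetical):

import pandas as pd

# Hypothetical export: batch_id, procedure, start, end for each execution.
events = pd.read_csv("batch_events.csv", parse_dates=["start", "end"])
events = events.sort_values("start").reset_index(drop=True)

# KPI: duration of each procedure within each batch, in minutes.
events["duration_min"] = (events["end"] - events["start"]).dt.total_seconds() / 60

# KPI: downtime between the end of one procedure and the start of the next.
gaps = events["start"] - events["end"].shift()
events["downtime_min"] = gaps.dt.total_seconds().div(60).clip(lower=0)

# Per-batch KPIs and a daily summary of procedure performance.
per_batch = events.groupby("batch_id")[["duration_min", "downtime_min"]].sum()
daily = events.set_index("start")[["duration_min", "downtime_min"]].resample("1D").sum()
print(per_batch.head())
print(daily.tail())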

Additionally, the SME uses procedure monitoring to track critical process parameters and recurring bad actors against defined boundaries. Boundaries are calculated using SPC rules for continuous processes or created around a golden reference profile for batch processes. In this example, the available data includes batch information, process data (e.g., temperatures, pressures, valve positions), product tracking data (e.g., mass flows, classifications), quality analysis data (e.g., drug concentrations, drug attributes) and batch event data (e.g., recipe parameters, user actions, production state, alarms). A similar advanced analytics approach is used to identify a golden batch reference profile as follows (a minimal code sketch follows the list):

  • Determine ideal process segment(s) for calculating SPC limits or developing a golden batch reference profile
  • Review selected inputs
  • Set boundaries on signals based on calculated limits or golden batch reference profiles
  • Monitor variables to identify deviations from a golden batch reference profile (or from control limits)
  • Develop historical deviation reports
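A minimal sketch of the reference profile check, assuming each historical batch's reactor temperature trace has already been aligned to a common time axis (minutes into the heat-up phase); the file names, column names and alignment are illustrative assumptions:

import pandas as pd

# Rows: minutes into the phase; columns: historical "good" batches.
history = pd.read_csv("aligned_good_batches.csv", index_col="minute")

# Golden batch reference profile: per-time-point mean with a +/- 3-sigma envelope.
profile = history.mean(axis=1)
spread = history.std(axis=1)
upper, lower = profile + 3 * spread, profile - 3 * spread

# Compare a new batch (same time axis) against the envelope and report deviations.
new_batch = pd.read_csv("new_batch.csv", index_col="minute")["reactor_temp"]
outside = new_batch[(new_batch > upper) | (new_batch < lower)]
print(f"{len(outside)} of {len(new_batch)} samples deviate from the reference profile")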

Figure 1 illustrates the outputs of the analysis, including the boundary signals for standard deviations of the maximum motor speed per cycle, with the extended SPC analytical approach providing a strategy for monitoring various product grades, and the golden batch reference profile used to monitor batch reactor temperature during the heat-up phase.

FIGURE 1

This advanced analytics application was used to find periods where the system was out of control, apply the SPC strategy across several product grades and overlay the golden batch reference profile in near real time.

Batch process optimization

In this example, batch process optimization relies on the SME to quickly analyze the upstream manufacture of an active pharmaceutical ingredient (API) within a crystallizer and during protein production within a bioreactor. The value of cycle time improvements is high, but these types of batch processes are difficult to analyze due to the number and complexity of phases within the process. Monitoring the duration and variability of these phases, as well as downtime within and between batches, is tedious and requires constant updating of spreadsheets as new batches are produced.

To perform cycle time analysis, SMEs typically must manually sort through hundreds of historical batches in spreadsheets to determine the slowest or most variable phases of the processes. Cycle time analysis is frequently used for batch API reaction processes, and for phases including reactor charging, heating, and reactor emptying. Dead time, the non-value-added time within or between batches, must also be analyzed and reduced. Advanced analytics provides near real-time metrics for the duration of each phase, allowing SMEs to determine the amount of dead time in every batch, as shown in Figure 2.

FIGURE 2

Batch cycle time monitoring analytics provide SMEs with the insight needed to identify the best cycle time for each phase, and to create metrics for each cycle.
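As a minimal sketch of that cycle time analysis, assuming phase records have been exported with one row per phase execution (file and column names are hypothetical), per-phase durations and dead time per batch can be computed as follows:

import pandas as pd

# Hypothetical export: batch_id, phase, start, end for each phase execution.
phases = pd.read_csv("phase_records.csv", parse_dates=["start", "end"])
phases = phases.sort_values(["batch_id", "start"]).reset_index(drop=True)
phases["duration_min"] = (phases["end"] - phases["start"]).dt.total_seconds() / 60

# Slowest and most variable phases across all batches.
phase_stats = phases.groupby("phase")["duration_min"].agg(["mean", "std", "max"])
print(phase_stats.sort_values("std", ascending=False))

# Dead time: gaps between consecutive phases within the same batch.
next_start = phases.groupby("batch_id")["start"].shift(-1)
phases["gap_min"] = (next_start - phases["end"]).dt.total_seconds().div(60).clip(lower=0)
print(phases.groupby("batch_id")["gap_min"].sum().describe())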

Continuous process monitoring

Continuous manufacturing holds promise for cutting costs while providing better product quality, increased yield and more throughput. The objective in this example was to utilize a design of experiments (DoE) to create a multivariate quality by design (QbD) model for a drug product twin-screw wet granulation process.

Feed rates were collected for each additive, and quality measurements of the resulting granules were taken at the granulator outlet. The SME was able to directly connect to the disparate datasets and perform analytics across multiple phases within the process. The resulting models and boundaries were used for continuous monitoring of KPIs in near real time to support these efforts.
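A minimal sketch of the kind of multivariate model this produces: an ordinary least-squares fit relating additive feed rates to one granule quality attribute. The factor names, runs and specification limits below are hypothetical placeholders, and a real QbD model would typically include interaction and curvature terms justified by the DoE design.

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical DoE runs: [binder_rate, water_rate, powder_rate] in kg/h.
X = np.array([
    [1.0, 2.0, 20.0],
    [1.5, 2.0, 20.0],
    [1.0, 2.5, 20.0],
    [1.5, 2.5, 20.0],
    [1.0, 2.0, 25.0],
    [1.5, 2.5, 25.0],
])
# Measured granule size d50 in microns for each run (placeholder values).
y = np.array([180.0, 210.0, 195.0, 230.0, 175.0, 225.0])

model = LinearRegression().fit(X, y)
print("Coefficients:", model.coef_, "Intercept:", round(model.intercept_, 1))

# Predict granule size at a proposed operating point and check it against
# assumed acceptance limits from the design space.
proposed = np.array([[1.2, 2.3, 22.0]])
d50_pred = model.predict(proposed)[0]
print(f"Predicted d50: {d50_pred:.0f} um, within 150-250 um: {150 <= d50_pred <= 250}")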

There was a time when pen and paper was state of the art. Technology innovations ushered in a new era of big data, with the complexity around managing data and performing analytics creating unwanted side effects related to the challenge of finding actionable insights.

These and other challenges are being addressed with advanced analytics applications that directly connect the SMEs to the data. Collaboration between SMEs and data scientists, aided by advanced analytics, is helping pharma manufacturers produce medicines more quickly and safely at lower costs.

About the Author

Lisa J. Graham, Vice President | Analytics Engineering