Putting Global Manufacturing Data at Your Fingertips

June 2, 2009
Implementing a new IT paradigm for on-demand data is as much an organizational issue as it is a technological one.

Knowledge management is a broad term, but can be thought of as leveraging information within an organization so that it becomes an asset. Knowledge is embodied information, which in turn is derived from data that has been acted on in context over time to produce results that provide learning.

For the purposes of this article, I refer to manufacturing data as one of the most important sources from which knowledge can be developed. I will focus on how “data-derived knowledge” can be developed and used to achieve manufacturing process excellence across a global manufacturing network using an on-demand process data access and analytics platform.

This article is based on a presentation I gave in June 2008 at the Seventh Annual Biological Production Forum in Munich, Germany. The talk examined the role of on-demand data access, aggregation and analysis in achieving process excellence and how a top international pharmaceutical company is rolling out this type of manufacturing system on a global level. The rollout has not been without its challenges, but in the end the platform has provided the manufacturer with the shared, global system of on-demand data it sought, and with several side benefits.

The focus here is on the business case and recommended steps for having IT work closely with manufacturing users. The recent IT paradigm shift and the change management and user retooling processes required are specifically examined—providing valuable insight for all process manufacturers considering implementations of enterprise-class on-demand data access and analytics systems.

Competitive and Regulatory Drivers for Change

The “Desired State” of manufacturing has been articulated for a number of years as the condition where quality has been designed into the manufacturing process based on process understanding, so that continuous improvement is possible. FDA has shown that it is willing to exercise risk-based regulatory scrutiny, based on a manufacturer’s level of scientific understanding of quality and performance and on its process control strategies to prevent or mitigate product risk.

In practice, this translates into faster passage of a manufacturer’s chemistry, manufacturing and controls (CMC) section through review when it is underpinned by sound scientific data, and into fewer and less intensive inspections. FDA is very interested in this because of its own limited resources for conducting inspections. The Agency’s new technology and regulatory framework expands future choices for manufacturers and allows them to innovate, while still minimizing risks to consumers.

There are revenue advantages and cost reduction opportunities associated with moving in the direction FDA wants us to move: building quality into the process, or Quality by Design (QbD). Process analytical technology (PAT) is an enabling technology for achieving QbD; it is not an end in itself. You may not, in fact, even need real-time, online measurement to achieve QbD in some processes. So an obsession with having a particular online measurement when there is no real scientific basis for it is not a cost-effective way of approaching QbD.

The strategic business goals associated with QbD lead to revenue enhancement and cost reduction. Specifically, the outcomes are:

  • Faster regulatory approval of new applications and process changes
  • Faster technology transfer between development and manufacturing
  • Shorter time to revenue to meet demand after start-up
  • Greater flexibility to lower cost of manufacturing changes
  • Increased predictability of manufacturing output and quality
  • Reduced batch failures, final testing and batch-release costs
  • Lower operating costs from fewer failures and deviations
  • Reduced inventory costs for raw material, work-in-process (WIP) and finished product (FP)
  • Fewer, shorter regulatory inspections from “regulatory flexibility”

All of these result in favorable bottom-line impacts for business. So, tactically speaking, how do we achieve these goals?

Tactical Objectives

1. Understand sources of variability in the process to improve control. Increase the predictability of product quality and yield by understanding the sources of variability and building a better model for controlling the process.

2. Enable investigational analysis across all types of data. Investigational analysis drives the understanding of cause-and-effect relationships, in contrast to descriptive analysis, which summarizes results with “dashboard-like” information that, while useful, does not necessarily tell you about cause and effect.
 
3. Support the collaboration between Process Development and Manufacturing. We sometimes forget that one of the most important sources of expertise about today’s processes is the Process Development effort that launched those processes into manufacturing in the first place. Being able to collaborate helps move knowledge into and out of Process Development and also helps Process Development teams understand the constraints of Manufacturing. This leads to faster access to prior experience and enables process understanding earlier in the development cycle.

4. Provide process visibility across geographic boundaries. It’s a reality that processes often start in one part of the world and end in another. Upstream processing affects downstream outcomes, so we need to be able to do analysis that helps us understand that. With easy access to data from global processes, we can more easily do the cause-and-effect investigations that proactively lower failure rates.

5. Capture “paper-based” data in a Part 11 compliant way. Although manufacturing execution systems (MES) and electronic batch record (EBR) systems have been around for a long time, we still deal with a lot of paper. You can’t do data analysis with paper unless the data values are captured in electronic form.

From a user’s perspective, the following tactical needs must be met to achieve the previously outlined objectives:

  • Provide end users with on-demand, interactive access to the data in the multiple databases and paper records that are its sources.
  • Supply descriptive (what happened?) as well as investigational (why did it happen?) analysis capabilities in the environment in which users access their data. When the indicator on the dashboard signals an alert, you’re going to want to know why.
  • Include all types of process development and manufacturing data in a combined form (i.e., discrete, continuous, replicate, event, keyword and free text data); a sketch of one possible combined record follows this list.
  • Build systems for non-programmers and non-statisticians to collaborate across disciplines and geographies. Very few end users are comfortable with command lines!
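
To make the “combined form” requirement concrete, here is a minimal sketch, in Python, of how such heterogeneous values might be carried in one record type alongside their batch, step and site context. The field names, sites and batch identifiers are illustrative assumptions, not the data model of any particular commercial platform.

# Minimal sketch (assumed names): one combined record type covering the
# discrete, continuous, replicate, event, keyword and free-text data
# mentioned above, tagged with the batch, step and site it belongs to.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional, Union

@dataclass
class Observation:
    batch_id: str                  # batch the value belongs to
    site: str                      # manufacturing site
    step: str                      # process step, e.g. "fermentation"
    parameter: str                 # e.g. "pH" or "deviation_note"
    kind: str                      # "discrete", "continuous", "replicate",
                                   # "event", "keyword" or "free_text"
    value: Union[float, str, List[float]]  # replicates carry a list of values
    timestamp: Optional[datetime] = None   # when the value was recorded
    source: str = "historian"      # historian, ERP, LIMS, paper record, ...

observations = [
    Observation("B123", "Munich", "fermentation", "pH", "continuous", 6.9,
                datetime(2009, 5, 4, 10, 30)),
    Observation("B123", "Munich", "fermentation", "deviation_note",
                "free_text", "Foaming observed after 12 h", source="paper"),
]

With a single shape like this, a continuous historian tag and a free-text note captured from a paper record can sit side by side in the same analysis environment.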

If these objectives are met using the right on-demand data access technology platform, you can have investigational and descriptive analytics integrated into the same environment to better understand the sources of process variability and develop better models for process control.

Case Study: Integrating Data and Analytics

How do we leverage the systems we already have to provide the kind of data access and analytics environment we want for deriving information to improve process predictability? In summary, an on-demand data access platform, connected and mapped directly to the operational data stores of our existing manufacturing data infrastructure, allows users to understand the sources of process variability and achieve better process control.

What follows is an account of how one global manufacturer gained valuable experience about using an integrated on-demand data access and analytics platform for its manufacturing environment. Over several years, the company came to grips with the complex issues and real-world requirements of accessing data and delivering it to manufacturing practitioners in relevant time. After initiating the rollout of an enterprise-wide deployment of a commercial, off-the-shelf (COTS) on-demand data access and analytics platform, the company learned a great deal about the associated critical success factors.

The New IT Paradigm for Global Manufacturing Networks

Company leadership saw a new IT paradigm that required a shift in the role IT played within its manufacturing organization. Decision-makers made a strategic move to focus the company on its core competencies; it was not a software company, after all. They committed to acquiring COTS packages rather than developing software solutions in-house. This meant that IT team members became project managers implementing COTS software instead of developers writing and testing code, or even customizing COTS packages. This was not necessarily an easy change for IT staff, who had to adopt new roles.

The company also needed to leverage its existing investments in large IT systems. It had made a strategic decision to use a large enterprise resource planning (ERP) system and needed to provide access to data from that system to its whole manufacturing environment. It also settled on a particular data historian standard for its manufacturing sites. With investments made in these underlying sources of data, the company needed to see a return on investment.

The company also wanted to employ standards-based interfaces to get data, and it needed to address current business and regulatory needs, a big stretch for an IT department that traditionally had focused on individual site requirements. The company asked its IT staff to understand the manufacturing business better, not just their own technology domain. Management needed IT to help lead the business units by showing them what IT systems could make possible, because the business units might not know what capabilities existed to satisfy their needs. At this global company, most IT people knew the needs of their own sites, but they did not understand the manufacturing technology, or the business it serves, at a global level. This was the new paradigm for IT.

To level the playing field, manufacturing needed to reduce the burden on end users, who needed to be looking at data with standardized tools. These users needed a single, common system to access, analyze and report on what they learned from their data. To level geographic barriers, the company needed resources that would help interdisciplinary global teams do shared problem solving and let multiple sites readily make comparisons. For example, the company produces one product in two separate plants, where benefits would come from drawing conclusions based on a shared understanding of all the process data.

Critical System Requirements

In terms of critical system requirements, the organization used the S95 standard as a data framework to level systems and allow data to flow throughout the world. This provided a common leveling tool for disparate source systems as well as a communication tool for IT professionals. The company wanted to provide access directly to the data in its source systems whenever possible, without putting in a data warehouse as an intermediate layer. The servers providing access to local historian data needed to be on site and also connected to external data sources in a way that would give users full access to data from all over the world without unacceptable network lag.
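
As an illustration of this direct-access approach, the following minimal Python sketch shows how an S95-style hierarchy (enterprise / site / area / unit) could act as a thin mapping layer that resolves each request to the on-site system that already holds the data, instead of copying everything into a central warehouse first. The server names, tag prefixes and connector details are invented for illustration and do not describe the company’s actual configuration.

# Assumed example: a mapping keyed by an S95-style path resolves a logical
# request to the local source system, so queries run against the on-site
# historian rather than an intermediate data warehouse.
S95_MAP = {
    ("PharmaCo", "Munich", "Upstream", "Fermenter-01"):
        {"system": "historian", "server": "muc-hist-01", "tag_prefix": "FER01."},
    ("PharmaCo", "Singapore", "Downstream", "Column-03"):
        {"system": "historian", "server": "sin-hist-02", "tag_prefix": "COL03."},
}

def resolve(path, parameter):
    """Translate a logical S95 path plus parameter into a site-local query."""
    entry = S95_MAP[path]
    return {"system": entry["system"],
            "server": entry["server"],            # the query stays on site
            "query": entry["tag_prefix"] + parameter}

# Users ask in S95 terms; the platform decides where the data actually lives.
print(resolve(("PharmaCo", "Munich", "Upstream", "Fermenter-01"), "pH"))

Keeping the mapping, rather than the data, in the central layer is what allows historian servers to stay on site while users still see one global view.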

Making the Business Case

Like the rest of the industry, this company faced pressure on manufacturing support to do more (or at least the same) with less. The amount of work to be done doesn’t go away just because fewer people are working on the tasks.

Management knew that its scientists were spending more than 80 percent of their time gathering data with less than 20 percent of their time left to analyze it and make process improvements. The goal was to reverse those percentages. No one goes to a university to get an advanced degree in chemistry or engineering and thinks, “I want to work at a large pharmaceutical company and spend 80 percent of my time gathering data.” People found their tasks boring and difficult because it was hard to get data, and often, they didn’t end up using the data because they didn’t trust what they found.

The company wanted to facilitate quarterly rather than annual product and equipment reviews. Today, after deployment of the on-demand data access and analytics platform, it has realized a greater than 90 percent reduction in the time required to gather data for annual product reviews (APRs). The system had to provide validated data sets that could be used in GMP reporting. Global data sharing was essential, since the company ran the same processes at multiple sites and needed to compare data and view the process from raw material to bulk to finished goods in a single view. With faster access to data, the company wanted to move incident resolution from weeks to hours when a batch was held up by a deviation in the process.

Taking batch genealogy automatically into account was also one of Manufacturing’s priorities. One of its processes has more than 15 different process steps, each with different splits and recombinations, so manually examining the more than a thousand possible combinations of process pathways was impossible when its scientists wanted to see what effect conditions in the first step had on the process outcome. Users needed traceability of intermediates and bulk products from the sites of manufacture to the finished goods sites.
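
The following minimal Python sketch, with invented lot identifiers, shows why automating genealogy matters: splits and recombinations form a directed graph of lots, so finding every upstream batch that contributed to a finished-goods lot becomes a short graph traversal rather than a manual enumeration of more than a thousand pathways.

# Illustrative lot graph: each child lot lists the parent lots that fed it.
GENEALOGY = {
    "BULK-77":      ["INT-12", "INT-13"],        # recombination of intermediates
    "INT-12":       ["FERM-03"],
    "INT-13":       ["FERM-03", "FERM-04"],      # split and merge upstream
    "FG-2009-0042": ["BULK-77"],
}

def ancestors(lot, genealogy=GENEALOGY):
    """Return every upstream lot that contributed material to `lot`."""
    seen = set()
    stack = [lot]
    while stack:
        for parent in genealogy.get(stack.pop(), []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

# Which first-step batches could explain a problem in finished lot FG-2009-0042?
print(sorted(ancestors("FG-2009-0042")))
# -> ['BULK-77', 'FERM-03', 'FERM-04', 'INT-12', 'INT-13']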

The company needed a system in which it could have line of sight for all end users from manufactured materials back through to process development. In this case, it put systems in place in manufacturing first and has plans to make the connection back to process development as a next step, using the same on-demand data access and analytics environment.

Manufacturing was replacing an in-house developed system managed by a staff of 20 IT people. The replacement needed to combine data integration and analysis in a validated GMP environment and end the “spreadsheet madness” of investigational analysis. Previously, every scientist was using Excel spreadsheets, with data scattered across desktops, from which they pulled reports. If the company made hula hoops or potato chips, that might have sufficed, but it needed something more robust that could stand up to a GMP audit.

The considerations for a new system included: ease of implementation, ease of use, ease of maintenance and GMP readiness. An independent consulting firm helped the IT department evaluate available COTS systems based on these requirements. It ultimately chose a commercially available, enterprise-class process intelligence platform for integrated on-demand data access and analytics.

One of the outcomes of this process was the realization that the data aggregation and analytics offerings of commercially available systems like data historians, ERP systems, enterprise application integration buses, transactional data buses and stand-alone statistics packages were not properly designed for the task. What was needed, and what is currently rolling out across its global operations, is a layer above these systems: one that provides on-demand and scheduled access to their operational data stores, integrated with analytics that enable a much higher level of end-user functionality.
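
Reduced to its essentials, the idea of such a layer can be sketched as follows in Python; the adapter functions and values below are illustrative stand-ins for the operational data stores, not a real product API.

# Illustrative sketch: adapter functions stand in for the operational data
# stores (historian, ERP, captured paper records); the layer above joins
# their results into one analysis-ready record per batch, on demand.
def from_historian(batch_id):
    return {"peak_temp_C": 37.4, "ferm_duration_h": 96.0}       # illustrative

def from_erp(batch_id):
    return {"raw_material_lot": "RM-5589", "site": "Munich"}    # illustrative

def from_paper_capture(batch_id):
    return {"deviation_note": "Foaming observed after 12 h"}    # illustrative

def batch_view(batch_id):
    """Assemble one combined record for a batch from all sources."""
    record = {"batch_id": batch_id}
    for source in (from_historian, from_erp, from_paper_capture):
        record.update(source(batch_id))
    return record

print(batch_view("B123"))

The analytics environment then works from these assembled batch views, whether they are requested on demand by a user or refreshed on a schedule.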

Implementation: Change Management and User Retooling

When the global manufacturer rolled out the new system, it started with a proof-of-concept stage at a few sites, adjusting as necessary. Site readiness factors included:

  • An influential, enthusiastic site sponsor. This was important because a business person (vs. IT) can more readily see the value in having data at their fingertips.
  • Site functions that understand the importance of data.
  • Site S95 Architecture components in place.
  • Batch context in process historians or provided from other readily available sources. You need to know what batches ran when and how.
  • Central IT group project funding and coordination. This helps the business side welcome the solution.

The company’s central IT organization provided training, consulting and strategic planning, continuing to work with users to discover possibilities of the system and extract value from the investment. Eventually, it would like to move ownership of the system from IT to the business and use even more of the system’s capabilities for process understanding.

Summary

Replacing an existing business process for managing data is a tough journey, but the destination is worth it. It doesn’t happen overnight, but the long-term benefits of having data in context at your fingertips are enormous, although difficult to quantify in concrete financial terms. It sometimes comes down to the one or two batches that you save that wouldn’t have been saved otherwise.

When you start working with your data, you can sometimes find that it is not as well organized as you thought. A side benefit this manufacturer saw was uncovering data and systemic issues, raising people’s awareness of data integrity, and offering the opportunity to find and correct issues before they become audit findings.

Finally, organizational change is much harder than the technology implementation. Change is seen as difficult by Process Automation teams because their natural way of thinking comes from their experience with changes to validated systems, for which the paperwork can take months. Moving users to a new system and a new paradigm requires gaining their trust and buy-in that the payoff is worth the time commitment. Only then will they see the benefits and use the new system every day for their investigations, bottom-line process improvements and GMP reporting.

About the Author

Justin Neway, Ph.D., is executive vice president and chief science officer at Aegis Analytical Corporation. He has more than 25 years of experience in pharmaceutical and biotechnology manufacturing, and in the development and application of software solutions to quality compliance and operational efficiency in life science manufacturing.
