Putting Global Manufacturing Data at Your Fingertips
Implementing a new IT paradigm for on-demand data is as much an organizational issue as it is a technological one.
By Justin O. Neway, PhD, Aegis Analytical Corp.
Knowledge management is a broad term, but can be thought of as leveraging information within an organization so that it becomes an asset. Knowledge is embodied information, which in turn is derived from data that has been acted on in context over time to produce results that provide learning.
For the purposes of this article, I refer to manufacturing data as one of the most important sources from which knowledge can be developed. I will focus on how “data-derived knowledge” can be developed and used to achieve manufacturing process excellence across a global manufacturing network using an on-demand process data access and analytics platform.
This article is based on a presentation I gave in June 2008 at the Seventh Annual Biological Production Forum in Munich, Germany. The talk examined the role of on-demand data access, aggregation and analysis in achieving process excellence and how a top international pharmaceutical company is rolling out this type of manufacturing system on a global level. The rollout has not been without its challenges, but in the end the platform has provided the manufacturer with the shared, global system of on-demand data it sought, and with several side benefits.
The focus here is on the business case and recommended steps for having IT work closely with manufacturing users. The recent IT paradigm shift and the change management and user retooling processes required are specifically examined—providing valuable insight for all process manufacturers considering implementations of enterprise-class on-demand data access and analytics systems.
Competitive and regulatory drivers for change
The “Desired State” of manufacturing has been articulated for a number of years as the condition in which quality is designed into the manufacturing process based on process understanding, so that continuous improvement becomes possible. FDA has shown that it is willing to exercise risk-based regulatory scrutiny, based on the manufacturer’s level of scientific understanding of quality and performance and its process control strategies to prevent or mitigate product risk.
In practice, this translates for manufacturers into faster passage of the CMC section through review when it rests on sound scientific data, and into fewer, less intensive inspections. FDA is keenly interested in this because of its own limited resources for conducting inspections. The Agency’s new technology and regulatory framework expands manufacturers’ future choices and allows them to innovate while still minimizing risks to consumers.
There are revenue advantages and cost-reduction opportunities associated with moving in the direction FDA wants us to: building quality into the process, or Quality by Design (QbD). Process analytical technology (PAT) is an enabling technology for achieving QbD; it is not an end in itself. In fact, you may not even need real-time, online measurement to achieve QbD in some processes. So an obsession with a particular online measurement, when there is no real scientific basis for it, is not a cost-effective way of approaching QbD.
The strategic business goals associated with QbD lead to revenue enhancement and cost reduction. Specifically, the outcomes are:
- Faster regulatory approval of new applications and process changes
- Faster technology transfer between development and manufacturing
- Shorter time to revenue to meet demand after start-up
- Greater flexibility to lower cost of manufacturing changes
- Increased predictability of manufacturing output and quality
- Reduced batch failures, final testing and batch-release costs
- Lower operating costs from fewer failures and deviations
- Reduced inventory costs for raw materials, work in process (WIP) and finished product (FP)
- Fewer, shorter regulatory inspections from “regulatory flexibility”
All of these result in favorable bottom-line impacts for business. So, tactically speaking, how do we achieve these goals?
1. Understand sources of variability in the process to improve control. Increase the predictability of product quality and yield by understanding the sources of variability and building a better model for controlling the process.
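To make this concrete, here is a minimal sketch, not from the article, of how the contribution of one factor to yield variability might be quantified. The batch records, parameter names and values are entirely hypothetical; a real analysis would draw on the full manufacturing data set and more rigorous statistics.

```python
from statistics import mean, pvariance

# Hypothetical batch records: raw-material lot, temperature (C), yield (%)
batches = [
    {"lot": "A", "temp": 36.8, "yield": 91.2},
    {"lot": "A", "temp": 37.1, "yield": 90.7},
    {"lot": "A", "temp": 36.9, "yield": 91.5},
    {"lot": "B", "temp": 37.0, "yield": 84.1},
    {"lot": "B", "temp": 37.2, "yield": 85.0},
    {"lot": "B", "temp": 36.7, "yield": 83.6},
]

def variance_explained(records, factor, response="yield"):
    """Fraction of response variance explained by grouping on one factor
    (between-group variance divided by total variance)."""
    total = pvariance([r[response] for r in records])
    groups = {}
    for r in records:
        groups.setdefault(r[factor], []).append(r[response])
    grand = mean(r[response] for r in records)
    between = sum(
        len(vals) * (mean(vals) - grand) ** 2 for vals in groups.values()
    ) / len(records)
    return between / total

share = variance_explained(batches, "lot")
print(f"Yield variance explained by raw-material lot: {share:.0%}")
```

A result near 100% would point to the raw-material lot as a dominant source of variability, suggesting where tighter controls would most improve predictability.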
2. Enable investigational analysis across all types of data. Investigational analysis drives the understanding of cause-and-effect relationships, as opposed to descriptive analysis, which summarizes performance in “dashboard-like” views that, while useful, do not necessarily tell you about cause and effect.
3. Support the collaboration between Process Development and Manufacturing. We sometimes forget that one of the most important sources of expertise on today’s processes is the Process Development effort that launched those processes into manufacturing in the first place. Collaboration helps move knowledge into and out of Process Development, and also helps Process Development teams understand the constraints of Manufacturing. This leads to faster access to prior experience and enables process understanding earlier in the process development cycle.
4. Provide process visibility across geographic boundaries. It’s a reality that processes often start in one part of the world and end in another. Upstream processing affects downstream outcomes, so we need to be able to do analysis that helps us understand that. With easy access to data from global processes, we can more easily do the cause-and-effect investigations that proactively lower failure rates.
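As a rough illustration of the last point, the sketch below joins hypothetical upstream and downstream records from two sites on a shared batch ID and checks whether an upstream parameter tracks a downstream outcome. All site roles, parameters and values are invented for illustration; an on-demand platform would perform this kind of aggregation across real historians and batch records.

```python
from math import sqrt

# Hypothetical records from two sites, keyed by batch ID
upstream = {   # e.g., cell-culture site: final titer (g/L)
    "B001": 2.1, "B002": 2.4, "B003": 1.8, "B004": 2.6, "B005": 2.0,
}
downstream = { # e.g., purification site: overall yield (%)
    "B001": 71.0, "B002": 74.5, "B003": 66.8, "B004": 76.2, "B005": 70.1,
}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Join on batch ID: only batches seen at both sites are compared
common = sorted(set(upstream) & set(downstream))
r = pearson([upstream[b] for b in common], [downstream[b] for b in common])
print(f"Correlation of upstream titer with downstream yield: {r:.2f}")
```

A strong correlation in such a cross-site view is the starting point, not the conclusion, of a cause-and-effect investigation, but without shared access to both sites’ data the question could not even be asked.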