Developing Your Science-Based Manufacturing Strategy

This month marks the two-year anniversary of FDA’s release of its process analytical technology (PAT) guidance for industry. Since that time, life sciences companies have undertaken a multitude of PAT projects, from sensor development and data analysis on specific process units to corporate regulatory strategies and standardized process analytical technology platforms.

This breadth of projects reflects the large and sometimes ambiguous scope of PAT. To achieve common ground in defining and discussing PAT, it is useful to refer back to the FDA’s four principles that form the pillars of a PAT strategy:
• Process understanding
• Risk-based approach
• Regulatory strategy to accommodate innovation
• Real-time release

Companies applying these four principles to their manufacturing and development operations to achieve science-based manufacturing are taking the first steps in transitioning from quality-by-inspection to real-time quality-by-design. The economic and time-to-market benefits can be enormous, but the undertaking is not trivial. Over a series of four articles, we will explore the concepts, tools and value each principle provides life sciences manufacturers within a comprehensive PAT program.

The First Principle - Process Understanding
Because the goal of PAT is to understand and control the manufacturing process – achieving quality-by-design – it is not surprising that process understanding projects in many forms dominate PAT spending. The FDA defines process understanding as a state in which:
• All critical sources of variability are identified and explained
• Variability is managed by the process
• Product quality attributes can be accurately and reliably predicted and controlled to be within specifications

This simple and clear definition belies the fact that achieving process understanding is not always easy. For example, to determine the state of a process, measurements of physical properties and component information are needed, but are not always available. Whatever information is available on process parameters often is not meaningful until it is reconciled through models incorporating batch, process, analytical and sometimes environmental parameters. The model output may provide the insight needed to close the loop and control the process, but only if the model itself is robust across the data space it encounters. Identifying, managing and controlling variability in order to produce to specification can become a series of questions without clear answers: what should be measured, how should it be measured, how should this often-disparate data be analyzed, what tools should be used…and the list goes on.
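To make the idea of reconciling disparate parameters through a model concrete, here is a minimal sketch in Python. The parameter names, values and the 75–85% dissolution specification are entirely hypothetical assumptions for illustration; a real application would use validated process data and far more rigorous modeling.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical historical data: one process, one raw-material, and one
# environmental parameter per lot (names and values are assumptions).
n = 50
granulation_temp = rng.normal(60.0, 2.0, n)   # process parameter (deg C)
moisture_pct     = rng.normal(2.5, 0.3, n)    # raw-material parameter (%)
room_humidity    = rng.normal(45.0, 5.0, n)   # environmental parameter (%RH)

# Simulated quality attribute, e.g., dissolution at 30 min (% released).
dissolution = (80.0 + 0.5 * (granulation_temp - 60.0)
                    - 4.0 * (moisture_pct - 2.5)
                    - 0.1 * (room_humidity - 45.0)
                    + rng.normal(0.0, 0.5, n))

# Reconcile the disparate parameters into one linear model of the
# quality attribute via ordinary least squares.
X = np.column_stack([np.ones(n), granulation_temp, moisture_pct, room_humidity])
coef, *_ = np.linalg.lstsq(X, dissolution, rcond=None)

# "Close the loop": predict the attribute for a new lot and check it
# against a hypothetical specification of 75-85% released.
new_lot = np.array([1.0, 61.0, 2.7, 50.0])
pred = float(new_lot @ coef)
in_spec = 75.0 <= pred <= 85.0
print(round(pred, 1), in_spec)
```

The point of the sketch is the workflow, not the model: measured parameters feed a model whose output predicts the quality attribute, and that prediction is what allows control decisions before the lot is finished.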

Despite the challenges, the value of process understanding is clear for companies looking to make the shift from time-intensive quality-by-inspection to real-time quality-by-design. Managing and controlling variability is essential for a successful transition. As a result, variability reduction and control is a major focus for companies, since pharmaceutical processes generally don’t accommodate raw material and process quality variations. Adapting to inherent raw material and process variation requires changes in operating conditions – a significant paradigm shift for processes historically designed and filed with very tight ranges, or no range whatsoever. In-depth process understanding is the key to using the manufacturing process to aggressively compensate for varying material and environmental conditions and keep variability in check.

Process Understanding Recaptures Money Lost by Variability
A great example of the path to process understanding and the resulting benefits involves a solid dosage process at a major pharmaceutical manufacturer. Sudden, unacceptable dissolution rate variability exposed a glaring need to identify the true critical process parameters affecting the dissolution rate in order to apply controls to help maintain the process within specifications. The challenge was identifying the critical process parameters across the solid dosage line with well over a hundred process variables, further complicated by the fact that data was available from only a little over two hundred lots.

To extract usable information from this typical – yet challenging – data set, the usual data analysis methods, which rely on large data sets, simply would not work. Instead, a combination of mathematical modeling and artificial intelligence techniques was required to analyze the data and create a suitable model for predicting product quality. The data – process, batch, analytical, environmental and operator identification – was preprocessed, normalized, and mined for critical process parameters. Several different algorithms were tried in the search for a robust model until a hybrid approach eventually solved the challenge.
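The article does not disclose the manufacturer's actual algorithms, but the screening step – mining a wide, shallow data set for the few parameters that actually drive a quality attribute – can be illustrated with a simplified stand-in. The sketch below uses synthetic data (roughly two hundred lots, well over a hundred candidate variables, as in the case study) and plain correlation screening in place of the hybrid modeling/AI approach described; all indices and coefficients are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data set shaped like the case study: ~200 lots, 120 candidate
# variables (process, raw-material, environmental), of which only three
# actually drive the quality attribute (here, dissolution rate).
n_lots, n_vars = 210, 120
X = rng.normal(size=(n_lots, n_vars))
# Assumed true drivers at columns 5, 47 and 98 (purely illustrative).
y = (2.0 * X[:, 5] - 1.5 * X[:, 47] + 1.0 * X[:, 98]
     + rng.normal(0.0, 0.3, size=n_lots))

# Normalize, then screen: rank every candidate variable by the absolute
# value of its correlation with the response.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
yz = (y - y.mean()) / y.std()
corr = np.abs(Xz.T @ yz) / n_lots
ranked = np.argsort(corr)[::-1]

top3 = sorted(ranked[:3].tolist())
print(top3)  # the screen should recover the three planted drivers
```

Simple screening like this is only a first pass; it misses interactions and nonlinear effects, which is precisely why the case study needed the richer hybrid models before a robust predictive model emerged.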

Unexpectedly, the analysis identified three parameters that were contributing to the sudden change in dissolution rate – one process parameter, one raw material parameter, and one environmental parameter – in addition to those widely expected to be critical, such as tablet hardness. Not only did the analysis and resulting models provide a mechanism that could control the process, they also showed that simply requiring tighter specifications on raw material could eliminate much of the dissolution variability. By gaining better process understanding of the solid dosage process, this manufacturer eliminated more than $5 million in wasted product each year.

Process Understanding Across the Formulation-to-Commercialization Continuum
In the previous case, process measurements were already established, and the results showed that variability could be reduced without adding sensors to glean more insight into the process state. In some situations, however, the process state simply cannot be ascertained with the current measurements and models. This type of challenge is typical in development organizations, where scientists and engineers must define critical process parameters and measurements for new processes. Siloed approaches to process development are undergoing a paradigm shift: many companies are increasing collaboration between development and commercial manufacturing early in the path to commercialization, integrating process understanding activities along the formulation-to-commercialization continuum.

Process Understanding Translates to Rapid Scale-up
An example of this has been in place for some time at a major manufacturer of oral and parenteral dosage forms and drug delivery systems. Several years ago, the company’s development department was dealing with typical problems encountered by functional silos. Disparate systems for storing data, limited knowledge transfer mechanisms, and standalone instrumentation meant that much of the process understanding uncovered along the development path was not readily available or used across different scales.

Some companies addressing this type of challenge evaluate an additional layer of technology to capture, distill and propagate process analytical data. In this case, the manufacturer took the innovative approach of extending electronic batch records (EBR) with analytical and process data, deploying a standard EBR system to capture process understanding data across its facility, from pre-formulation to scale-up.

Data is automatically captured and stored in the EBR from analytical instrumentation.
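The article does not describe the manufacturer's EBR schema, but the idea of extending a batch record with process and analytical data can be sketched as a simple data structure. Every field and class name below is an illustrative assumption, not the actual system design.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical sketch of an EBR entry extended with process and analytical
# data, so understanding captured at one scale is queryable at the next.
@dataclass
class AnalyticalReading:
    instrument: str          # e.g., an at-line NIR probe (assumed)
    parameter: str           # what was measured
    value: float
    units: str
    timestamp: datetime

@dataclass
class BatchRecordEntry:
    batch_id: str
    process_step: str        # e.g., "granulation", "compression"
    scale: str               # "pre-formulation", "pilot", "commercial"
    process_parameters: dict = field(default_factory=dict)
    analytical_data: list = field(default_factory=list)

# Example: record a pilot-scale granulation step with one in-process reading.
entry = BatchRecordEntry(
    batch_id="LOT-0042",
    process_step="granulation",
    scale="pilot",
    process_parameters={"inlet_air_temp_C": 62.0, "spray_rate_g_min": 110.0},
)
entry.analytical_data.append(
    AnalyticalReading("NIR probe", "moisture", 2.4, "%", datetime(2006, 9, 1))
)
print(entry.batch_id, len(entry.analytical_data))
```

Keeping process parameters and analytical readings in one record per batch and scale is what makes knowledge transfer across the formulation-to-commercialization continuum practical, rather than leaving it buried in standalone instruments.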