Streamlining the Innovation Lifecycle, Part 2

Businesses that seek to improve manufacturing and operational excellence by optimizing their investments in scientific innovation must find ways to manage that innovation more efficiently and consistently.

This is part two of a two-part piece. Part one can be found here.

By KEN RAPP, Managing Director, Analytical, Development, Quality and Manufacturing, Accelrys

The Importance of Consistent, Standard Processes

Data capture and information accessibility are one part of the equation. A consistent, reliable and highly optimized process strategy is the other. The processes that guide activities like experimental execution or sample testing are complex and often involve multiple steps that need to be continually measured and validated. Organizations want to be sure that the processes they deploy are efficient and accurate, and that problems can be spotted and addressed quickly, before a defective sample batch shuts down a production line, for instance.

The principle of Quality by Design (QbD) depends on this kind of “at line” level of control. If organizations can identify stability, efficacy or safety issues early in the innovation lifecycle, the costs incurred by delays and rework can be minimized or even eliminated. For example, if critical product quality data is monitored in as close to real time as possible and linked to the manufacturing process parameters, it can be used to direct immediate changes in manufacturing conditions that improve quality conformance. This is why process studies conducted during development are critical to operational excellence during commercial operations.
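
As a rough illustration, the Python sketch below shows the kind of at-line check this implies: recent quality results are compared against specification limits and, when a trend approaches a limit, the linked process parameters are flagged for adjustment. The attribute, spec limits and adjustment rule are hypothetical, not drawn from any particular QbD implementation.

```python
# Illustrative "at line" quality check. The attribute, spec limits and
# adjustment rule are hypothetical, not from any specific QbD system.

from statistics import mean

SPEC_LOW, SPEC_HIGH = 98.0, 102.0   # hypothetical assay spec limits (% of label claim)
ALERT_MARGIN = 0.5                  # flag results that are drifting toward a limit

def check_at_line(recent_results, process_params):
    """Evaluate recent assay results and recommend an action for the line."""
    avg = mean(recent_results)
    if not SPEC_LOW <= avg <= SPEC_HIGH:
        return "STOP", f"mean {avg:.2f} outside spec; quarantine batch, review {process_params}"
    if avg < SPEC_LOW + ALERT_MARGIN or avg > SPEC_HIGH - ALERT_MARGIN:
        return "ADJUST", f"mean {avg:.2f} trending toward a spec limit; tune process parameters"
    return "OK", f"mean {avg:.2f} well within spec"

status, message = check_at_line(
    recent_results=[98.3, 98.4, 98.2],
    process_params={"granulation_speed_rpm": 450, "drying_temp_c": 60},
)
print(status, "-", message)
```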

To improve process quality, more and more companies across industries are starting to implement process standards that guide procedure development and execution across the innovation lifecycle. What does this mean? It means that organizations test their procedures before executing them to make sure that they are valid, rugged and reliable. Once procedures are validated, organizations capture them as best practices in an effort to reduce variability in outcomes when those processes are deployed. And they are beginning to leverage informatics technologies to automate workflows and take as many manual steps out of process execution as possible.
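
As a purely illustrative sketch of what capturing a validated procedure as an executable standard might look like, consider the Python fragment below. The fields, step names and validation flag are hypothetical rather than any specific vendor's schema.

```python
# Illustrative sketch of a validated procedure captured as an executable standard.
# The fields and step names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Procedure:
    name: str
    version: str
    steps: list            # ordered, pre-defined steps
    validated: bool = False
    results: dict = field(default_factory=dict)

    def execute(self, instrument_readings):
        """Run only validated procedures, recording a result for every step."""
        if not self.validated:
            raise RuntimeError(f"{self.name} v{self.version} has not been validated")
        for step in self.steps:
            # each step pulls its reading from the instrument feed instead of manual entry
            self.results[step] = instrument_readings.get(step, "MISSING")
        return self.results

assay = Procedure(
    name="API assay by HPLC",
    version="2.1",
    steps=["prepare_sample", "run_hplc", "calculate_assay"],
    validated=True,
)
print(assay.execute({"prepare_sample": "ok", "run_hplc": "chromatogram_0042", "calculate_assay": 99.1}))
```

The point is simply that a procedure version that has not passed validation cannot be executed at all, so variability from ad hoc deviations is designed out rather than inspected out.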

Consider the experience of a large pharmaceutical company that identified a bottleneck in high-volume sample testing of active pharmaceutical ingredients and drug products, and in determining compliance with specifications. A small dedicated workforce was responsible for running approximately 20 different techniques and capturing the results in paper lab notebooks. Testing output was subject to manual data entry and transcription errors, so each step in the process required constant “double checking,” slowing cycle time and taxing limited lab resources. Furthermore, results were often variable due to inconsistent test execution.

To address these challenges and shorten the cycle times involved in moving drug products out of development, the company deployed a lab execution system that enables it to define, validate and control testing procedures electronically. With the system in place, lab managers can automate routine workflows that were previously handled manually and quickly identify potential issues “at line,” rather than later downstream.

The system additionally captures testing results directly from lab equipment and instruments, eliminating the need for manual transcription and data entry. Greater process automation and broader data integration mean fewer errors, which reduces rework and compliance risk while increasing process efficiency. Since deploying the solution, the company’s quality control laboratories have recorded an approximate 20 percent gain in productivity compared with the paper-based system, with up to 50 percent improvement for some individual procedures.
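
The mechanics can be as simple as parsing an instrument's exported result file straight into structured, spec-checked records, with no manual transcription step in between. The sketch below is purely illustrative; the CSV layout, attribute names and limits are hypothetical.

```python
# Illustrative capture of instrument output with no manual transcription.
# The CSV layout, attributes and spec limits are hypothetical.

import csv
import io

SPECS = {"assay_pct": (98.0, 102.0), "water_pct": (0.0, 0.5)}  # hypothetical limits

# Stand-in for a result file exported by a lab instrument.
instrument_export = io.StringIO(
    "sample_id,assay_pct,water_pct\nLOT-0421,99.2,0.31\nLOT-0422,97.6,0.28\n"
)

def capture_results(export_file):
    """Parse instrument output and flag any value outside its specification."""
    records = []
    for row in csv.DictReader(export_file):
        record = {"sample_id": row["sample_id"], "out_of_spec": []}
        for attribute, (low, high) in SPECS.items():
            value = float(row[attribute])
            record[attribute] = value
            if not low <= value <= high:
                record["out_of_spec"].append(attribute)
        records.append(record)
    return records

for rec in capture_results(instrument_export):
    print(rec)
```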

This example demonstrates how creating high-quality, consistent process standards, and deploying technology to support enterprise-wide process execution, can help organizations close product lifecycle productivity gaps that are largely avoidable with better control. Beyond productivity improvements, capturing procedure workflows and related data has a further benefit: it empowers organizations to continuously improve their process standards.

Consider the insight that could be gained by applying modeling and simulation technology to the last six months or year of process data spanning research, development and manufacturing. Over time, organizations will be able to identify patterns that lead to both desirable and undesirable outcomes, and apply algorithms that will enable them to predict what might happen if a new variable is introduced to a process.
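
A minimal sketch of that idea, assuming a hypothetical set of process variables and a deliberately simple linear model, might look like the following; a real implementation would draw on far richer data and more sophisticated algorithms.

```python
# Illustrative use of historical process data to predict a "what if" change.
# The variables, values and linear model are hypothetical.

import numpy as np

# Historical records: [mixing_speed_rpm, drying_temp_c] -> assay result (% of label claim)
X = np.array([[400, 55], [450, 60], [500, 60], [450, 65], [500, 70], [400, 65]], dtype=float)
y = np.array([98.6, 99.1, 99.4, 98.9, 98.2, 98.4])

# Fit a simple linear model y ~ b0 + b1*speed + b2*temp by least squares.
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(speed_rpm, temp_c):
    """Predict the assay outcome for a proposed change in process conditions."""
    return coeffs[0] + coeffs[1] * speed_rpm + coeffs[2] * temp_c

# What-if: speed up the mixing step while holding drying temperature constant.
print(f"Predicted assay at 550 rpm / 60 C: {predict(550, 60):.2f}%")
```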

For instance: How will the production line be impacted if we need to replace ingredient X with ingredient Y? How does speeding up or slowing down a specific processing step affect sample quality? What will happen if we reduce testing lab resources by 25 percent? As the saying goes, “you can’t manage what you don’t measure.” Better, more integrated data management enables better measurement and analysis, and better measurement and analysis leads to better process quality. It’s all connected.

Businesses that seek to improve manufacturing and operational excellence by optimizing their investments in scientific innovation must find ways to manage that innovation more efficiently and consistently. This will require making fundamental changes to how people, processes, technology and data are aligned across the innovation and product lifecycles. True productivity and quality gains will only be realized when data and processes can move bi-directionally across both upstream and downstream activities, from early research to manufacturing, as seamlessly as possible. Attention to more holistic informatics and consistent process execution today will lead to faster, more profitable and repeatable innovation tomorrow.
