Prediction Machines: The Simple Economics of Artificial Intelligence, a new book from Harvard Business Review Press, makes an interesting argument. Three economists from Toronto’s Rotman School of Management cut through the current machine intelligence and automation hype to identify the surprisingly simple economic imperative behind the meteoric rise of the advanced analytics and machine learning technologies we now commonly classify as “AI.” Intelligent systems yield a drop in the cost of prediction, and therein lies their transformative power.
The manufacturing sector is no stranger to the glories of automation, but predictive technologies bring a higher order of value to the table. By collecting pertinent data and computing on it, intelligent systems reduce operational uncertainty. This enhances efficiency and productivity, and it also creates opportunities for new business strategies and cost reductions. So it’s no wonder that the global manufacturing analytics market is expected to grow at a CAGR of 22 percent through 2024, driven by the evolving Industrial Internet of Things (IIoT), widespread adoption of advanced data management strategies, and continual demand for process optimization. One 2018 Digital Supply Chain Executive Survey found that 50 percent of manufacturers see agility and innovation as major concerns and cite real-time product visibility as a top driver for investment (with 40 percent driven by the need to innovate faster, and 33 percent motivated by a lower cost to serve through improved planning).
While some forms of traditional and/or cloud-based analytics have allowed manufacturing firms to better grasp and manage their existing processes, predictive technologies forecast future performance and enable firms to act on those forecasts preemptively using live data. For example, the recent TDWI Navigator report highlights one of the most compelling advantages of prediction for manufacturing: the ability to monitor machinery and use past data to predict or prevent future breakdowns. The popular IIoT predictive maintenance use case involves sensors tracking operational parameters and machine conditions within high-value equipment, making it possible to predict when a part might fail and to improve production uptime: “A manufacturer might use sensor data from trucks to determine whether and when maintenance is needed…a moving average of temperature from specific parts or a predictive model that was built using historical data of failed parts…The predictive model would be embedded into a system and operationalized to generate alerts or take automated action when new data indicates a problem.”
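To make that concrete, here is a minimal sketch of the moving-average approach the TDWI example describes. Everything in it is illustrative: the window size, the alert threshold, and the simulated temperature stream are hypothetical stand-ins, and a production model would be fit to historical data of failed parts.

```python
from collections import deque

# Hypothetical window size and alert threshold -- in practice both would
# be derived from historical data on failed parts.
WINDOW = 20              # number of recent readings in the moving average
ALERT_THRESHOLD = 95.0   # degrees C, illustrative only

class MovingAverageMonitor:
    """Tracks a moving average of a sensor value and flags threshold breaches."""

    def __init__(self, window: int = WINDOW, threshold: float = ALERT_THRESHOLD):
        self.readings = deque(maxlen=window)
        self.threshold = threshold

    def ingest(self, value: float) -> bool:
        """Record a reading; return True when the moving average breaches the threshold."""
        self.readings.append(value)
        return sum(self.readings) / len(self.readings) > self.threshold

if __name__ == "__main__":
    monitor = MovingAverageMonitor()
    # Simulated temperature stream: steady operation, then a gradual rise.
    stream = [88.0] * 30 + [90.0 + i * 0.8 for i in range(20)]
    for i, temp in enumerate(stream):
        if monitor.ingest(temp):
            print(f"reading {i}: moving average above threshold -- schedule maintenance")
            break
```

The moving average matters because a single hot reading may be noise; a sustained drift in the average is the signal worth acting on.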
That “embedding” bit is where things get complicated. To operationalize automated action when it is needed, predictive systems must be responsive within the environment where the data is generated, often in real time. Standard cloud-managed, batch-oriented analytics cannot serve this purpose. The time and expense required to collect, transport, process, route, and react to the enormous troves of data generated in IIoT sensor networks negate any value that might be captured if you rely on the cloud alone: it does no good to have “AI” that warns an operator of impending equipment failure two hours after the engine has seized up. To enable real-time decisions, actions, or insights, the architecture must support streaming analytics (computation performed on data as it flows through systems) alongside traditional cloud analytics, blending real-time and historical technology. The models that mine past occurrences to optimize function may be cloud-powered, but the predictive edge analytics frameworks that consume streams of sensor or device data as they are ingested supply actionable insight as events occur.
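The division of labor might look something like the following sketch: parameters learned offline from historical data (the cloud side) are shipped to an edge process that scores each event as it is ingested, with no batch round-trip. The model form, parameter values, and sensor stream here are all hypothetical.

```python
from dataclasses import dataclass
import random

# Sketch of the cloud/edge split: a model trained offline on historical
# failure data (the cloud side) ships as lightweight parameters, and the
# edge side scores each event on ingest. All names and numbers are
# hypothetical.

@dataclass(frozen=True)
class FailureModel:
    weight: float   # learned offline from historical failure data
    bias: float
    cutoff: float   # score above which we act immediately

    def score(self, vibration: float) -> float:
        return self.weight * vibration + self.bias

def edge_loop(model: FailureModel, events) -> None:
    """Score each sensor event as it arrives; no batch round-trip to the cloud."""
    for vibration in events:
        if model.score(vibration) > model.cutoff:
            print(f"vibration={vibration:.2f}: predicted failure risk, act now")

if __name__ == "__main__":
    model = FailureModel(weight=2.0, bias=-1.0, cutoff=5.0)   # hypothetical parameters
    sensor_stream = (random.uniform(0.0, 4.0) for _ in range(100))  # simulated ingest
    edge_loop(model, sensor_stream)
```

The point of the design is that the expensive learning happens where history lives, while the cheap per-event scoring happens where the data is born.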
Increasingly compact rules engines and pattern libraries facilitate this type of intelligence directly on devices within manufacturing environments, allowing systems to pose complex questions and return instant results and/or take immediate action.
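A toy rules engine shows the shape of this on-device evaluation. The rule names, field names, and limits below are hypothetical; a real pattern library would ship far richer predicates, but the principle of matching predicates against each event the moment it arrives is the same.

```python
from typing import Callable, Dict, List, Tuple

# A toy on-device rules engine: each rule pairs a predicate over a sensor
# event with an action. Rule names, field names, and limits are hypothetical.
Event = Dict[str, float]
Rule = Tuple[str, Callable[[Event], bool], Callable[[Event], None]]

RULES: List[Rule] = [
    ("overheat-under-load",
     lambda e: e["temp_c"] > 90.0 and e["load_pct"] > 75.0,
     lambda e: print(f"throttle line: temp={e['temp_c']}, load={e['load_pct']}")),
    ("excess-vibration",
     lambda e: e["vibration_mm_s"] > 7.1,   # illustrative limit
     lambda e: print(f"flag bearing for inspection: {e['vibration_mm_s']} mm/s")),
]

def evaluate(event: Event) -> None:
    """Run every rule against an incoming event and fire the matching actions."""
    for _name, predicate, action in RULES:
        if predicate(event):
            action(event)

if __name__ == "__main__":
    evaluate({"temp_c": 93.5, "load_pct": 80.0, "vibration_mm_s": 3.2})
```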
Incorporating computational intelligence at the “edge” of operations (where the network meets the real world) extends predictive capability to the point where data-based insight becomes tangible action when and where it is needed. Such systems anticipate and prevent failure and ensure optimum value. Whether applied to the supply chain, asset maintenance and management, product development automation, or planning, this is the mechanism by which the cost of prediction falls and “artificial intelligence” proves its worth.
The economics are simple, but the implementation isn’t. You cannot simply order up an “AI” system and automatically become a “smart” factory with predictive capability. All of these technologies are rules-based at heart, and defining those rules takes human effort. For manufacturers to benefit from predictive technology, they must be able to clearly identify and specify what it is they need to predict, then build a customized solution from there. The process by which you reach clarity in specifying those goals and objectives will be distinct to each organization, but one thing is certain: the businesses of the future aren’t just going to need state-of-the-art engines, pumps, processes, and systems to succeed. They’re going to need state-of-the-art assets and the predictive intelligence to maintain, service, and optimize them.
John Crupi is Vice President of IoT Analytics for Greenwave Systems.