Plant connectivity brings the ability to drive operational intelligence and improve collaboration in food and beverage manufacturing. Connectivity isn’t just about end-point connections; it’s also about connecting applications and, ultimately, connecting people. It integrates supervisory control and data acquisition (SCADA) software with historian software and helps aggregate, merge, and analyze big data to generate a strong return on investment (ROI). In fact, according to IDC, global big data technology and services revenue will grow to $23.76B in 2016. Connectivity will not only eliminate silos of automation in the food and beverage industry but will also enable the retrieval and use of valuable data in meaningful ways.
Connectivity is easier and far less expensive to achieve than before, thanks to the standardization of protocols and procedures. A big differentiator is wireless connectivity, which enables geographically distributed deployments that were historically impractical: changing slow, physical networks to accommodate new sensors was too expensive to undertake. That has changed. End points are no longer limited to single-value sensors; they can be more complex devices that provide their own data intelligence.
HMI/SCADA and historian technology connects to an array of data sources, visualizes that data for monitoring and control, and then creates histories of it for analysis. Connectivity enables data sources from a PLC (programmable logic controller) in a manufacturing environment to an OPC (Open Platform Communications) server in a data center to work together. Data is visualized by the HMI/SCADA software in real time to help with immediate decision-making, tie in to fault-state alarming, and/or provide points for trending. That same data must also be recorded and then analyzed using a scalable, robust plant data historian, such as those from AspenTech or OSIsoft. Big data plays a central role in fully realizing the value of plant-wide connectivity.
In today’s connected plant, the ability to deploy on-premises or in the cloud, quickly index multiple data historians (e.g. OSIsoft PI, Yokogawa Exaquantum, and AspenTech IP.21), and leverage business/operational intelligence is ideal. Sophisticated analytics solutions can simply layer on top of modern trending, analysis, and collaboration tools to identify previously unknown business and operational connections.
From classic distributed control systems (DCS) to Industry 4.0 automation, the standardization of protocols enables the creation of more open systems. For example, OPC Unified Architecture (OPC UA) is platform- and firewall-independent. It is the industrial machine-to-machine (M2M) communication protocol for interoperability developed by the OPC Foundation and the successor to the original OPC specifications.
By combining multiple modules/applications from different vendors, customers are moving faster than they would by sticking to a single platform. Open systems also enable decentralized intelligence; analyzing and using data doesn’t have to happen in one specific location if systems can talk to each other. Add APIs (application programming interfaces) on top and you can run applications on-premises or in the cloud without being restricted to either. The challenge is cybersecurity: does creating open systems bring extra risks? Simple measures such as one-way data diodes can be used here.
Process performance improves with the implementation of operational intelligence. Such intelligence can leverage search capabilities. For instance, structured time-series process data combined with operator data and expertise enables users to predict more precisely what is occurring, or what will likely occur, within food and beverage manufacturing processes.
Plant connectivity extends the integration of business processes to factory-floor applications, enables machine-to-machine interoperability, and helps drive analytics into operations for cost control, improved quality, asset optimization, and increased production.
Integrating in-depth knowledge of both process operations and data analytics techniques can minimize the need for complex and engineering-intensive data modeling. Human intelligence can turn into machine intelligence and deliver value from the operational data already collected.
Predictive analytics can provide users with valuable insights about what will happen in the future based on historical data, both structured and unstructured.
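As a minimal, vendor-neutral sketch of this idea, the snippet below fits a linear trend to an equally spaced historical sensor series and extrapolates it forward. The function name and data are hypothetical, and real predictive analytics systems use far richer models than a straight line.

```python
import numpy as np

def forecast_linear(history, steps_ahead):
    """Fit a least-squares linear trend to an equally spaced historical
    series and extrapolate it steps_ahead samples into the future."""
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, history, deg=1)  # least-squares line
    future_t = np.arange(len(history), len(history) + steps_ahead)
    return slope * future_t + intercept

# Example: a temperature that drifts upward by 0.5 degrees per sample.
temps = 20.0 + 0.5 * np.arange(10)
print(forecast_linear(temps, 3))  # continues the trend: ~25.0, 25.5, 26.0
```

A production system would, of course, validate the model against held-out history and account for seasonality and process state, but the principle of learning from recorded data carries over.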
Better approaches today arrive as packaged, on-premises virtual-server deployments that integrate easily with the local copy of the plant historian’s database archives and can evolve over time toward a scalable architecture that blends in with available enterprise distributed-computing platforms. This newer approach uses “pattern search-based discovery and predictive-style process analytics” targeting the average user. Such tools are relatively easy to deploy and use, giving organizations the potential to gain immediate value without a big data or modeling solution and with no data expert required.
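The core of pattern search-based discovery can be illustrated with a toy, brute-force version: slide a query pattern over a historian series and rank every window by its distance to the query. The function and data below are hypothetical; commercial tools use indexing and more robust similarity measures to make this fast at plant scale.

```python
import numpy as np

def find_similar_patterns(series, query, top_k=3):
    """Naive pattern search: slide the query over the series and rank
    every window by Euclidean distance to the query (smaller = more
    similar). Returns the start indices of the top_k best matches."""
    m = len(query)
    dists = np.array([np.linalg.norm(series[i:i + m] - query)
                      for i in range(len(series) - m + 1)])
    return np.argsort(dists)[:top_k]

# Example: a sawtooth-like batch profile embedded twice in a flat series.
profile = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
series = np.concatenate([np.zeros(5), profile, np.zeros(5), profile, np.zeros(5)])
print(find_similar_patterns(series, profile, top_k=2))  # indices 5 and 15
```

An engineer could use the same idea to ask “when did the plant last behave like this?” by taking a window of live data as the query against years of archived history.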
Today there is a kind of “Google search for the process industries.” Operator shift logs become searchable in the context of historian data and process information. At a time when the process industries may face as much as a 30 percent decline in the skilled workforce as workers retire, knowledge capture is a key imperative for many industrial organizations.
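A hypothetical sketch of what “searchable shift logs in context” can mean in practice: keyword search over timestamped operator entries, with each hit paired with the historian samples recorded around that time. All names and data below are illustrative, not any vendor’s API.

```python
from datetime import datetime, timedelta

# Hypothetical data: operator shift-log entries and hourly historian samples.
shift_log = [
    (datetime(2016, 3, 1, 2, 15), "pump 7 cavitation, reduced flow"),
    (datetime(2016, 3, 1, 9, 40), "routine CIP cycle started"),
    (datetime(2016, 3, 1, 14, 5), "pump 7 vibration alarm, swapped to spare"),
]
historian = {datetime(2016, 3, 1, h): 50.0 + h for h in range(24)}  # flow, t/h

def search_log(keyword, window=timedelta(hours=1)):
    """Return log entries matching the keyword, each paired with the
    historian samples that fall within +/- window of the entry."""
    hits = []
    for ts, text in shift_log:
        if keyword.lower() in text.lower():
            context = {t: v for t, v in historian.items()
                       if abs(t - ts) <= window}
            hits.append((ts, text, context))
    return hits

for ts, text, context in search_log("pump 7"):
    print(ts, "|", text, "|", sorted(context))
```

Even this trivial join lets a new operator pull up what a retiring colleague wrote the last time a pump misbehaved, alongside the process values at that moment.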
Connectivity and People
Historically, there were data islands, and the organizations surrounding them were silos. Today that is no longer the case. All systems are connected, which opens up opportunities to collaborate across these organizational silos. There will be a need to retrain or hire people in order to successfully adapt to a connected ecosystem. These new open systems and the collaboration they enable will change the way people work, combining best-of-breed solutions with continuous innovation.
Changing technology is easy, but the cultural change in the way people work will be the challenge ahead. Changing an organization’s culture is one of the most difficult leadership obstacles, because an organization’s culture encompasses an interlocking set of goals, roles, processes, values, communication practices, attitudes, and assumptions.
Leveraging Data in the Connected Plant
The advancements in connectivity are creating a competitive advantage for manufacturers and other industrial organizations. Operators now have a different set of tools available to improve plant availability and asset effectiveness. There is an immediate need to search time-series data and analyze it in context with the annotations made by both engineers and operators in order to make faster, higher-quality process decisions. If users want to predict process degradation or an asset or equipment failure, they need to look beyond time-series and historian data tools and be able to search, learn by experimentation, and detect patterns in the vast pool of data that already exists in a plant.
Process historians have been useful for storing process data and connecting to real-time systems. However, analyzing that data in tools such as Microsoft Excel can be too time-consuming and limited in functionality. The tools typically used to visualize and interpret process data are trending applications, reports, and dashboards. These have been helpful, but they are generally not suited for advanced diagnostics or predicting outcomes.
Today's manufacturers are cost-constrained, must deal with the erosion of knowledge and talent, and seek cost-effective ways to get value out of the data already generated at the plant. ARC Advisory Group believes the next generation of analytics systems, such as those offering “Google-like” search capabilities, will be welcomed by process industry end users. Such systems, which give manufacturers greater insight into streamlining production, reducing time to market, and increasing product quality, will be an essential way to stay competitive. A.T. Kearney forecasts that global spending on big data hardware, software, and services will grow at a CAGR of 30% through 2018, reaching a total market size of $114B. With more and more plants connecting all of their systems and assets and deploying IoT initiatives, manufacturers are realizing they must move away from traditional tactics toward analytics and data-driven strategies that deliver measurable results.
TrendMiner, with U.S. headquarters in Houston, Texas, delivers discovery, diagnostic, and predictive analytics with real-time monitoring tools for the process industry. Its flagship software is based on a high-performance analytics engine for process data. Through an intuitive web-based trend client, process engineers and operators can easily search for trends using TrendMiner’s patent-pending pattern recognition and machine learning big data technologies, without the support of a data scientist. The TrendMiner plug-and-play solution adds value immediately after deployment, eliminating expensive investments in big data infrastructure and long implementation projects. TrendMiner, founded in 2008, is a software company with global headquarters in Belgium and offices in the Netherlands, Germany, Spain, and the U.S. Visit us at: www.trendminer.com