Even as I was writing this book, new technologies emerged — and they will continue to do so. It also quickly became apparent that I could never include all of them and still finish the book. Instead, I decided to highlight a few that I consider the most important and far enough along to write about. What comes after, you’ll have to discover on your own. That shouldn’t be too hard. Just stay involved with your industry association and/or check out industry and IoT conferences and trade shows once a year or so.
You’ve read my prior references to fog computing. Specifically, fog computing creates a platform — composed of what we call fog nodes — that provides a layer of compute, storage, control and networking services, and event stream processing between end devices on the ground and cloud computing datacenters. Fog isn’t a separate standalone architecture; instead, it extends and scales the existing cloud architecture all the way to the edge of the network, as close to the source of the data as possible.
The purpose is to enable real-time data processing and analytics of either large amounts of data or data in motion. The objective of fog computing isn’t connecting devices differently. Rather, it’s analyzing the data from the devices faster, with less latency and more efficiency. In effect, with fog computing we’re putting data processing closer to the devices that generate or collect that data (Figure 10.3), and then analyzing it right there in real time.
A few years ago, Flavio Bonomi — founder and CEO of Nebbiolo Technologies, which focuses on the application of IoT technologies in industrial automation — led the definition (and naming) of fog computing with his team. When I asked him about fog, he summarized it well: “As we started to work on projects such as connected vehicles, smart grids, and smart cities, we identified a common set of requirements for compact, scalable, well-managed, secured, and integrated networking, computing, and storage resources between the endpoints on the ‘ground’ and in more distant clouds. The term ‘fog computing’ was, in fact, naturally motivated by this need to bring more cloud-like capabilities ‘closer to the ground.’ In time, it became clear that fog computing actually facilitated the convergence of OT and IT and enabled new IoT use cases that required real-time capabilities, deterministic performance, physical security, and safety. Since it inherits elements from both IT and OT, fog computing naturally mediates between both domains at the various levels of the stack, from networking to security to the data level to the application level.”
So what’s the big deal about fog computing? At first glance, it doesn’t appear to be all that different. In truth, however, it amounts to a distinct innovation. Fog computing (Figure 10.4) brings analytics and processing to the data. That’s the difference, and it’s a big difference. In the past, we always brought the data to where the processing occurred. That generally meant sending information to some distant central datacenter, which added cost and significant delays. Now, with fog computing, we can scale the cloud and make it viable for real-time use-cases — the cloud and the edge can work together as an integrated system. Cloud software can send a policy to the fog node, requesting only certain types of data or only the exceptions to, say, a temperature threshold. The data is processed in the fog node based on this policy and only these exceptions, and the specific data requested is sent back to the cloud. The rest of the data is either stored locally in the fog or discarded.
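The policy-driven exchange described above can be sketched in a few lines of Python. This is an illustrative example rather than code from any real fog platform: the `Policy` and `FogNode` names and the temperature-threshold rule are assumptions chosen to mirror the scenario in the text, where the cloud pushes a policy down and only exceptions travel back up.

```python
from dataclasses import dataclass, field

@dataclass
class Policy:
    """A filtering rule pushed down from the cloud to a fog node."""
    metric: str       # which sensor field to watch, e.g. "temp_c"
    threshold: float  # readings above this value are "exceptions"

@dataclass
class FogNode:
    policy: Policy
    local_store: list = field(default_factory=list)    # kept (or discarded) at the edge
    sent_to_cloud: list = field(default_factory=list)  # only exceptions cross the network

    def ingest(self, reading: dict) -> None:
        """Process one sensor reading at the edge, per the cloud's policy."""
        if reading.get(self.policy.metric, float("-inf")) > self.policy.threshold:
            self.sent_to_cloud.append(reading)  # exception: forward to the cloud
        else:
            self.local_store.append(reading)    # normal: handle locally

# The cloud asks only for temperature readings above 90 °C.
node = FogNode(Policy(metric="temp_c", threshold=90.0))
for r in [{"temp_c": 72.0}, {"temp_c": 95.5}, {"temp_c": 88.1}]:
    node.ingest(r)

print(len(node.sent_to_cloud))  # 1 exception forwarded; the other readings stayed local
```

The design choice this illustrates is the one the chapter emphasizes: the decision logic lives at the edge, so bandwidth is spent only on data the cloud has explicitly asked for.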
As a result, we can convert the raw data collected from connected devices into useful information that can be acted on immediately — often in real or near-real time. Because fog computing removes the latency from an IoT transaction, things can happen that fast. From there, we can also convert that information into valuable business insights through new applications, including real-time analytics and predictive context.
In short, fog computing brings:
- Near-real-time or real-time processing and analytics capabilities to the edge of the cloud
- Processing and analytics closer to where the data is generated and used
- Much faster and more efficient analytics via a policy-based edge-to-cloud-to-edge system
Consider that the first stage of the Internet focused mainly on batch processing, wasn’t time sensitive, and didn’t use machines that consumed a lot of bandwidth. Now consider that even a single automobile can generate a huge amount of data and requires serious bandwidth — especially because that data is more time-sensitive and, therefore, even more important. (As an example, ask yourself how long you have to react if your car starts to overheat.) Enter fog computing, which solves some of today’s most common challenges, including:
- High latency on the network
- End-point mobility
- Loss of connectivity
- High bandwidth costs
- Unpredictable bandwidth bottlenecks
- Broad geographic distribution of systems and clients
As we’ve discussed throughout this book, fog computing is a key enabler of IoT, and it’s driving an array of new use-cases in every area of life and industry — from retail to healthcare to oil and gas exploration and production. Preventive vehicle maintenance is one example. The sensors in each new connected vehicle generate up to two petabytes of data each year. It would be impractical and prohibitively expensive to send all of this raw data over the mobile network to the cloud for real-time processing.
Fog computing turns these vehicles into mobile datacenters that can sort and index the data in real time and send alerts when action is required — for example, checking an overheated engine or filling an underinflated tire.
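As a rough illustration of that in-vehicle filtering, here is a minimal Python sketch. The sensor names, thresholds, and the `check_reading` helper are hypothetical, chosen only to match the overheated-engine and underinflated-tire examples above; a real vehicle platform would use its own telemetry schema.

```python
from typing import Optional

# Hypothetical alert rules the vehicle's fog node evaluates locally,
# so only readings that require action generate network traffic.
ALERT_RULES = {
    "coolant_temp_c": lambda v: v > 110.0,    # overheated engine
    "tire_pressure_psi": lambda v: v < 28.0,  # underinflated tire
}

def check_reading(sensor: str, value: float) -> Optional[str]:
    """Return an alert string if the reading needs action, else None."""
    rule = ALERT_RULES.get(sensor)
    if rule is not None and rule(value):
        return f"ALERT {sensor}={value}"
    return None

# Three sample readings; only the out-of-range ones produce alerts.
alerts = [a for a in (
    check_reading("coolant_temp_c", 118.0),
    check_reading("tire_pressure_psi", 32.0),
    check_reading("tire_pressure_psi", 25.5),
) if a is not None]

print(alerts)  # two alerts: the hot engine and the soft tire
```

The normal readings never leave the vehicle, which is precisely why sending petabytes of raw data to the cloud becomes unnecessary.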
The industry has recognized the transformational capability of fog computing to enable a new wave of use-cases that weren’t possible with cloud-centric implementations — hence the November 2015 creation of the OpenFog Consortium.
“We formed OFC to accelerate the adoption of fog computing to solve pressing bandwidth, latency, and communications challenges associated with IoT, artificial intelligence, robotics, and other advanced concepts in the digitized world,” OFC Chairman Helder Antunes told me. “Our technical workgroups are creating an OpenFog architecture that enables end-user clients or near-user edge devices to carry out computation, communication, control, and storage. And we plan to accomplish these goals in a collaborative manner, where interoperability between technology vendors is also ensured.”
Excerpted from BUILDING THE INTERNET OF THINGS: Implement New Business Models, Disrupt Competitors, Transform Your Industry by Maciej Kranz. Copyright © 2016, Wiley.