David Linthicum
Contributor

When to use a fog computing node

analysis
Jul 21, 2020
3 mins
Cloud Computing

Fog nodes are the new ‘cool kid’ components for edge computing. However, we risk overusing them if we don’t understand their true purpose at the edge.

Think of a fog computing node as a physical server that resides between the edge devices (thermostats, robots, in-vehicle computers) and the back-end systems, typically hosted on public clouds. Fog nodes address an architectural problem: there is too much latency in passing requests all the way back to the public cloud-based services, and not enough horsepower to process the data on the edge device itself.

This three-tier system adds another compute platform that can handle some, if not most, of the back-end processing. It addresses the concern that cheaper, lower-powered edge devices lack the processing and storage capacity to deal with the incoming data natively. Now data can be sent to the fog node for processing, without the latency hit of going all the way back to the remote cloud services.
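Here is a minimal sketch of that three-tier flow in Python. The device names and the overheat threshold are hypothetical, and the cloud hand-off is stubbed out; the point is only that the fog node handles the latency-sensitive decision locally and ships a compact summary, not every raw reading, up to the cloud.

```python
import statistics
import time

# Hypothetical latency-sensitive threshold; real values depend on the workload.
OVERHEAT_THRESHOLD_C = 85.0


class FogNode:
    """Sketch of the middle tier: edge devices push readings to the fog node,
    which handles time-critical logic locally and batches the rest for the cloud."""

    def __init__(self):
        self.buffer = []  # readings waiting to be summarized for the cloud

    def handle_edge_reading(self, device_id: str, temperature_c: float) -> str:
        # Tier 2 (fog): respond immediately when latency matters.
        if temperature_c >= OVERHEAT_THRESHOLD_C:
            return f"SHUTDOWN sent to {device_id}"  # near-real-time path

        # Otherwise buffer the reading; the cloud only needs the summary.
        self.buffer.append(temperature_c)
        return "buffered"

    def flush_to_cloud(self) -> dict:
        # Tier 3 (cloud): send a compact aggregate instead of every raw reading.
        summary = {
            "count": len(self.buffer),
            "mean_c": statistics.mean(self.buffer) if self.buffer else None,
            "sent_at": time.time(),
        }
        self.buffer.clear()
        return summary  # in practice, an HTTPS or MQTT call to the cloud back end


if __name__ == "__main__":
    node = FogNode()
    print(node.handle_edge_reading("robot-7", 72.4))   # buffered
    print(node.handle_edge_reading("robot-7", 91.0))   # SHUTDOWN sent to robot-7
    print(node.flush_to_cloud())
```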

Although fog nodes are a simple solution to a simple issue, you should understand when and when not to use them:

You should use fog nodes when complex or high-volume data needs to be processed locally and would otherwise overwhelm the edge device that is consuming it. In other words, you need something that responds in near real time, such as a factory robot shutting down when the servers overheat. You want that to be instantaneous.

You should use fog nodes when a human is waiting for the response to return to the edge device. Like events that can’t wait, humans need to be treated as near-real-time architectural components. People are too expensive to leave waiting around for responses from remote cloud systems.

You should use fog nodes when you’re multiplexing data from several types of edge devices or handling several different types of data. The fog node manages the processing of data from several edge devices at once, dealing with things such as semantic transformation of the data before sending it to the back-end cloud servers. Or the fog node can process the data and respond directly to the edge device. A sketch of that multiplexing follows.
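The sketch below, again in Python, shows one way a fog node might normalize payloads from two hypothetical device types into a common schema before forwarding a single consistent batch to the cloud. The device types, field names, and unit conversions are illustrative assumptions, not any particular product’s API.

```python
from datetime import datetime, timezone

# Hypothetical transformers for two edge device types; field names are illustrative.
def from_thermostat(raw: dict) -> dict:
    return {
        "device": raw["id"],
        "metric": "temperature_c",
        "value": (raw["temp_f"] - 32) * 5.0 / 9.0,  # normalize Fahrenheit to Celsius
    }


def from_robot(raw: dict) -> dict:
    return {
        "device": raw["serial"],
        "metric": "motor_temp_c",
        "value": raw["motor_temp_c"],
    }


# The fog node multiplexes messages from several device types through one pipeline.
TRANSFORMERS = {"thermostat": from_thermostat, "robot": from_robot}


def multiplex(messages: list[tuple[str, dict]]) -> list[dict]:
    """Map each raw edge payload onto a common schema before it goes to the cloud."""
    unified = []
    for device_type, raw in messages:
        record = TRANSFORMERS[device_type](raw)
        record["received_at"] = datetime.now(timezone.utc).isoformat()
        unified.append(record)
    return unified  # one consistent batch for the back-end cloud servers


if __name__ == "__main__":
    batch = multiplex([
        ("thermostat", {"id": "t-101", "temp_f": 68.0}),
        ("robot", {"serial": "r-7", "motor_temp_c": 61.2}),
    ])
    for record in batch:
        print(record)
```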

Basically, you should not use fog nodes unless the architecture and requirements fit the above criteria. In my book, they should be used sparingly, especially considering that they add cost and operational complexity—and another layer that can fail.

Your best bet for now is to exclude them, and then figure out if you have situations where they can be of use.


David S. Linthicum is an internationally recognized industry expert and thought leader. Dave has authored 13 books on computing, the latest of which is An Insider’s Guide to Cloud Computing. Dave’s industry experience includes tenures as CTO and CEO of several successful software companies, and upper-level management positions in Fortune 100 companies. He keynotes leading technology conferences on cloud computing, SOA, enterprise application integration, and enterprise architecture. Dave writes the Cloud Computing blog for InfoWorld. His views are his own.
