“Fog Computing”: Paradigm Shift or Just Another Buzzword?

Written by Joe Kozlowicz on Tuesday, October 18th 2016 — Categories: Cloud Hosting

Cloud computing has largely hit the mainstream. Your mom knows about it (at least vaguely — she’s probably asking you to help her put her pictures in the cloud). But IT progress continues to march on, and a new model of information processing is beginning to take shape: fog computing.

As smartphones continue to proliferate and smart devices take off, cloud servers have become integral to application service delivery, allowing remote rather than local processing. At the same time, business applications have embraced the cloud for its flexibility and freedom from hardware management, along with guaranteed uptime and greater connectivity.

So where does fog take over from cloud? When an army of connected devices requires constant processing power and connectivity. The Internet of Things is coming fast: according to IDC, by 2020 the IoT will expand to include 4 billion people online, 25 million or more apps, 25 billion embedded and intelligent systems, and 50 trillion gigabytes of data.

Fog computing is a way to manage some of those bandwidth and processing demands by splitting duties between local devices and remote data centers. It should sound familiar if you know the hybrid cloud model, which balances onsite virtualization with hosted resources from cloud providers.

The biggest problem solved by fog computing is latency. In some applications, even minuscule delays in transferring data over the network can have significant negative effects. Any industrial application involving a malfunctioning piece of equipment falls into this category: in the time it takes for data to travel to a central cloud, be processed and analyzed, and return to the originating facility, a batch of product could be ruined.
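As a rough illustration of that split, here is a minimal Python sketch of the local side of the loop: the fog node makes the time-critical decision itself and only queues raw readings for later upload to the cloud. The sensor, threshold, and function names are hypothetical placeholders, not part of any particular fog platform.

import queue
import time

# Hypothetical threshold for this sketch; a real system would tune this per machine.
VIBRATION_LIMIT = 7.5

# Readings destined for the central cloud; uploading happens off the critical path.
cloud_upload_queue = queue.Queue()

def read_sensor():
    """Stand-in for reading a piece of industrial equipment."""
    return {"timestamp": time.time(), "vibration": 8.1}

def shut_down_line():
    """Stand-in for a local control action taken on the factory floor."""
    print("Local fog node: vibration limit exceeded, halting the line now.")

def process_locally(reading):
    # The decision happens on the local node, so there is no round trip to the cloud.
    if reading["vibration"] > VIBRATION_LIMIT:
        shut_down_line()
    # The raw reading still reaches the cloud eventually for analysis and storage.
    cloud_upload_queue.put(reading)

process_locally(read_sensor())

The point is simply where the decision gets made: the cloud still receives the data, just not on the path that determines whether the batch is saved.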

Other applications store some data locally in case the network connection is spotty. Uber, for example, keeps some information on drivers’ phones so service is not interrupted.

Fog computing also makes use of edge networks and edge data center facilities. The primary goal is to place data wherever it makes the most sense at a given moment to maintain uptime and a positive user experience. Organizations like Cisco have coined the term “fog node” for any device with compute resources, storage, and connectivity that can be used for local processing. It doesn’t have to be a server: fog nodes can also be industrial equipment, network gear like switches and routers, or even surveillance cameras.

At home, an example might be your laptop downloading updates for your smart home hub, the dongles and attachments for that hub, your Wi-Fi-enabled refrigerator, and your Sonos system all at once, then sharing those updates over your local network.

Bandwidth costs can quickly add up when you transfer large volumes of data to and from a cloud provider. As more of your business involves connected devices, keeping some of that data local might make sense.

The key concept with fog nodes is temporality. Data stored locally lives there only for a brief period, almost certainly less than a day, before being sent on to the primary cloud environment for long-term storage and processing.
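To make that concrete, here is a hedged sketch of how a fog node might enforce that short local lifetime: readings sit in a local buffer, and anything older than a retention window is flushed up to the cloud and dropped locally. The retention period and the upload_to_cloud() call are assumptions for illustration, not any specific provider’s API.

import time
from collections import deque

# Illustrative retention window: data lives locally for minutes or hours, not days.
LOCAL_RETENTION_SECONDS = 15 * 60

# (timestamp, reading) pairs held on the fog node until they age out.
local_buffer = deque()

def upload_to_cloud(batch):
    """Placeholder for whatever ingestion endpoint the primary cloud exposes."""
    print(f"Forwarding {len(batch)} aged-out readings to the central cloud")

def record(reading):
    local_buffer.append((time.time(), reading))

def flush_expired():
    """Send anything older than the retention window upstream, then drop it locally."""
    cutoff = time.time() - LOCAL_RETENTION_SECONDS
    batch = []
    while local_buffer and local_buffer[0][0] < cutoff:
        _, reading = local_buffer.popleft()
        batch.append(reading)
    if batch:
        upload_to_cloud(batch)

record({"device": "camera-07", "motion_detected": True})
flush_expired()  # in practice this runs on a timer; nothing ships until the window elapses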

 

We’ll see how fog computing develops (and whether the term sticks) as the IoT continues to mature. You might want to start thinking about adding fog processing power to your cloud initiatives if you collect data at the far edges of network connectivity, have connected devices numbering in the thousands to millions, or run applications that must analyze data and act on it in a second or less.