Fog Computing and Business Continuity

In Big Data by Daniel Newman

Data storage and processing are moving to the cloud, bringing faster, more efficient systems. The ever-growing Internet of Things (IoT) pushes processing even further from the center, out to the collection points themselves, in a model known as fog computing. Though fog computing advances analytics and reduces latency, it brings hardware back into the equation: glitches and natural disasters can destroy devices or cause malfunctions, so businesses need a hybrid solution.

Outsource to the Edge

Quickly evolving business landscapes require elasticity and agility in people, ideas, and technology systems. The cloud emerged as an ideal complement to existing systems, providing exactly that elasticity and agility. Extremely scalable, cost-effective, and secure, the cloud has increased connectivity, processing capability, and storage options. In turn, it paved the way for the rapid growth of the IoT.

The cloud’s increased processing and storage capabilities created the infrastructure smart devices need to shine, driving major growth in the IoT sphere. Experts estimate that businesses will invest $6 trillion in IoT solutions between 2015 and 2020. That works out to more than $1 trillion a year, but in my view the expenditure is justified: current projections put 24 billion IoT devices in active use by 2020.

Today, the expanding IoT demands faster, more efficient solutions than even the cloud provides. The IoT is producing more data than the cloud can reasonably transfer and process efficiently. Estimates say that the IoT generates 2.5 quintillion bytes of data daily. Information even a few days old can lose its value.

And this is where fog computing, also called edge computing, comes in. Fog computing relies on individual nodes to be self-sufficient: because they need no constant central management, these outlying processes run quickly. By thinning data at the collection site, nodes forward only the important information, reducing latency and streamlining processing.
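The data-thinning idea can be made concrete with a small sketch. This is a hypothetical illustration, not any vendor's API: the `Reading` type, the `thin_readings` function, and the deviation threshold are all assumptions I am introducing for the example. An edge node keeps only readings that deviate meaningfully from a baseline and collapses the rest into a single summary value, so far less data travels to the cloud.

```python
# Hypothetical sketch of edge-side data thinning. All names here
# (Reading, thin_readings, the threshold value) are illustrative
# assumptions, not a real fog-computing API.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def thin_readings(readings, baseline, threshold=2.0):
    """Keep only readings that deviate from the baseline by more than
    `threshold`; summarize everything else into a single average."""
    significant = [r for r in readings if abs(r.value - baseline) > threshold]
    average = sum(r.value for r in readings) / len(readings)
    return significant, average

readings = [Reading("t1", 20.1), Reading("t2", 25.9), Reading("t3", 19.8)]
significant, avg = thin_readings(readings, baseline=20.0)
# Only the outlier (t2) is forwarded in full; the rest collapse to one average.
```

Three raw readings become one forwarded outlier plus one summary number; at IoT scale, that kind of reduction is what keeps latency down and the uplink uncluttered.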

Update Your Disaster Recovery Plans for Edge Computing

Cloud computing provides an easy path to disaster recovery. By storing data off-site rather than in on-premises racks, businesses can recover far more quickly and efficiently from localized natural disasters and power fluctuations. A flood taking out a company’s basement is still a problem, but it no longer has to be a crippling one.

Fog computing places processing in devices close to the collection site. This is necessary, as the cloud is reaching the ceiling of what it can accomplish from afar. Businesses may be concerned, however, about relying on nodes so far from centralized systems: a major incident could wipe out every miniature processor in a city, causing huge data losses. Though the benefits of edge computing are vast, device use and management bring a new set of issues.

Local and remotely hosted data backups remain just as critical with edge computing, as does ensuring the IT department can access those backups in an emergency.

Embrace a Cloud-Fog Hybrid

Businesses are always looking for simple, across-the-board solutions. In the current technological climate, however, that is not possible for many of them. Hybrid, multilayered approaches to data storage and processing far outshine singular solutions at maintaining productivity and preserving business continuity in the face of a data emergency.

With IoT growing and edge computing on the horizon, we are seeing the emergence of a new set of hybrid solutions. Fog computing is valuable and should be leveraged to increase data flow and efficiency. However, businesses must be aware of its limits and dangers. Each individual business model should tailor a combination of cloud, fog, and on-site options to fit. A hybrid model best addresses the struggle to maintain efficiency while mitigating risks.

The cloud is still an ideal disaster recovery method. By removing reliance on local hardware, it keeps information out of harm’s way. With data-thinning techniques, outlying nodes can handle fast-paced tasks and relay vital information back to centralized cloud systems for safe storage, letting the edge and cloud work together seamlessly. Because data is managed at the node and only important information is retained, central servers are no longer inundated with storage and analysis work, which brings cost benefits as well.
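The continuity half of this edge-cloud partnership can also be sketched. Again, this is a minimal illustration under my own assumptions: the `EdgeNode` class and its `send_to_cloud` callable are hypothetical, standing in for whatever uplink a real deployment uses. The point is the pattern: vital records go to central cloud storage, but when the uplink fails they queue locally on the node until the connection returns, so an outage interrupts flow without losing data.

```python
# Hypothetical sketch of an edge node relaying vital records to cloud
# storage with a local fallback queue. EdgeNode and send_to_cloud are
# illustrative assumptions, not a real product's API.
from collections import deque

class EdgeNode:
    def __init__(self, send_to_cloud):
        self.send_to_cloud = send_to_cloud  # callable; raises ConnectionError on failure
        self.local_queue = deque()          # on-device buffer for outage periods

    def relay(self, record):
        # First, try to flush anything queued during an earlier outage.
        while self.local_queue:
            try:
                self.send_to_cloud(self.local_queue[0])
                self.local_queue.popleft()
            except ConnectionError:
                break  # uplink still down; stop flushing
        # Then send the new record, falling back to the local queue.
        try:
            self.send_to_cloud(record)
        except ConnectionError:
            self.local_queue.append(record)  # hold locally until uplink returns
```

In practice a node would persist that queue to durable storage, but even this simple version captures why the hybrid works: the edge keeps operating through a disruption, and the cloud remains the system of record once connectivity resumes.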

A combination of on-site, off-site, and cloud storage remains the safest bet for disaster recovery strategies. This does not negate the value of fog computing. I encourage companies to be realistic about the strengths and weaknesses of every platform and data solution. By using multilayered thinking, companies are more likely to avoid trouble down the road.

Use Multilayered Solutions 

Last November, I discussed the advantages of pursuing a hybrid cloud (both public and private), but that was only the first layer of the solution. Edge computing is a step toward a new kind of hybrid, one that incorporates both cloud and device. Relying on the cloud doesn’t have to mean throwing out the devices that streamline your business, but it does mean deploying more sophisticated, multilayered solutions. Recognizing the strengths and weaknesses of both the cloud and the edge is vital for getting the most out of regular operations and disaster recovery alike.

This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.

Photo Credit: ICE Solution via Compfight cc

Daniel Newman is the Principal Analyst of Futurum Research and the CEO of Broadsuite Media Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring digital transformation and how it is influencing the enterprise. From big data to IoT to cloud computing, Newman makes the connections between business, people, and tech that companies need to benefit most from their technology projects, which leads to his ideas being regularly cited in CIO.com, CIO Review, and hundreds of other sites across the world. A five-time bestselling author, most recently of “Building Dragons: Digital Transformation in the Experience Economy,” Daniel is also a Forbes, Entrepreneur, and Huffington Post contributor. An MBA and graduate adjunct professor, Daniel Newman is a Chicago native whose speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
