A great guest post by Maurice Mortell, Managing Director of Ireland and Emerging Markets, Equinix.
We have been following the progress of “fog computing” for a while, and as more enterprises realise its benefits, a variety of fog solutions have come onto the market.
But what is fog computing, or “fogging”, as it is sometimes called? In its latest analysis, “The Fog Rolls In: Network Architectures for IoT and Edge Computing,” Stratecast | Frost & Sullivan provides some clarity about what fog computing is, the applications that need it and the different ways it can be delivered.
The term was coined by Cisco and describes a compute and network framework for supporting Internet of Things (IoT) applications – though it isn’t exclusive to IoT.
Cloud computing has become a fundamental tool for today’s enterprises, with a centralised architecture that allows organisations to store and tap into data as and when they please. However, with the massive flow of time-sensitive data created by many IoT applications, often from far-reaching geographical locations, transmission of data through a remote cloud can create unwelcome delays that negatively impact performance.
The solution is fog computing, its name indicating that compute resources sit close to the ground, or data sources, just like fog. Workloads are split between local and cloud environments: different “things” (i.e., sensor-equipped, network-connected devices) quickly transmit data to locally deployed “fog” or “edge” nodes, rather than communicating directly with the cloud. A subset of non-time-sensitive data is then forwarded from the fog nodes to a centralised cloud or data centre for further analysis and action.
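To make the split concrete, here is a minimal sketch of how a fog node might triage sensor data. The class and field names (`FogNode`, `alert_threshold_c`, and so on) are hypothetical illustrations, not part of any vendor's product: time-sensitive readings are acted on at the edge, while the rest are batched for later upload to a central cloud.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorReading:
    device_id: str
    temperature_c: float

@dataclass
class FogNode:
    """Hypothetical fog node: time-sensitive readings are handled
    locally; everything else is batched for the central cloud."""
    alert_threshold_c: float = 80.0
    local_alerts: List[SensorReading] = field(default_factory=list)
    cloud_batch: List[SensorReading] = field(default_factory=list)

    def ingest(self, reading: SensorReading) -> None:
        if reading.temperature_c >= self.alert_threshold_c:
            # Time-sensitive: act at the edge, avoiding a cloud round trip.
            self.local_alerts.append(reading)
        else:
            # Non-urgent: queue for periodic forwarding to the cloud.
            self.cloud_batch.append(reading)

    def flush_to_cloud(self) -> List[SensorReading]:
        # Hand the accumulated batch to the central cloud and reset it.
        batch, self.cloud_batch = self.cloud_batch, []
        return batch

node = FogNode()
node.ingest(SensorReading("sensor-1", 21.5))  # routine reading, batched
node.ingest(SensorReading("sensor-1", 85.0))  # urgent, handled locally
print(len(node.local_alerts), len(node.flush_to_cloud()))  # prints "1 1"
```

The design choice this illustrates is the one the article describes: only the smaller, non-urgent batch ever traverses the network, while latency-critical decisions never leave the edge.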
By placing some analytics functionality close to the source, fog and other edge architectures can reduce the amount of data traversing the network, minimising latency and costs. They also help protect against data loss, damage and cyberattacks.
Different Approaches to Fog Computing
Stratecast anticipates that enterprises will leverage industry leaders – such as network and cloud service providers, platform providers, and IT vendors – to provide fog computing tools and services that can be more easily and cost-effectively deployed. Network, system and cloud providers are working together to create fog computing solutions comprising edge equipment (i.e., network connectivity, processor capacity, security, management and analytics platforms) and software (i.e., management, monitoring, security and analytics software) based on open standards that enable seamless data sharing and processing between edge devices and the cloud.
Because fog computing is a framework, not an actual product, many enterprises have tried to “do it themselves,” only to find that getting it right demands considerable complexity, extra resources and cost. That has created a need for managed fog services, where a third-party provider deploys and manages fog nodes for enterprises. Stratecast also envisions a market for “three-tiered” fog computing, where customers can choose single-tenant server and storage resources while reducing costs by sharing other resources, such as analytics platforms.
At Equinix, we see that many enterprise fog computing challenges are similar to those enterprises face when deploying efficient compute and network architectures that may not be exclusive to IoT workloads. This is why we created an Interconnection Oriented Architecture™ (IOA™) to provide a framework for building mesh hybrid IT environments in which people, locations, clouds and data can be interconnected over high-speed, low-latency connections securely, efficiently and cost-effectively. We’re also building a “knowledge base” for enterprises, based on real-world use cases, that contains templates, patterns, architectural blueprints, cookbooks and deployment guides to help enterprises select, build and implement interconnection-centric architectures.