Fog computing extends the concept of cloud computing to the network edge, making it ideal for internet of things (IoT) and other applications that require real-time interactions.

Fog computing is the concept of a network fabric that stretches from the outer edges of where data is created to where it will eventually be stored, whether that's in the cloud or in a customer's data center. Fog is another layer of a distributed network environment and is closely associated with cloud computing and the internet of things (IoT). Public infrastructure as a service (IaaS) cloud vendors can be thought of as a high-level, global endpoint for data; the edge of the network is where data from IoT devices is created. Fog computing is the idea of a distributed network that connects these two environments.

"Fog provides the missing link for what data needs to be pushed to the cloud, and what can be analyzed locally, at the edge," explains Mung Chiang, dean of Purdue University's College of Engineering and one of the nation's top researchers on fog and edge computing.

According to the OpenFog Consortium, a group of vendors and research organizations advocating for the advancement of standards in this technology, fog computing is "a system-level horizontal architecture that distributes resources and services of computing, storage, control and networking anywhere along the continuum from Cloud to Things."

Benefits of fog computing

Fundamentally, the development of fog computing frameworks gives organizations more choices for processing data wherever it is most appropriate to do so. For some applications, data may need to be processed as quickly as possible – for example, in a manufacturing use case where connected machines need to respond to an incident as soon as possible.

Fog computing can create low-latency network connections between devices and analytics endpoints. This architecture in turn reduces the amount of bandwidth needed compared with sending that data all the way back to a data center or cloud for processing. It can also be used in scenarios where there is no bandwidth to send data elsewhere, so it must be processed close to where it is created. As an added benefit, users can place security features in a fog network, from segmented network traffic to virtual firewalls that protect it.
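To make that trade-off concrete, here is a minimal sketch in Python of how a fog node might decide what to handle locally and what to send upstream. The FogNode class, its temperature threshold and the forward_to_cloud() stub are illustrative assumptions rather than any vendor's API: latency-sensitive readings trigger an immediate local response, while routine readings are aggregated so that only a compact summary crosses the network.

```python
# Hypothetical sketch of a fog node's local-versus-cloud decision.
# Class names, thresholds and forward_to_cloud() are illustrative
# assumptions, not part of any real fog computing SDK.

from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class SensorReading:
    device_id: str
    temperature_c: float


class FogNode:
    """Processes IoT readings locally and forwards only what the cloud needs."""

    def __init__(self, alert_threshold_c: float = 90.0, batch_size: int = 100):
        self.alert_threshold_c = alert_threshold_c
        self.batch_size = batch_size
        self.buffer: List[SensorReading] = []

    def ingest(self, reading: SensorReading) -> None:
        # Latency-sensitive path: react locally, without a round trip to the
        # cloud (e.g., a machine overheating on a factory floor).
        if reading.temperature_c >= self.alert_threshold_c:
            self.act_locally(reading)

        # Bandwidth-saving path: buffer routine readings and send only a
        # compact summary upstream instead of every raw data point.
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            summary = {
                "count": len(self.buffer),
                "avg_temperature_c": mean(r.temperature_c for r in self.buffer),
                "max_temperature_c": max(r.temperature_c for r in self.buffer),
            }
            self.forward_to_cloud(summary)
            self.buffer.clear()

    def act_locally(self, reading: SensorReading) -> None:
        print(f"local action: shut down {reading.device_id}")

    def forward_to_cloud(self, summary: dict) -> None:
        # Placeholder for an upstream call (MQTT, HTTPS, etc.).
        print(f"forwarding summary to cloud: {summary}")


if __name__ == "__main__":
    node = FogNode()
    for i in range(250):
        node.ingest(SensorReading(device_id="press-7", temperature_c=60 + (i % 40)))
```

In a real deployment the forwarding step would hand off to whatever transport the fog fabric uses, but the shape of the decision – act at the edge when latency matters, summarize before sending upstream when bandwidth matters – stays the same.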
Applications of fog computing

Fog computing is in the nascent stages of being rolled out in formal deployments, but a variety of use cases have been identified as potential ideal scenarios for fog computing.

Connected cars: The advent of semi-autonomous and self-driving cars will only increase the already large amount of data vehicles create. Having cars operate independently requires a capability to locally analyze certain data in real time, such as surroundings, driving conditions and directions. Other data may need to be sent back to a manufacturer to help improve vehicle maintenance or track vehicle usage. A fog computing environment would enable communications for all of these data sources both at the edge (in the car) and at its endpoint (the manufacturer).

Smart cities and smart grids: Like connected cars, utility systems increasingly use real-time data to run more efficiently. Sometimes this data is generated in remote areas, so processing it close to where it's created is essential. Other times the data needs to be aggregated from a large number of sensors. Fog computing architectures could be devised to solve both of these issues.

Real-time analytics: A host of use cases call for real-time analytics, from manufacturing systems that need to react to events as they happen to financial institutions that use real-time data to inform trading decisions or monitor for fraud. Fog computing deployments can help facilitate the transfer of data between where it is created and the variety of places it needs to go.

Fog computing and 5G mobile computing

Some experts believe the expected rollout of 5G mobile connections in 2018 and beyond could create more opportunity for fog computing. "5G technology in some cases requires very dense antenna deployments," explains Andrew Duggan, senior vice president of technology planning and network architecture at CenturyLink. In some circumstances, antennas need to be less than 20 kilometers from one another. In a use case like this, a fog computing architecture could be created among these stations, including a centralized controller that manages applications running on the 5G network and handles connections to back-end data centers or clouds.

How does fog computing work?

A fog computing fabric can have a variety of components and functions. It could include fog computing gateways that accept data IoT devices have collected. It could include a variety of wired and wireless granular collection endpoints, including ruggedized routers and switching equipment. Other aspects could include customer premises equipment (CPE) and gateways to access edge nodes. Higher up the stack, fog computing architectures would also touch core networks and routers and, eventually, global cloud services and servers.

The OpenFog Consortium, the group developing reference architectures, has outlined three goals for developing a fog framework. Fog environments should be horizontally scalable, meaning they support multiple industry vertical use cases; be able to work across the cloud-to-things continuum; and be a system-level technology that extends from things, over network edges, through to the cloud and across various network protocols.

Are fog computing and edge computing the same thing?

Helder Antunes, senior director of corporate strategic innovation at Cisco and a member of the OpenFog Consortium, says that edge computing is a component, or a subset, of fog computing. Think of fog computing as the way data is processed from where it is created to where it will be stored. Edge computing refers only to data being processed close to where it is created. Fog computing encapsulates not just that edge processing but also the network connections needed to bring the data from the edge to its endpoint.
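That distinction can be illustrated with a short, hypothetical Python sketch; the EdgeProcessor and FogGateway classes and the in-memory cloud endpoint are assumptions for illustration, not part of the OpenFog reference architecture. The edge piece is the processing that runs next to the device, and the fog gateway wraps that processing together with the decision of where its output travels next.

```python
# Hypothetical sketch of the layered path described above. Class names and the
# in-memory "cloud" list are illustrative assumptions, not a reference
# implementation of any fog standard.

from typing import Callable, List


class EdgeProcessor:
    """Edge computing: processing that happens right where data is created."""

    def process(self, raw_reading: float) -> dict:
        return {"reading": raw_reading, "anomaly": raw_reading > 100.0}


class FogGateway:
    """Fog computing: edge processing plus the network path to an endpoint."""

    def __init__(self, edge: EdgeProcessor, cloud_endpoint: Callable[[dict], None]):
        self.edge = edge
        self.cloud_endpoint = cloud_endpoint

    def handle(self, raw_reading: float) -> None:
        record = self.edge.process(raw_reading)   # processed at the edge
        if record["anomaly"]:
            print(f"handled locally: {record}")   # low-latency local action
        else:
            self.cloud_endpoint(record)           # routed onward to the cloud


cloud_store: List[dict] = []
gateway = FogGateway(EdgeProcessor(), cloud_store.append)

for value in (42.0, 120.5, 73.2):
    gateway.handle(value)

print(f"records forwarded to the cloud endpoint: {len(cloud_store)}")
```

Composing the two this way mirrors the distinction above: stripping out the gateway's network path leaves plain edge computing, while the fog layer is what carries results along the continuum toward the cloud or data center.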