New Network Architectures: Cloud and Fog Computing on the Edge
As the global volume of internet data increases, relying on the cloud for computing resources poses challenges. Edge and fog computing architectures offer alternatives.
The concept of sharing information technology (IT) resources via a network of linked devices has experienced incredible growth since the early 2000s. Instead of investing in local computing and storage equipment, users obtain these resources via connection to the internet. Cloud computing offers infrastructure and software as a subscription service rather than a purchased asset.
Three types of cloud computing services are in use today:
- Software as a service delivers applications, such as Microsoft’s Office 365 and Google’s Gmail.
- Infrastructure as a service provides virtual computing resources, such as the data storage services Dropbox and iCloud.
- Platform as a service provides development tools and environments for application developers.
Cloud computing services offer several compelling advantages:
- Much-reduced initial investment in IT equipment
- Elimination of costly replacement cycles for obsolete equipment
- Reduced equipment maintenance and related labor costs
- Automatic software updates to the latest revision level
- Stored data secured through encryption and redundancy
- Easy data sharing among users with cloud access, with 24/7 access from anywhere
- Elastic, nearly unlimited resources that expand and contract to meet current demand
- Instant access to computing power ranging from basic to supercomputer class
- A pay-per-use model that allows costs to be allocated proportionally to the departments using the service
- Immediate access to highly specialized programs for limited time periods
- Reduced local power consumption
The cloud has taken the computing world by storm. The past few years have seen dramatic expansion of cloud computing services at both the consumer and commercial business levels. Estimates of the total global market value of cloud services in 2019 range from $228 billion to $253 billion, and forecasts predict that value may reach $623 billion by 2023. The COVID-19 pandemic has further accelerated the utilization of cloud services, which may make these forecasts conservative.
Amazon Web Services leads the industry with 33% of the market, followed by Microsoft Azure with 18% and Google Cloud with 8%. Suppliers of cloud computing services have been among the very few beneficiaries of the pandemic, as millions of homebound residents have turned to the internet for work and play.
Cloud services do, however, come with some concerns and limitations. Access to cloud-based programs and storage is limited by the speed of the local internet connection. Concentrating all IT resources at a central data center also raises concerns about availability and data loss in the event of a major equipment failure or natural disaster at that site. There are also questions about intellectual property rights when data is stored on a server owned by another company.
The most serious concern revolves around the security of data stored in a public cloud. One solution is the rise of the private cloud, which is typically owned and operated by a large company exclusively for the use of its employees. Sensitive data never enters the public internet or third-party servers. Companies may also create hybrid clouds where sensitive data is secured on the private cloud, while ordinary communications are transacted on the public cloud.
The cloud plays a key role in the implementation of the Internet of Things (IoT), which has exposed some issues with the centralized structure of cloud computing. With potentially billions of new connected devices trying to communicate through the cloud, a remote server architecture may not be the most efficient approach. Cloud-centric communication requires adequate connectivity, which may not be available in some mobile applications and rural locations. Clouds also assume that each node will have adequate bandwidth to support the application, which is a particular challenge for factory automation applications that can generate immense volumes of data. Finally, data transmission to and from a remote cloud introduces latency, which may be acceptable in some consumer applications but is unacceptable for real-time, high-speed process control, medical, and machine-to-machine applications. These real-world constraints significantly reduce the efficiency of a centralized computing model. Several alternatives offer a more decentralized computing infrastructure.
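To put the latency argument in rough numbers, the short Python sketch below compares a cloud round trip against fog and edge handling. The distances, hop counts, and processing times are illustrative assumptions for a back-of-envelope comparison, not measurements of any particular network.

```python
# Back-of-envelope latency comparison (illustrative figures, not measurements):
# a remote cloud round trip versus a nearby fog node versus on-device handling.

def round_trip_ms(distance_km, per_hop_ms, hops, processing_ms):
    """Rough round-trip delay: ~0.005 ms/km fiber propagation each way,
    plus per-hop switching/queuing each way, plus server processing time."""
    propagation = 2 * distance_km * 0.005
    switching = 2 * hops * per_hop_ms
    return propagation + switching + processing_ms

cloud = round_trip_ms(distance_km=1500, per_hop_ms=0.5, hops=12, processing_ms=5)
fog   = round_trip_ms(distance_km=50,   per_hop_ms=0.5, hops=3,  processing_ms=5)
edge  = 1.0  # handled on the device or a local gateway

for name, ms in [("cloud", cloud), ("fog", fog), ("edge", edge)]:
    print(f"{name:>5}: ~{ms:.1f} ms")
```

Even with these rough assumptions, the cloud path lands in the tens of milliseconds while local handling stays near a millisecond, which is the gap that matters for real-time control.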
One solution has been a return to the concept of a more distributed computing model. Rather than compel users to compete for computing resources at a large central data center, distributed computing utilizes smaller servers and storage units, which are likely to be much closer to the consumer. Through this trend, cloud computing may evolve into a fog computing architecture.
The advantages of fog computing architecture include reduced latency, improved security, greater reliability, and the ability to provide location awareness. A network model that locates servers closer to the user may be able to address many cloud drawbacks but could add complexity to the system.
Data can be processed and shared directly between devices in fog computing architectures, eliminating a potentially long roundtrip to the cloud. Simple IoT devices such as light bulbs and door locks that do not require extensive computational support can function more efficiently without causing gridlock at higher core or cloud levels. Fog computing can also get us closer to real-time responses, which will be essential to support autonomous transportation, remote medical procedures, and industrial control applications.
A third variation of distributed processing is edge computing. This layer pushes intelligence, storage, and communications directly to individual devices. Servers, smart sensors, and billions of end-user devices located at or near the edge can communicate immediately, performing simple tasks and basic analysis instantaneously or sending requests to the fog or cloud for more intensive analysis. Edge computing allows the core data center and cloud to operate more efficiently by off-loading simple tasks to edge devices, reducing bandwidth requirements. The core is reserved for computation-intensive applications.
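As a simple illustration of this tiering, the sketch below (hypothetical names and thresholds) shows an edge device resolving routine readings locally, escalating anomalies to a nearby fog node, and reserving the cloud for computation-intensive analysis.

```python
# A minimal sketch of edge/fog/cloud tiering; names and thresholds are
# hypothetical and stand in for whatever logic a real device would apply.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

NORMAL_RANGE = (10.0, 90.0)   # assumed normal operating band for this example

def handle_at_edge(reading: Reading) -> str:
    """Decide locally; only escalate what the device cannot resolve itself."""
    low, high = NORMAL_RANGE
    if low <= reading.value <= high:
        return "handled at edge"        # routine reading, no network traffic
    if reading.value <= high * 1.5:
        return send_to_fog(reading)     # nearby server, low latency
    return send_to_cloud(reading)       # heavy analysis, latency tolerated

def send_to_fog(reading: Reading) -> str:
    # Placeholder for a request to a local fog server (e.g., on the factory floor).
    return f"escalated {reading.sensor_id} to fog"

def send_to_cloud(reading: Reading) -> str:
    # Placeholder for a request to a central cloud data center.
    return f"escalated {reading.sensor_id} to cloud"

print(handle_at_edge(Reading("vibration-7", 42.0)))
print(handle_at_edge(Reading("vibration-7", 120.0)))
```

The design point is simply that most traffic never leaves the edge, so the cloud sees only the small fraction of events that justify the round trip.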
Innovation continues to be applied at the edge. IBM recently announced its Edge Application Manager, which uses artificial intelligence (AI) to manage up to 10,000 edge applications. This service will verify that edge applications are secure and updated to the latest revision. Cloud, edge, and fog computing each represent different opportunities and challenges for connector manufacturers.
In order to support the growing demand for cloud services, central data centers must continue to upgrade their equipment, generating demand for faster servers and switches with increased input/output (I/O) panel density. Demand for copper and fiber optic cables from server to top of rack and from rack to rack will likely grow, as will internal serializer/deserializer (SerDes) to I/O panel links. Mid-board and pluggable optical transceivers as well as leading-edge backplane interfaces should also see a boost. Anticipated data center upgrades to 400GbE and eventually 800GbE will stimulate additional sales of connector-rich equipment in the data center. Broad application of AI will also drive hardware expansion. Given the relatively limited number of large central data centers, these most advanced interconnects may be sold in smaller quantities, but at considerably higher unit prices.
As more users transition to cloud computing infrastructure, there will be less demand for lower to mid-range computers and power conditioning equipment that have traditionally been installed at individual company sites. Many of the connectors utilized in this class of equipment have been commoditized with reduced profitability. Meanwhile, servers that provide fog computing services will number in the millions and represent large growth potential for interconnects that are similar to those commonly used in the retiring onsite computers. We would expect growth in systems that utilize the most current PCIe, Gen-Z, and CXL interconnect standards.
Devices linked at the edge will incorporate relatively low-cost, low-speed interfaces and may not require continuous internet access for some tasks, as they integrate sufficient computing and storage resources to support simple operations. Over time, however, equipment located at the edge may require higher performance interfaces in order to reap the advantages of reduced latency in applications such as autonomous transportation.
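As a rough illustration of that local autonomy, the store-and-forward sketch below (hypothetical names) buffers readings on the device and uploads them only when a connection becomes available.

```python
# A minimal store-and-forward sketch: an edge device keeps working through a
# bounded local queue while offline, then flushes it once connectivity returns.

from collections import deque

class EdgeBuffer:
    def __init__(self, max_items=1000):
        self.queue = deque(maxlen=max_items)   # bounded local storage

    def record(self, reading):
        """Store a reading locally; no internet access is required."""
        self.queue.append(reading)

    def flush(self, upload):
        """When a connection is available, push buffered readings upstream."""
        while self.queue:
            upload(self.queue.popleft())

buffer = EdgeBuffer()
for value in (21.5, 22.0, 22.4):
    buffer.record(value)

# Later, once the device is back online:
buffer.flush(upload=lambda r: print(f"uploaded {r}"))
```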
There has also been some debate about where one of these networking hierarchies ends and another begins. Another “mist” level between the fog and edge computing levels has been proposed and has further blurred delineation points. Others do not recognize fog or mist tiers and combine both into the definition of edge computing. Regardless, there is little doubt that explosive growth of IoT devices, connected vehicles, smart grids, smart cities, smart homes, connected healthcare, and machine-to-machine communications will create opportunities for more gateways, routers, switches, IP video cameras, and sensors — all of which incorporate multiple electronic connectors.
The emerging technologies of artificial intelligence and 5G communication, together with cloud, fog, and edge computing, will be essential to fully exploit the capabilities of the commercial and Industrial Internet of Things. Electronic sensors, especially those used in autonomous transportation, will demand real-time responses, and fog and edge computing will be able to deliver the reduced latency they require. Advanced “smart sensors” will be able to sense, analyze, and respond locally. Off-loading internet traffic to servers closer to the user is the most cost-effective way to increase the efficiency of the network and make it capable of supporting billions of new connected devices. The next likely step will be bringing AI directly to the edge.
The ability to access data from anywhere 24/7 while minimizing the cost of computing infrastructure continues to drive the adoption of cloud services. Bringing those resources closer to the billions of connected IoT devices will enable a universe of new applications that require real-time response. The expanding layers of edge and fog computing equipment will provide excellent growth opportunities for manufacturers of electronic components, including connectors.