Bye bye, lag. AT&T is embracing “edge computing” to solve the latency problem for future 5G applications, like self-driving cars, robotics, and VR/AR.
The Internet of Things, along with its farther-reaching extension known as the Internet of Everything (IoE), is shaping up to become a reality within the next few years.
Cisco estimates that the number of connected devices grew from 500 million in 2003 to 12.5 billion in 2010, and expects it to reach 50 billion by 2020.
The economic stakes of the IoE, estimated at over 19 trillion dollars across the private and public sectors, are pushing telecommunication companies to prepare their network architectures and rethink their strategies.
The Unsustainability of Cloud Computing
While data centers aren't going away, cloud-only computing is showing its limits.
From data flows to bandwidth and latency, traditional data center-based computing won't be able to meet all the demands of the IoE age.
The proliferation of connected electronics and the exponential growth of big data raise new challenges in data storage, processing power, and analysis that current data centers can't meet.
These multitudes of devices put a huge strain on data centers; they can't simply rely on the cloud in the one-to-many model that smartphones use today.
Network architecture therefore needs an overhaul: a move from a cloud-based model to a distributed one, reinforced at the edge of the network, in what's known as "edge computing."
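To see why the one-to-many cloud model buckles at IoE scale, a rough calculation helps. The sketch below is purely illustrative: only the 50-billion device figure comes from Cisco's projection above; the per-device data rate and the share of traffic that edge nodes can filter locally are assumptions.

```python
# Back-of-the-envelope: backhaul traffic with and without edge processing.
# Only the device count is from Cisco's projection; the other figures
# are assumptions chosen for illustration.

DEVICES = 50_000_000_000     # Cisco's projected connected devices by 2020
AVG_RATE_KBPS = 10           # assumed average upstream rate per device
EDGE_REDUCTION = 0.95        # assumed share of data handled at the edge

raw_tbps = DEVICES * AVG_RATE_KBPS * 1e3 / 1e12   # total raw traffic, Tbit/s
to_cloud_tbps = raw_tbps * (1 - EDGE_REDUCTION)   # what still crosses the core

print(f"raw aggregate traffic:  {raw_tbps:,.0f} Tbit/s")
print(f"traffic reaching cloud: {to_cloud_tbps:,.0f} Tbit/s")
```

Even at a modest 10 kbit/s per device, the raw aggregate runs to hundreds of terabits per second; filtering most of it at the edge is what keeps the core network viable.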
The Inevitable Shift Toward Edge Computing
5G will enable the "edge computing" that will serve as the backbone of the IoT as a whole.
In its annual Mobility Report, Ericsson projects that 5G mobile networks will reach 15% of the global population by 2022, connecting everything from smart home appliances to smart city solutions.
AT&T is aware of the brewing IoT storm and its commercial stakes, hence the communication giant’s heavy investment in edge computing and 5G.
Last year, AT&T collaborated with Ericsson and Intel to conduct the industry's first trial of 5G for business use in Austin, Texas. The company conducted a second trial last month, streaming premium live TV over 5G at speeds of up to 1 gigabit per second.
Getting its 5G gear up and running, AT&T is setting the stage for more than just the IoT. A couple of days ago, the company revealed its intention to boost third-party applications of 5G.
In a press release, AT&T laid out its plan to move to "edge computing" that would provide real-time processing power with "single-digit millisecond latency."
Instead of the cloud, data will be processed at the “edge” of the network (like AT&T’s thousands of central offices and cell towers), which would reduce the pressure on network traffic, speed up analysis, and decrease latency.
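Simple physics shows why proximity is the key to that single-digit millisecond figure: a signal in optical fiber travels at roughly 200,000 km/s, so distance alone sets a floor on latency. The distances below are hypothetical examples, not AT&T's actual site locations.

```python
# Why edge sites beat distant clouds on latency: propagation delay alone.
# Assumption: signal speed in fiber is ~200,000 km/s (vacuum speed of
# light divided by fiber's refractive index of ~1.5).

FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds over fiber."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Hypothetical distances: a nearby edge site vs. a far-away cloud region.
for label, km in [("edge site (50 km)", 50), ("cloud region (2,000 km)", 2000)]:
    print(f"{label}: {round_trip_ms(km):.1f} ms round trip")
```

A central office 50 km away keeps the round trip well under a millisecond before any processing happens, while a data center 2,000 km away burns 20 ms on propagation alone, already past the budget of a self-driving car or a VR headset.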
5G-powered edge computing would let robots, self-driving cars, wearables, drones, and other latency-sensitive systems like VR/AR tap into micro data centers at the network's edge. Soon, we'll have a worldwide network of supercomputers on the move.
So, what do you think? Is (or isn't) 5G edge computing truly the next step in our IoE evolution?