The Next Cloud Computing Revolution is Closer Than You Think

The move to a distributed cloud computing infrastructure will remove the constraints of today’s architecture, opening up possibilities for emerging technologies.


In the not-so-distant future, the daily routines of millions of people will evolve—still familiar, yet, under the surface, transformed. All of it will depend on a revolution in cloud computing.

Imagine a new generation of spatial computing products that are unobtrusive (think Apple Vision Pro shrunk down to the form factor of ordinary glasses) and that help you navigate the physical world in a way that connects you to your surroundings rather than isolating you from them. Imagine new AI agents that truly work for you, taking care of mundane tasks so you can focus on creativity and real-world connections. And imagine your car interacting with smart urban infrastructure to streamline your journey and make it safer. The future isn't just coming—we’re racing toward it.

To adopt these technologies at a scale that is anywhere close to that of the smartphone or laptop, there must be a sea change in the underlying cloud computing infrastructure.

The Cloud Computing Evolution Continues

In the early days of the Internet, centralized servers handled all traffic. Scaling a website meant buying more and bigger servers. This first internet epoch ended with the move away from this monolithic model to one using less expensive, distributed servers and software to optimize web traffic. Instead of scaling vertically at a single, central point, we could now scale horizontally in a far more cost-effective way. The web would have collapsed under its own weight without this shift away from centralized servers. Decentralization and content distribution made the web work.

We’re at a similar tipping point, fast approaching a world where centralized computing resources can no longer support the demands of our technology. Many of these changes are well underway. By 2025, edge devices will create more than 90 zettabytes (90 trillion gigabytes) of data. Our current internet infrastructure can support the more than 17 billion connected devices in use today—but will it support nearly twice that number of connected devices in 2030?

In addition, while the uptake of AI has been astounding, we’ve barely scratched the surface. Data shows that generative AI adoption has grown more than twice as fast as smartphone and tablet adoption did. The market is set to reach a staggering $140 billion by 2030—that’s a whole lot of LLMs. Tech giants like Alphabet, Amazon, Meta, and Microsoft have pledged to collectively spend nearly $200 billion this year, primarily on data centers, chips, and other gear to build, train, and deploy generative AI models. But will this mostly centralized infrastructure be able to scale with adoption?

New Applications Need a Cloud Computing Upgrade

We can’t expect a centralized data center model—one designed for the original web and cloud services of the 2000s—to withstand the strain that AI alone will impose, let alone spatial computing devices, smart vehicles, and ordinary data growth. These data centers may sit hundreds or thousands of miles from the end user or device. That distance, combined with heavier workloads, translates into more latency.

Waiting a second or two for a website to load when you’re shopping online is a recipe for customer frustration and cart abandonment, but latency is even more damaging in the real-time interactions described above. From spatial computing to AI-automated interactions and urban traffic management, the key to success is low-latency real-time interactions between devices and people, all at an enormous scale.
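To make the relationship between distance and latency concrete, here is a rough back-of-the-envelope sketch (illustrative only; real-world latency also includes routing hops, queuing, and server processing time). Light in optical fiber travels at roughly two-thirds of its speed in a vacuum, so physical distance alone sets a hard floor on round-trip time:

```python
# Back-of-the-envelope: the minimum round-trip time (RTT) imposed by distance.
# Actual RTT is always higher due to routing, queuing, and processing delays.

SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in a vacuum, km/s
FIBER_FACTOR = 2 / 3            # light in fiber travels ~2/3 as fast

def min_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip time over fiber, in milliseconds."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# A data center 2,000 km away adds ~20 ms before any processing happens;
# a compute point 100 km away adds only ~1 ms.
for d in (100, 500, 2000):
    print(f"{d:>5} km -> at least {min_rtt_ms(d):.1f} ms RTT")
```

The physics is unforgiving: no amount of server optimization can buy back the milliseconds spent in transit, which is why moving compute closer to the user matters for real-time applications.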

What will this new computing infrastructure look like? We have a few options—though I believe there’s one clear winner.

  1. First, consumers could adapt to computing-hungry, bulky devices that have enough power to support themselves instead of connecting to the cloud or edge.

  2. Another option would be to sit around and wait for computing power to get small enough to fit into more streamlined devices.

  3. The third—and what I see as the most realistic—option is to embrace a distributed, decentralized cloud.

Instead of the 20 or so regional cloud data centers in use by most cloud providers today, hundreds (and ultimately thousands) of powerful computing points will be distributed around the world, close to users and their devices. Processing power will dynamically meet the needs of specific applications, with compute infrastructure—CPU and GPU—closely matched to the workload requirements. Workloads can communicate with computing points and back to devices in milliseconds, enabling real-time applications that require high levels of computing intelligence with ultra-low latency.
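As a sketch of how such dynamic matching might work (the names and data here are hypothetical illustrations, not any provider's actual API), a workload placer could simply select the lowest-latency compute point that satisfies the workload's hardware requirements:

```python
# Illustrative sketch of workload placement across distributed compute points.
# All names and latency figures are invented for the example.
from dataclasses import dataclass

@dataclass
class ComputePoint:
    name: str
    latency_ms: float   # measured RTT from the user's device
    has_gpu: bool

def place_workload(points, needs_gpu: bool):
    """Pick the lowest-latency point that meets the hardware requirement."""
    candidates = [p for p in points if p.has_gpu or not needs_gpu]
    return min(candidates, key=lambda p: p.latency_ms, default=None)

points = [
    ComputePoint("regional-dc", latency_ms=45.0, has_gpu=True),
    ComputePoint("metro-edge", latency_ms=8.0, has_gpu=True),
    ComputePoint("curbside-pop", latency_ms=2.0, has_gpu=False),
]

# A GPU inference workload lands on the metro edge site, while a
# lightweight CPU task can run at the nearest point of presence.
print(place_workload(points, needs_gpu=True).name)   # metro-edge
print(place_workload(points, needs_gpu=False).name)  # curbside-pop
```

Real placement systems would weigh many more signals—current load, cost, data locality—but the core idea is the same: with thousands of candidate points instead of twenty, there is almost always one close enough for real-time work.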

The New Norm in Cloud Computing

This revolution in distributed computing is not a case of "build it, and they will come." Strong market forces are driving it.

Consider where global tech companies are investing. Microsoft, at its recent Build conference, focused on new, sophisticated user experiences that will benefit from edge computing. Google Distributed Cloud (GDC) is promoting its “AI anywhere” capability, complementing its Gemini AI models. Oracle’s Roving Edge Infrastructure extends cloud services to the network edge. Apple Intelligence will leverage AI on such a large scale—on-device for some tasks and in a private cloud for more complex computations—that it will require massive amounts of networking to function properly. And AWS is touting edge services to deploy APIs and tools beyond its data centers.

These are signs that we're moving in the right direction. However, the real revolution will bring powerful, purpose-built computing to many more points of presence than is currently envisioned by technology companies that are still invested in a limited distribution model.

A Final Word

Ultimately, highly distributed computing will be so ubiquitous that it will be transparent to users. But without the move to distributed computing infrastructure, the constraints of today’s architecture will soon become apparent—disrupting our digital lives and limiting the possibilities of emerging technology.

The foundational stones for a new internet epoch are already in place and could drive exciting technical advancements, weaving spatial computing, AI agents, and smart urban infrastructure into our daily lives. Now is the time to bring powerful computing to a location near you.

About the Author

Dr. Robert Blumofe, Chief Technology Officer, Akamai Technologies

Dr. Robert Blumofe is the Chief Technology Officer of Akamai Technologies, a cloud computing, cybersecurity, and content delivery company.
