Your Network is the Key to Surviving AI
Organizations that modernize their networking and security infrastructure fast will be in pole position to seize AI opportunities.
January 9, 2025
As the GenAI hype wave gathered momentum in 2024, networking vendors lost no time in declaring their products AI-ready or AI-enabled. Some even crowned themselves leaders in AI networking, though it’s far from clear what that really means.
Behind the AI-washing are three simple facts about the impact of AI on networking:
AI will put new demands on networks, affecting performance, capacity requirements, costs, and operational complexity
Networking vendors will increasingly use AI inside their platforms to improve performance, reliability, and security and to automate network operations
The limitations of existing network architectures were already clear in a world where users and applications can be anywhere. With AI added, legacy architectures will struggle to cope.
The first of these trends will drive the second. Using AI to enhance existing capabilities will be necessary to make networks resilient in the face of capacity, security, and performance challenges and to make them adaptive as enterprises contend with accelerating pressure to innovate and turn on a dime.
The AI revolution coincides with other fundamental changes to the way we think about and use networks, namely:
Cloud/edge-ification – mass migration of applications from traditional data centers to the cloud has put enterprise applications closer to some users and further away from others
As-a-service delivery – the same flexible consumption model that transformed applications, compute, and storage is now reaching networking and security. Infrastructure, networking, and security can now be delivered in a purely hardware-less, gateway-less as-a-service model.
Security – with users, devices, and applications distributed, it no longer makes sense to send traffic back to the data center or even to distant cloud regions (a practice known as cloud hairpinning). Because security has a profound effect on performance, traffic inspection and security controls need to be co-located with network access at the edge
Convergence – it no longer makes sense to think of networking and security as separate entities with different management regimes. Networking and security policies determined by the method of access, the location of the user, or the type and location of a resource (SaaS, data center, Internet, etc.) increase complexity and overhead costs. Divergent policies and access controls also increase security risk.
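To make the convergence point concrete, the sketch below shows a single policy decision applied uniformly, whatever the access method, user location, or resource type. It is a minimal illustration only; the class, field, and rule names are assumptions made for this example, not any vendor's API.

```python
# Hypothetical sketch: one converged networking/security policy path for every
# access request, regardless of where the user connects from or where the
# resource lives. All names and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_posture: str   # e.g. "managed" or "unmanaged"
    location: str         # e.g. "office", "home", "mobile" -- deliberately unused below
    resource: str         # e.g. "saas:crm", "dc:erp", "internet:web"

def evaluate(request: AccessRequest, policy: dict) -> bool:
    """Apply the same converged policy to every access request."""
    rule = policy.get(request.resource, policy["default"])
    # Note: there is no branch on request.location -- the decision logic does
    # not change when the user moves from office to home to mobile.
    return (
        request.user in rule["allowed_users"]
        and request.device_posture in rule["allowed_postures"]
    )

policy = {
    "saas:crm": {"allowed_users": {"alice", "bob"}, "allowed_postures": {"managed"}},
    "default": {"allowed_users": set(), "allowed_postures": set()},
}

print(evaluate(AccessRequest("alice", "managed", "home", "saas:crm")))    # True
print(evaluate(AccessRequest("alice", "unmanaged", "office", "dc:erp")))  # False
```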
AI networking requirements in brief
You can boil down these requirements to just three words: speed, security, and simplification.
These three goals, which were already highly desirable, become critical as AI takes off in earnest in 2025.
The most obvious challenge is the impact of AI on network capacity and performance.
GenAI will consume a lot of data and generate a lot more. Estimates of the impact of AI traffic on networks vary widely, but with network capacity doubling roughly every two years and the optical engines used to drive fiber networks nearing the limits of physics, AI traffic could significantly increase the strain on existing infrastructure in every segment of the network, from data center interconnect to the last mile.
For enterprises, the anticipated tsunami of data traffic created by AI will demand network services that are faster, more reliable, easier to scale, and less complex to manage.
Apart from sheer capacity, GenAI applications could have a major impact on network performance. The network will need to move very large quantities of dynamically generated content in real time. It is already challenging to guarantee performance at the network edge, particularly for remote users and devices connecting via consumer-grade broadband, Wi-Fi, or 4/5G. Unless issues of latency and packet loss in the last mile are overcome, application performance becomes unacceptable; add GenAI, and the same connections could become unusable.
The good news is that many of the performance problems that AI creates will also be solved by AI. It can be used, for example, to select the nearest available edge (or virtual PoP) to provide ultra-low latency connections to users wherever they are, taking into account factors such as time of day and known traffic patterns.
Not only can this cut latency by an order of magnitude and substantially reduce packet loss, but having no physical PoP (and no fixed IP address to attack) also makes dynamic connections inherently more secure. AI can likewise optimize path selection, with both performance and security benefits, and recover lost packets before they have time to affect the user experience.
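As a rough illustration of the kind of selection logic described above, the sketch below ranks candidate virtual PoPs by a predicted quality score combining latency, expected packet loss, and time-of-day load. The scoring formula, names, and weights are assumptions made for this example; in practice a learned model trained on historical traffic patterns would replace the hand-tuned heuristic.

```python
# Hypothetical sketch of edge (virtual PoP) selection: rank candidates by a
# predicted quality score. Lower scores are better. The heuristic below is a
# stand-in for a model trained on historical traffic patterns.
from dataclasses import dataclass

@dataclass
class EdgeCandidate:
    name: str
    base_latency_ms: float   # measured round-trip time to this edge
    predicted_loss: float    # expected packet loss rate (0.0 - 1.0)
    load: float              # current utilization (0.0 - 1.0)

def quality_score(edge: EdgeCandidate, hour_of_day: int) -> float:
    """Combine latency and loss into one score; penalize likely peak-hour congestion."""
    peak_penalty = 1.2 if 9 <= hour_of_day <= 17 and edge.load > 0.7 else 1.0
    return (edge.base_latency_ms * peak_penalty) + (edge.predicted_loss * 500)

def select_edge(candidates: list[EdgeCandidate], hour_of_day: int) -> EdgeCandidate:
    """Pick the candidate with the best (lowest) predicted score."""
    return min(candidates, key=lambda e: quality_score(e, hour_of_day))

edges = [
    EdgeCandidate("pop-east", base_latency_ms=18.0, predicted_loss=0.002, load=0.8),
    EdgeCandidate("pop-west", base_latency_ms=25.0, predicted_loss=0.001, load=0.3),
]
print(select_edge(edges, hour_of_day=11).name)  # picks the lower-scoring PoP
```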
A final word on AI networking requirements
AI deployments pose numerous problems for CIOs, from concerns about return on investment to issues of bias and regulatory compliance. Security risks are also increasing as AI makes it easier to launch attacks, and AI tools make new kinds of attacks possible.
Getting network infrastructure right won't directly address all of these problems, but not getting it right will certainly make things worse. Organizations that modernize their networking and security infrastructure fast will be in pole position to seize AI opportunities while their competitors struggle to cope with AI pressures.