Akamai Technologies has introduced Akamai Inference Cloud, a globally distributed platform engineered to deliver real-time, low-latency, and secure AI inference at the edge. Built on NVIDIA’s Blackwell architecture and powered by Akamai’s expansive global network, the new platform represents a significant step toward deploying artificial intelligence workloads closer to users and devices.
Akamai Inference Cloud is designed to meet the growing demand for high-performance, decentralized AI infrastructure, enabling organizations to execute inference tasks at the edge without sacrificing speed, scalability, or data privacy. The platform integrates NVIDIA RTX PRO servers and NVIDIA BlueField-3 DPUs, combining powerful compute and networking acceleration to handle compute-intensive AI workloads efficiently across 4,200 edge locations worldwide.
This distributed architecture enables ultra-low-latency processing, essential for emerging agentic workloads such as autonomous agents, real-time financial systems, robotics, industrial automation, and intelligent IoT networks. By situating AI inference closer to end users, Akamai reduces the data-transit bottlenecks common in centralized cloud systems, enabling faster decision-making, greater reliability, and higher operational efficiency.
The launch of Akamai Inference Cloud further expands Akamai’s role in the AI infrastructure ecosystem, extending beyond its traditional strengths in content delivery and cybersecurity to support next-generation AI-driven applications. With its unique combination of edge proximity, global reach, and NVIDIA-powered performance, the platform enables developers and enterprises to build and deploy real-time, mission-critical AI systems that demand both scale and immediacy.
Through this new offering, Akamai aims to empower businesses to bring intelligence directly to where data is generated, transforming how AI models are deployed and executed across distributed environments worldwide.
KEY QUOTES:
“The next wave of AI requires the same proximity to users that allowed the internet to scale globally. Powered by NVIDIA AI infrastructure, Akamai Inference Cloud will meet this demand by placing AI decision-making in thousands of locations around the world, enabling faster, smarter, and more secure responses.”
Dr. Tom Leighton, CEO and Co-Founder, Akamai Technologies
“Inference has become the most compute-intensive phase of AI — demanding real-time reasoning at planetary scale. Together, NVIDIA and Akamai are moving inference closer to users everywhere, unlocking the next generation of intelligent applications.”
Jensen Huang, Founder and CEO, NVIDIA