Akamai Announces Akamai Inference Cloud for AI inference at the edge

Nov 25, 2025

Guest:

  • Danielle Cook

Akamai is expanding its AI offering with the launch of Akamai Inference Cloud, a distributed platform that brings AI inference directly to the edge rather than relying on centralized data centers. The announcement matters because, as the agentic web emerges and AI agents become more prevalent, organizations need to serve AI-powered content closer to where users consume it, reducing latency and improving performance. Cook notes that Akamai has contributed a million dollars' worth of funds to the open source community and integrates popular CNCF projects such as KServe, Kubeflow, Prometheus, Istio, and Argo into its App Platform, creating a comprehensive Internal Developer Platform (IDP). What sets Akamai apart is a transparent pricing model with low egress costs, avoiding the pricing traps common with major cloud providers, combined with a trusted global edge network that already delivers content worldwide. The company's vision centers on supporting the shift from centralized AI training to distributed inference, positioning Akamai as a natural choice for organizations that already trust it to deliver their content and now need to deliver AI-powered experiences at scale.

Transcription

Bart: So, who are you, what's your role, and where do you work?

Danielle: I'm Danielle Cook, and I work in the Cloud Technology Group at Akamai.

Bart: What news are you bringing to the audience today?

Danielle: We're here showcasing our App Platform, an open source platform. It helps you build your Internal Developer Platform (IDP) for your internal developers. We're also talking about LKE, our managed Kubernetes service, and we're bringing that all together to offer the Akamai Inference Cloud. If you were at the keynote earlier, you saw them discussing inference and how that's where the AI world is going. We have delivered a cloud to support that.
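
For readers who want to try LKE themselves, here is a minimal sketch of provisioning a cluster with the Linode CLI. The label, region, node plan, and Kubernetes version below are placeholders, not values from the episode; check what is currently supported before running it.

```bash
# Sketch: create a managed Kubernetes (LKE) cluster with the Linode CLI.
# Assumes `linode-cli` is installed and configured with an API token.
# Region, plan type, and k8s version are placeholders -- list valid
# versions with `linode-cli lke versions-list` first.
linode-cli lke cluster-create \
  --label demo-cluster \
  --region us-east \
  --k8s_version 1.31 \
  --node_pools.type g6-standard-2 \
  --node_pools.count 3

# Fetch the kubeconfig for the new cluster (the API returns it
# base64-encoded); substitute the cluster ID from the previous command.
linode-cli lke kubeconfig-view <cluster-id>
```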

Bart: What challenges do these announcements solve for teams adopting AI?

Danielle: When adopting AI and turning to cloud native to run workloads, the challenge is that the training is done; now we need to do inference and make it available near the people who want to consume the content. Akamai Inference Cloud delivers this by allowing inference at the edge.

Bart: How does this announcement change the landscape compared to what existed before?

Danielle: We are seeing a world of centralized data centers. That's how the world has been working. But people want to consume content everywhere, and agents are going to be consuming content from other agents. The agentic web is real. We need to change the model and have distributed data centers that get information out to the edge.

Bart: For the open source community, are we talking about open source projects? If so, where do they fit in the CNCF landscape?

Danielle: Within the CNCF, Akamai supports a number of open source projects. We have contributed a million dollars' worth of funds to the open source community. We also have our App Platform, which takes all these projects and puts them together in one place. This includes projects like KServe and Kubeflow, as well as the projects we all know and love, such as Prometheus, Istio, and Argo.
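
To make the KServe piece concrete, here is the canonical scikit-learn example from the KServe quickstart, deployed as an InferenceService. It assumes KServe is already installed on the cluster and uses KServe's public sample model; it is an illustration of the project Cook mentions, not an Akamai-specific configuration.

```bash
# Sketch: deploy KServe's quickstart sklearn-iris model as an
# InferenceService (assumes KServe and its dependencies are installed).
kubectl apply -f - <<EOF
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://kfserving-examples/models/sklearn/1.0/model
EOF

# Watch until the service reports READY and exposes a URL.
kubectl get inferenceservice sklearn-iris -w
```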

Bart: Can you break down Akamai's business model and pricing structure for teams that are evaluating your solutions?

Danielle: Sure. We have very transparent pricing. You can go to our website and find pricing for the mix of models and regions. We offer low egress costs, unlike many cloud providers, where you can get caught in a pricing trap. We try to make pricing super transparent and low-cost, and you can find all of that information on our website.

Bart: Now, what key advantages set Akamai's offerings apart from similar solutions in the market?

Danielle: Sure, there are loads of cloud companies. We know the big ones, and we also know that new companies are emerging with inference cloud options. We're taking the best of our cloud technology, security portfolio, and edge network and pulling it all together. Because Akamai is a brand you know and trust to deliver your content, it makes sense that you would do inference with us.

Bart: Looking ahead, what developments can our audience anticipate from Akamai?

Danielle: We will continue to support the AI community. We are here to optimize, secure, and enable it. We want to make sure you're running your inference with us.

Danielle: You can find all the information about Akamai Inference Cloud, App Platform, and LKE by visiting akamai.com. You can also connect with me on LinkedIn; I'd be happy to continue these conversations.