Akamai announces high-performance and cost-effective AI inference
Discover how Akamai Technologies is positioning itself in the cloud computing landscape, focusing on Kubernetes and AI applications.
In this interview, Ari Weil, VP of Product Marketing at Akamai Technologies, discusses:
Akamai's platform innovations designed to help developers build on Kubernetes and launch AI inference and agentic AI applications with high throughput and low latency
How Akamai differentiates from competitors by being "the world's most distributed cloud computing platform" with integrated security services and a superior price-to-performance ratio
Their commitment to open source and multi-cloud portability, enabling developers to build applications without vendor lock-in while leveraging partnerships with projects across the CNCF ecosystem
Transcription
Bart: Could you introduce yourself and tell us what you do at Akamai?
Ari: I'm Ari Weil, head of product marketing and developer advocacy at Akamai.
Bart: What would you like to share with us today?
Ari: Akamai is excited to announce innovations on our platform, geared around helping developers build on Kubernetes, using Kubernetes as a stable platform to launch AI inference and agentic AI applications. We're also excited about the conversations we're having with upstream projects across the CNCF. We've given a home to the Linux kernel on the Akamai platform and partnered with exciting projects and startups in the ecosystem, including companies like Fermyon and HarperDB. We're excited to bring the power and potential of this integration with the open source community on a scalable global platform like Akamai's to the community at KubeCon.
Bart: And what problems does the Akamai platform solve?
Ari: Akamai is geared towards any application and use case that requires high throughput, distributed data, and low latency for end-user results. If that's an application that has to process prompts and deliver high-fidelity, individualized, or vertically customized responses, we'll take a small language model or a fine-tuned model, use our infrastructure and global delivery network to power that application and deliver those responses to a user. Additionally, if you have high security requirements like isolation or need to comply with different industry regulations, we also provide powerful technologies built into the platform and service catalog so that you can secure user access, server-to-server communication, and inspect traffic to and from the platform.
Bart: What were developers doing before this announcement?
Ari: Before this product announcement, developers were trying to understand how to leverage the cloud for building GenAI or AI inference use cases while ensuring solid performance—that immediacy of response that users anticipate when engaging with something promising a magical AI experience. Now that they can see how easy it is to scale a Kubernetes deployment on our application platform, use right-sized GPUs, and have the flexibility to deploy compute where they want, we're really taking the time and toil out of deploying AI-powered applications. We think application developers will start bringing their use cases to life, which is one of the things we've announced with the Hackathon, where we're already seeing users building on the platform and submitting their designs. We're excited to see what they'll build next.
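To give a sense of how little configuration the workflow Ari describes can require, here is a minimal, illustrative Kubernetes Deployment that schedules an inference service onto GPU nodes. The image name and Deployment name are hypothetical placeholders, and the `nvidia.com/gpu` resource key assumes the NVIDIA device plugin is installed on the cluster; actual values depend on your registry, model server, and GPU vendor:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: inference-server        # hypothetical name
spec:
  replicas: 2                   # scale horizontally by raising this count
  selector:
    matchLabels:
      app: inference-server
  template:
    metadata:
      labels:
        app: inference-server
    spec:
      containers:
        - name: model
          image: registry.example.com/slm-server:latest  # placeholder image
          ports:
            - containerPort: 8080
          resources:
            limits:
              nvidia.com/gpu: 1  # one right-sized GPU per replica
```

Because this is standard Kubernetes, the same manifest is portable across any conformant cluster, which is the multi-cloud portability the interview returns to later.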
Bart: Is Akamai Cloud, which we're talking about today, part of open source technologies in the CNCF landscape?
Ari: We are talking about open source technologies and upstream projects that we're enabling. We want to enable multi-cloud portability and support the community in a way that builders can see the benefits of their labor. They can distribute these projects in a way that typically you would only see with proprietary services that are hard to access. We want to make these projects approachable for developers and make the speed and scale of using them for business-critical applications much more attainable for the average development team.
Bart: What is Akamai Technologies's business model?
Ari: Akamai supports a pay-as-you-go model across several different services. We have delivery services priced based on the volume delivered on the platform, cloud services priced based on compute resource consumption, and security services priced according to the number of requests and the amount of user activity we secure for a business.
Bart: Who are your main competitors?
Ari: Akamai is increasingly competing with various companies and platforms. The public cloud hyperscalers that are centralized in nature are one of our key competitors in the cloud computing market. Up until now, they have been the only option for enterprises wanting to build enterprise-grade and business-critical applications. However, we're finding they don't have the scale and distribution to meet modern use case requirements and customer demands.
We look at edge clouds like CDNs that have evolved to add edge computing capabilities to their platforms. The challenge is that they don't provide the depth of compute services and the access and flexibility to build what you want using compute primitives. They are also focused on proprietary services versus enabling the ecosystem of open source.
If we look at other competitors in the space, we have companies trying to help facilitate Kubernetes deployments and platformization, as well as AI-based businesses. The ecosystem is as fragmented as the competitive landscape for Akamai. But one thing is consistent: developers are looking for a way to realize the benefits of their applications and see the outcomes of their hard work.
We believe Akamai is providing the only platform to build, secure, and scale applications while not compromising on portability and truly embracing the nature of open source.
Bart: What differentiates Akamai Cloud from the competitors that are out there?
Ari: The biggest differentiation is that we are not trying to lock you into our platform. We are trying to help you build whatever you need to do next as easily and quickly as possible. We're facilitating that by embracing the open source community and getting back to Akamai's roots. We are the world's most distributed cloud computing platform, with integrated services for distribution and security that are really unmatched in the industry. We do all of that with a price-to-performance ratio that you are not going to realize on any other cloud. That's where we think developers and their teams are really getting excited about partnering with Akamai for whatever they build in the future.
Bart: How can people get in touch with you if they have any questions?
Ari: If you have questions or want to see how you can build on the Akamai platform, visit linode.com or cloud.akamai.com and get started today.