Kong AI Gateway: Unleashing the Power of AI in Enterprise Infrastructure

Kong Inc., the cloud connectivity company, has released the Kong AI Gateway, an AI-native API gateway designed to govern and secure generative AI workloads across any cloud environment. The product, which was initially launched in beta in February, has already gained significant traction as organizations seek to operationalize generative AI technology.

The Kong AI Gateway offers a range of infrastructure capabilities built specifically for AI, including support for multiple large language models (LLMs), semantic caching, routing, firewalling, and model lifecycle management. The gateway also integrates with Kong’s existing API platform, allowing enterprises to manage both their traditional APIs and their AI initiatives in one place.
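
As a rough illustration of how this looks in practice, the sketch below shows a declarative gateway configuration that routes chat traffic through an AI proxy plugin to an upstream LLM provider. The plugin name, route path, and configuration fields are illustrative assumptions, not a verbatim excerpt from Kong’s documentation.

```yaml
# Illustrative declarative Kong configuration (e.g. kong.yaml).
# Plugin name and config fields are assumptions for illustration,
# not a verbatim excerpt of Kong's plugin schema.
_format_version: "3.0"

services:
  - name: llm-chat-service
    url: http://localhost:32000        # placeholder upstream; the AI proxy plugin targets the provider
    routes:
      - name: llm-chat-route
        paths:
          - /chat
        plugins:
          - name: ai-proxy             # assumed AI proxy plugin
            config:
              route_type: llm/v1/chat  # assumed OpenAI-style chat-completions shape
              auth:
                header_name: Authorization
                header_value: "Bearer ${OPENAI_API_KEY}"  # provider credential kept at the gateway
              model:
                provider: openai       # could equally be another supported provider
                name: gpt-4o
```

Because the gateway holds the provider credential and the model selection, application code only calls the /chat route; swapping or adding providers becomes a configuration change rather than a code change.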

Marco Palladino, CTO and co-founder of Kong, explained in an exclusive interview with VentureBeat that organizations and developers are building new generative AI use cases to enhance user and customer experiences. However, scaling these use cases in production without the proper infrastructure can be challenging. Palladino believes that the Kong AI Gateway is the most capable AI infrastructure technology currently available due to its deep AI-specific capabilities built on top of Kong’s existing gateway features. He draws parallels to the early days of APIs, noting that APIs only flourished once the right infrastructure was in place.

The Kong AI Gateway serves as a central hub, providing a unified API interface to manage and secure multiple AI technologies across various applications. It offers a comprehensive set of AI-specific capabilities, including governance, observability, and security. This enables enterprises to effectively deploy and scale their generative AI initiatives.

One key differentiator of the Kong AI Gateway is its ability to introspect AI traffic and provide a unified API for consuming one or more AI providers. While other API management platforms treat LLMs as just another API, Kong goes further by offering prompt security, compliance, governance, templating, and lifecycle management specifically for AI prompts. Additionally, the gateway provides “L7 AI observability metrics” that offer visibility into provider performance, token usage, and costs across AI traffic.
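
To make the prompt-level controls more concrete, the sketch below shows how such policies might be expressed as additional plugins on the same route. The plugin names and fields are again assumptions for illustration, not a definitive rendering of Kong’s prompt-governance features.

```yaml
# Illustrative prompt-governance plugins; names and fields are
# assumptions for illustration.
plugins:
  - name: ai-prompt-guard              # assumed: allow/deny rules applied to incoming prompts
    config:
      deny_patterns:
        - ".*(password|api[_-]?key).*" # reject prompts that appear to contain credentials
  - name: ai-prompt-template           # assumed: pre-approved prompt templates with variables
    config:
      templates:
        - name: summarize-ticket
          template: "Summarize the following support ticket in three bullet points: {{ticket_text}}"
```

In this pattern, compliance and templating rules live in gateway configuration where they can be audited and versioned, rather than being scattered across individual applications.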

Kong’s unified control plane, Kong Konnect, allows organizations to monetize their fine-tuned AI models alongside traditional APIs. Palladino believes that the next frontier in API monetization lies in monetizing the models themselves. When models are fine-tuned with an organization’s unique corporate intelligence, there is value in harnessing and monetizing that intelligence.

The launch of the Kong AI Gateway comes at a time when interest in generative AI is skyrocketing following the success of OpenAI’s ChatGPT. However, many enterprises are still grappling with how to effectively deploy and govern this technology. Kong aims to simplify the process with its purpose-built AI gateway.

Palladino revealed that demand for the product was overwhelming, resulting in its accelerated release to general availability. Some enterprise customers have already gone into production with the beta version of the gateway.

Kong, formerly known as Mashape, was founded in 2009 and rebranded as Kong in 2017. The company has raised over $170 million in venture funding and powers trillions of API transactions for more than 500 organizations worldwide. With its focus on AI-native infrastructure, Kong is well-positioned to drive the next wave of generative AI adoption in the enterprise.

Palladino emphasized the importance of deploying the right AI infrastructure to unlock generative AI at scale, stating that the Kong AI Gateway fulfills this need.