Kong Open Sourcing New AI Gateway For Developers To Build Multi-LLM Apps

By Amit Chowdhry ● Feb 19, 2024

Kong – a leading developer of cloud API technologies – announced a suite of open-source AI plugins for Kong Gateway 3.6 that can turn any Kong Gateway deployment into an AI Gateway, with support for integrating multiple large language models (LLMs).

By upgrading to Kong Gateway 3.6 (available now), users gain access to a suite of six new plugins focused on AI and LLM usage. These plugins let developers who want to integrate one or more LLMs into their products be more productive and ship AI capabilities faster, while giving architects and platform teams a secure solution with visibility, control, and compliance on every AI request their teams send.

Because the plugins are tightly integrated with Kong Gateway, AI flows can now be orchestrated easily against cloud-hosted or self-hosted LLMs with the low latency that is critical to the performance of AI-based applications.

Builders want the best AI for their use case: they need to reduce the time it takes to integrate different LLMs, and they want to iterate rapidly and build new capabilities without having to manage the cross-cutting requirements around AI usage themselves.

This new suite of open-source plugins, available to AI builders who upgrade to Kong Gateway 3.6, delivers a range of new capabilities, including:

1.) Multi-LLM Integration – Kong’s “ai-proxy” plugin enables seamless integration of multiple LLM implementations, with native support for industry leaders including OpenAI, Azure AI, Cohere, Anthropic, Mistral, and Llama. A standardized interface allows switching between LLMs without modifying application code, making it easy to use diverse models and prototype rapidly (see the configuration sketch after this list).

2.) Central AI Credential Management – The “ai-proxy” plugin stores AI credentials securely and centrally within Kong Gateway. This removes the need to embed credentials in applications and lets teams rotate and update credentials directly from the gateway.

3.) Layer 7 AI Metrics Collection – Using the “ai-proxy” plugin, users can now capture detailed Layer 7 AI analytics, including metrics such as request and response token counts, along with usage data per LLM provider and model. Integration is supported with third-party platforms like Datadog and New Relic, as well as with existing logging plugins in Kong Gateway such as TCP, Syslog, and Prometheus, enriching observability and offering insights into developer preferences.

4.) No-Code AI Integration – With the “ai-request-transformer” and “ai-response-transformer” plugins, AI capabilities can be injected into API requests and responses without writing a single line of code. This enables on-the-fly transformations, such as translating API responses in real time for internationalization, to enrich and convert API traffic effortlessly.

5.) AI Prompt Decoration – The “ai-prompt-decorator” plugin enables consistent configuration of AI prompt contexts, automatically including rules and instructions with each AI request to enforce organizational compliance and restrict discussions of sensitive topics (a configuration sketch for this and the prompt firewall follows the list).


6.) AI Prompt Firewall – The “ai-prompt-guard” plugin adds a governance layer, establishing rules to authorize or block free-form prompts created by applications. This helps ensure that prompts adhere to approved standards before they are transmitted to LLM providers.


7.) Comprehensive AI Egress with Extensive Features – Integrating these AI capabilities within Kong Gateway centralizes the management, security, and monitoring of AI traffic. AI traffic can also leverage more than 1,000 existing official and community plugins for robust access control, rate limiting, and advanced traffic control rules. This AI Gateway ships from day one with all Kong Gateway features, making it one of the most capable in the AI ecosystem.
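As an illustration of the multi-LLM and credential-management points above, the following is a minimal sketch of enabling the “ai-proxy” plugin on an existing Kong service through the Admin API. The service name, model choice, and exact config fields (route_type, auth, model) are assumptions based on Kong’s 3.6 plugin documentation rather than details from the announcement; verify them against the official ai-proxy reference before relying on them.

```python
import os
import requests

ADMIN_API = "http://localhost:8001"   # default Kong Admin API address
SERVICE = "my-ai-service"             # hypothetical Kong service routing to the LLM

# Enable ai-proxy on the service. Field names below are assumptions drawn
# from the Kong 3.6 ai-proxy docs; check the official plugin reference.
resp = requests.post(
    f"{ADMIN_API}/services/{SERVICE}/plugins",
    json={
        "name": "ai-proxy",
        "config": {
            "route_type": "llm/v1/chat",
            "auth": {
                # The credential lives only in the gateway config,
                # never in application code.
                "header_name": "Authorization",
                "header_value": f"Bearer {os.environ['OPENAI_API_KEY']}",
            },
            "model": {
                # Swapping provider/name here is all it takes to target a
                # different LLM; the application keeps the same interface.
                "provider": "openai",
                "name": "gpt-4",
                "options": {"max_tokens": 512, "temperature": 0.7},
            },
        },
    },
    timeout=10,
)
resp.raise_for_status()
print("ai-proxy plugin created:", resp.json()["id"])
```

Once the plugin is in place, applications send their chat requests to the Kong route instead of the provider’s endpoint, and token metrics for those requests become available to the gateway’s logging plugins.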
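Similarly, a hedged sketch of the prompt-governance plugins: the “prompts.prepend” shape for ai-prompt-decorator and the allow/deny pattern fields for ai-prompt-guard are assumptions recalled from Kong’s plugin docs, not taken from the announcement, and should be confirmed before use.

```python
import requests

ADMIN_API = "http://localhost:8001"
SERVICE = "my-ai-service"  # hypothetical Kong service routing to the LLM

# Prepend a system prompt to every AI request (ai-prompt-decorator).
decorator = requests.post(
    f"{ADMIN_API}/services/{SERVICE}/plugins",
    json={
        "name": "ai-prompt-decorator",
        "config": {
            "prompts": {
                "prepend": [
                    {
                        "role": "system",
                        "content": "Answer only questions about our public API. "
                                   "Never discuss internal project names.",
                    }
                ]
            }
        },
    },
    timeout=10,
)
decorator.raise_for_status()

# Block free-form prompts that match sensitive patterns (ai-prompt-guard).
guard = requests.post(
    f"{ADMIN_API}/services/{SERVICE}/plugins",
    json={
        "name": "ai-prompt-guard",
        "config": {
            "deny_patterns": [".*salary.*", ".*credentials.*"],
        },
    },
    timeout=10,
)
guard.raise_for_status()
```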

Through this release, Kong’s expertise in modern API infrastructure extends to AI-driven use cases. After extensive research and collaboration with select customers and users of Kong Gateway, the newly released plugins address what the company believes are the most prevalent AI use cases.

KEY QUOTES:

“Today marks a significant milestone in our journey towards democratizing AI for developers and enterprises worldwide. By open-sourcing this suite of innovative AI capabilities, including no-code AI plugins, we’re removing the barriers to AI adoption and making it possible for developers to leverage multiple LLMs effortlessly and ship AI-powered applications faster. At the same time, we’re providing governance and visibility to all the AI traffic that is being generated by an organization.”

  • Marco Palladino, Chief Technology Officer and Co-Founder, Kong

 
