Kong is a popular open-source API gateway that also offers an AI Gateway. Kong AI Gateway provides enterprise-grade features for managing LLM API access. You can learn more about it in the Kong AI Gateway documentation.
Comet provides a hosted version of the Opik platform: simply create an account and grab your API key. You can also run the Opik platform locally; see the installation guide for more information.
We have developed a Kong plugin that logs all LLM calls made through your Kong gateway to the Opik platform. The plugin is available to enterprise customers; please contact our support team for access.
Once the plugin is installed, you can enable it by running:
```bash
curl -is -X POST http://localhost:8001/services/{serviceName|Id}/plugins \
  --header "accept: application/json" \
  --header "Content-Type: application/json" \
  --data '
  {
    "name": "opik-log",
    "config": {
      "opik_api_key": "<Replace with your Opik API key>",
      "opik_workspace": "<Replace with your Opik workspace>"
    }
  }'
```
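To confirm the plugin was enabled, you can query the Kong Admin API for the list of installed plugins. This is a minimal sketch that assumes the Admin API is reachable on `localhost:8001`, as in the example above:

```shell
# List all plugins enabled on the gateway; once the plugin is active,
# the response should include an entry with "name": "opik-log".
curl -s http://localhost:8001/plugins
```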
The Opik Kong plugin accepts the following configuration parameters:
- `opik_api_key`: Your Opik API key (required)
- `opik_workspace`: Your Opik workspace name (optional)

Once configured, you will be able to view all your LLM calls in the Opik dashboard:
<Frame> </Frame>

For more information about the Opik Kong plugin, please contact our support team.
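If you later need to change these parameters (for example, to log to a different workspace), the Kong Admin API lets you update a plugin instance in place. A minimal sketch, assuming the Admin API on `localhost:8001`; the `jq` filter and the `<plugin-id>` placeholder are illustrative, with `<plugin-id>` standing for the `id` returned by the first call:

```shell
# Find the id of the opik-log plugin instance (requires jq).
curl -s http://localhost:8001/plugins | jq '.data[] | select(.name == "opik-log") | .id'

# Update only the fields you want to change; other config values are kept.
curl -is -X PATCH http://localhost:8001/plugins/<plugin-id> \
  --header "Content-Type: application/json" \
  --data '{"config": {"opik_workspace": "<Replace with your new workspace>"}}'
```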
If you have suggestions for improving the Kong AI Gateway integration, please let us know by opening an issue on GitHub.