# Cloudera
The Cloudera provider allows you to interact with Cloudera's AI endpoints using the OpenAI protocol. It supports chat completion models hosted on Cloudera's infrastructure.
To use the Cloudera provider, you'll need a Cloudera domain and an authentication token. Set up your environment:

```sh
export CDP_DOMAIN=your-domain-here
export CDP_TOKEN=your-token-here
```
Here's a basic example of how to use the Cloudera provider:

```yaml
providers:
  - id: cloudera:your-model-name
    config:
      domain: your-domain # Optional if CDP_DOMAIN is set
      namespace: serving-default # Optional, defaults to 'serving-default'
      endpoint: your-endpoint # Optional, defaults to model name
```
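The precedence described in the comments above (explicit config first, then environment variables, then defaults) can be sketched as follows. This is a hypothetical illustration, not the provider's actual source; the function name `resolve_config` is invented for the example.

```python
import os

def resolve_config(model_name, config=None, env=None):
    """Sketch of how the provider's settings could resolve:
    explicit config wins, then env vars, then defaults."""
    config = config or {}
    env = env if env is not None else os.environ
    return {
        # domain falls back to the CDP_DOMAIN environment variable
        "domain": config.get("domain") or env.get("CDP_DOMAIN"),
        # namespace defaults to 'serving-default'
        "namespace": config.get("namespace", "serving-default"),
        # endpoint falls back to the model name itself
        "endpoint": config.get("endpoint") or model_name,
    }

# With no explicit config, everything comes from env vars and defaults:
resolved = resolve_config("llama-3-1", env={"CDP_DOMAIN": "example.cloudera.site"})
print(resolved)
# {'domain': 'example.cloudera.site', 'namespace': 'serving-default', 'endpoint': 'llama-3-1'}
```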
The Cloudera provider supports all the standard OpenAI configuration options plus these additional Cloudera-specific options:
| Parameter   | Description                                                                         |
| ----------- | ----------------------------------------------------------------------------------- |
| `domain`    | The Cloudera domain to use. Can also be set via the `CDP_DOMAIN` environment variable. |
| `namespace` | The namespace to use. Defaults to `serving-default`.                                |
| `endpoint`  | The endpoint to use. Defaults to the model name if not specified.                   |
Example with full configuration:
```yaml
providers:
  - id: cloudera:llama-3-1
    config:
      # Cloudera-specific options
      domain: your-domain
      namespace: serving-default
      endpoint: llama-3-1

      # Standard OpenAI options
      temperature: 0.7
      max_tokens: 200
      top_p: 1
      frequency_penalty: 0
      presence_penalty: 0
```
The following environment variables are supported:
| Variable     | Description                                      |
| ------------ | ------------------------------------------------ |
| `CDP_DOMAIN` | The Cloudera domain to use for API requests      |
| `CDP_TOKEN`  | The authentication token for Cloudera API access |
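To make concrete how these two variables come together, here is an illustrative sketch (not the provider's actual source) of assembling a chat completion request against an OpenAI-compatible Cloudera endpoint. The exact URL shape is an assumption made for this example.

```python
import os

def build_request(endpoint, namespace="serving-default", domain=None, token=None):
    """Assemble the URL and headers for a chat completion request.
    Falls back to the CDP_DOMAIN / CDP_TOKEN env vars when not given."""
    domain = domain or os.environ["CDP_DOMAIN"]
    token = token or os.environ["CDP_TOKEN"]
    # Assumed URL layout: domain / namespace / endpoint / OpenAI-style path
    url = (
        f"https://{domain}/namespaces/{namespace}"
        f"/endpoints/{endpoint}/v1/chat/completions"
    )
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return url, headers

url, headers = build_request("llama-3-1", domain="example.cloudera.site", token="sample-token")
print(url)
# https://example.cloudera.site/namespaces/serving-default/endpoints/llama-3-1/v1/chat/completions
```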
The Cloudera provider is built on top of the OpenAI protocol, which means it supports the same message format and most of the same parameters as the OpenAI Chat API.
Example chat conversation:
```yaml
prompts:
  - 'You are a helpful assistant. Answer the following question: {{user_input}}'

providers:
  - id: cloudera:llama-3-1
    config:
      temperature: 0.7
      max_tokens: 200

tests:
  - vars:
      user_input: 'What should I do for a 4 day vacation in Spain?'
```
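Assuming the example above is saved as `promptfooconfig.yaml` in the current directory, you can run the evaluation with the promptfoo CLI:

```sh
npx promptfoo@latest eval
```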
If you encounter issues, check that:

- `CDP_TOKEN` and `CDP_DOMAIN` are correctly set