docs/customize/model-providers/top-level/azure.mdx
<Tab title="YAML">
```yaml title="config.yaml"
models:
  - name: <MODEL_NAME>
    provider: azure
    model: <MODEL_ID>
    apiBase: <YOUR_DEPLOYMENT_BASE>
    apiKey: <YOUR_AZURE_API_KEY> # If you use a subscription key, try using an Azure gateway to rename it to apiKey
    env:
      deployment: <YOUR_DEPLOYMENT_NAME>
      apiType: azure-foundry # Or "azure-openai" if using OpenAI models
      apiVersion: 2023-07-01-preview # Azure API version
```
</Tab>
<Tab title="JSON (Deprecated)">
```json title="config.json"
{
"models": [{
"title": "<MODEL_NAME>",
"provider": "azure",
"model": "<MODEL_ID>",
"apiBase": "<YOUR_DEPLOYMENT_BASE>",
"deployment": "<YOUR_DEPLOYMENT_NAME>",
    "apiKey": "<YOUR_AZURE_API_KEY>", // If you use a subscription key, try using an Azure gateway to rename it to apiKey
    "apiType": "azure-foundry" // Or "azure-openai" if using OpenAI models
}]
}
```
</Tab>

Azure OpenAI Service requires a handful of additional parameters to be configured, such as a deployment name and an API base URL.

To find this information in Azure AI Foundry, first select the model that you would like to connect. Then visit Endpoint > Target URI.

For example, a Target URI of `https://just-an-example.openai.azure.com/openai/deployments/gpt-4o-july/chat/completions?api-version=2023-03-15-preview` would map to the following:

```yaml title="config.yaml"
models:
  - name: <MODEL_NAME>
    model: <MODEL_ID>
    provider: azure
    apiBase: https://just-an-example.openai.azure.com
    apiKey: <YOUR_AZURE_API_KEY>
    env:
      apiVersion: 2023-03-15-preview
      deployment: gpt-4o-july
      apiType: azure-openai
```
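The Target URI-to-config mapping above is mechanical, so it can be sketched in code. The following Python snippet (the helper name is hypothetical, not part of Continue or the Azure SDK) shows how the `apiBase`, `deployment`, and `apiVersion` fields fall out of a Target URI:

```python
from urllib.parse import urlparse, parse_qs

def split_target_uri(target_uri: str) -> dict:
    """Split an Azure OpenAI Target URI into apiBase, deployment, and apiVersion."""
    parsed = urlparse(target_uri)
    # The path looks like /openai/deployments/<deployment>/chat/completions
    parts = parsed.path.strip("/").split("/")
    deployment = parts[parts.index("deployments") + 1]
    # The api-version query parameter becomes env.apiVersion
    api_version = parse_qs(parsed.query)["api-version"][0]
    return {
        "apiBase": f"{parsed.scheme}://{parsed.netloc}",
        "deployment": deployment,
        "apiVersion": api_version,
    }

fields = split_target_uri(
    "https://just-an-example.openai.azure.com/openai/deployments/gpt-4o-july/chat/completions?api-version=2023-03-15-preview"
)
print(fields)
```

Running this on the example Target URI yields the same `apiBase`, `deployment`, and `apiVersion` values shown in the YAML block above.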