The `@quail-ai/azure-ai-provider` is an unofficial community provider that integrates Azure-hosted language models which use Azure's native APIs rather than the OpenAI-compatible API format. It works with any model in Azure AI Foundry that is compatible with the Azure REST AI-inference API.
Install the provider via npm:

```bash
npm i @quail-ai/azure-ai-provider
```
Create an Azure AI resource and obtain your endpoint URL and API key, then add them to your `.env` file:

```bash
AZURE_API_ENDPOINT=https://<your-resource>.services.ai.azure.com/models
AZURE_API_KEY=<your-api-key>
```
Import `createAzure` from the package to create your provider instance:

```ts
import { createAzure } from '@quail-ai/azure-ai-provider';

const azure = createAzure({
  endpoint: process.env.AZURE_API_ENDPOINT,
  apiKey: process.env.AZURE_API_KEY,
});
```
Generate text using the Azure custom provider:

```ts
import { generateText } from 'ai';

const { text } = await generateText({
  model: azure('your-deployment-name'),
  prompt: 'Write a story about a robot.',
});
```
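Since streaming chat completions are supported, a streaming variant can be sketched with the AI SDK's `streamText` helper. This is a minimal sketch; the deployment name is a placeholder, and `azure` is the provider instance created above:

```ts
import { streamText } from 'ai';

// Stream the response incrementally instead of waiting for the full text.
const { textStream } = streamText({
  model: azure('your-deployment-name'), // placeholder deployment name
  prompt: 'Write a story about a robot.',
});

// Print each text chunk as it arrives.
for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```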
- ✅ **Chat Completions**: working, with both streaming and non-streaming responses
- ⚠️ **Tool Calling**: functionality is highly dependent on the chosen model
- ⚠️ **Embeddings**: implementation present but untested
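The embeddings path is marked untested. If the provider follows the standard AI SDK provider convention, usage would look roughly like the sketch below; `azure.textEmbeddingModel` and the embedding deployment name are assumptions, so verify them against the package before relying on this:

```ts
import { embed } from 'ai';

// Assumes the provider exposes the standard textEmbeddingModel() factory;
// this path is untested per the provider's own feature notes.
const { embedding } = await embed({
  model: azure.textEmbeddingModel('your-embedding-deployment'), // assumed API
  value: 'sunny day at the beach',
});

// embedding is a numeric vector; its length is the model's dimensionality.
console.log(embedding.length);
```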