Client Constructor

<ParamField path="provider" type="string" required> This configures which provider to use. The provider is responsible for making the actual API calls to the LLM service, and it determines the URL and shape of the request the BAML runtime sends. The provider is a required field.

| Provider Name | Docs | Notes |
| --- | --- | --- |
| `anthropic` | Anthropic | Supports the `/v1/messages` endpoint |
| `aws-bedrock` | AWS Bedrock | Supports the Converse and ConverseStream endpoints |
| `google-ai` | Google AI | Supports Google AI's `generateContent` and `streamGenerateContent` endpoints |
| `vertex-ai` | Vertex AI | Supports Vertex's `generateContent` and `streamGenerateContent` endpoints |
| `openai` | OpenAI | Supports the `/chat/completions` endpoint |
| `openai-responses` | OpenAI Responses API | Supports OpenAI's most advanced `/responses` endpoint |
| `azure-openai` | Azure OpenAI | Supports Azure's `/chat/completions` endpoint |
| `openai-generic` | OpenAI (generic) | Any other provider that supports OpenAI's `/chat/completions` endpoint |
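For example, a minimal client definition using the `openai` provider might look like the following sketch (the client name, model, and environment variable are illustrative placeholders):

```baml
// Hypothetical client; the model name and env var are placeholders.
client<llm> MyClient {
  provider openai
  options {
    model "gpt-4o"
    api_key env.OPENAI_API_KEY
  }
}
```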

A non-exhaustive list of inference providers you can use with `openai-generic`:

- Azure AI Foundry
- Groq
- Hugging Face
- Keywords AI
- Litellm
- LM Studio
- Ollama
- OpenRouter
- Vercel AI Gateway
- TogetherAI
- Unify AI
- vLLM
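As one sketch of `openai-generic` usage, a client can point at any locally running OpenAI-compatible server by setting `base_url` (the URL below assumes Ollama's default port, and the client and model names are illustrative):

```baml
// Hypothetical client targeting a local OpenAI-compatible endpoint.
client<llm> LocalClient {
  provider openai-generic
  options {
    base_url "http://localhost:11434/v1"
    model "llama3"
  }
}
```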

We also have some special providers that allow composing clients together:

| Provider Name | Docs | Notes |
| --- | --- | --- |
| `fallback` | Fallback | Used to chain models conditional on failures |
| `round-robin` | Round Robin | Used to load balance |
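As a sketch, a `fallback` client composes previously defined clients and tries them in order when one fails (the client names below are placeholders for clients you have already defined elsewhere):

```baml
// Hypothetical composite client; PrimaryClient and BackupClient
// are assumed to be defined elsewhere in your BAML files.
client<llm> ResilientClient {
  provider fallback
  options {
    strategy [PrimaryClient, BackupClient]
  }
}
```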
</ParamField> <ParamField path="options" type="dict[str, Any]" required> These options vary per provider; see the provider-specific documentation for details. Generally, they are passed through to the POST request made to the LLM. </ParamField>
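To illustrate the pass-through behavior, extra keys in `options` generally end up in the request body; for instance, with the `openai` provider a `temperature` key would be forwarded in the `/chat/completions` POST payload (the client name and values below are illustrative):

```baml
// Hypothetical client; temperature is passed through to the request body.
client<llm> TunedClient {
  provider openai
  options {
    model "gpt-4o"
    api_key env.OPENAI_API_KEY
    temperature 0.2
  }
}
```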