If you need to change the model or parameters of an LLM client at runtime, you can pass a `ClientRegistry` to any BAML function.
<Tab title="Python" language="python">
```python
import os

from baml_py import ClientRegistry
from baml_client import b

async def run():
    cr = ClientRegistry()

    # Creates a new client
    cr.add_llm_client(name='MyAmazingClient', provider='openai', options={
        "model": "gpt-5-mini",
        "temperature": 0.7,
        "api_key": os.environ.get('OPENAI_API_KEY')
    })

    # Creates a client using the OpenAI Responses API
    cr.add_llm_client(name='MyResponsesClient', provider='openai-responses', options={
        "model": "gpt-4.1",
        "api_key": os.environ.get('OPENAI_API_KEY')
    })

    # Sets MyAmazingClient as the primary client
    cr.set_primary('MyAmazingClient')

    # ExtractResume will now use MyAmazingClient as the calling client
    res = await b.ExtractResume("...", baml_options={"client_registry": cr})
```
</Tab>
<Tab title="TypeScript" language="typescript">
```typescript
import { ClientRegistry } from '@boundaryml/baml'
import { b } from './baml_client'

async function run() {
  const cr = new ClientRegistry()

  // Creates a new client
  cr.addLlmClient('MyAmazingClient', 'openai', {
    model: "gpt-5-mini",
    temperature: 0.7,
    api_key: process.env.OPENAI_API_KEY
  })

  // Creates a client using the OpenAI Responses API
  cr.addLlmClient('MyResponsesClient', 'openai-responses', {
    model: "gpt-4.1",
    api_key: process.env.OPENAI_API_KEY
  })

  // Sets MyAmazingClient as the primary client
  cr.setPrimary('MyAmazingClient')

  // ExtractResume will now use MyAmazingClient as the calling client
  const res = await b.ExtractResume("...", { clientRegistry: cr })
}
```
</Tab>
<Tab title="Ruby" language="ruby">
```ruby
require_relative "baml_client/client"

def run
  cr = Baml::ClientRegistry.new

  # Creates a new client
  cr.add_llm_client(
    'MyAmazingClient',
    'openai',
    {
      model: 'gpt-5-mini',
      temperature: 0.7,
      api_key: ENV['OPENAI_API_KEY']
    }
  )

  # Creates a client using the OpenAI Responses API
  cr.add_llm_client(
    'MyResponsesClient',
    'openai-responses',
    {
      model: 'gpt-4.1',
      api_key: ENV['OPENAI_API_KEY']
    }
  )

  # Sets MyAmazingClient as the primary client
  cr.set_primary('MyAmazingClient')

  # ExtractResume will now use MyAmazingClient as the calling client
  res = Baml.Client.extract_resume(input: '...', baml_options: { client_registry: cr })
end

run
```
</Tab>
<Tab title="Go" language="go">
```go
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/boundaryml/baml"
)

func main() {
	ctx := context.Background()

	// Create a client registry
	cr := baml.NewClientRegistry()

	// Creates a new client
	err := cr.AddLLMClient("MyAmazingClient", "openai", map[string]interface{}{
		"model":       "gpt-5-mini",
		"temperature": 0.7,
		"api_key":     os.Getenv("OPENAI_API_KEY"),
	})
	if err != nil {
		panic(fmt.Sprintf("Failed to add client: %v", err))
	}

	// Creates a client using the OpenAI Responses API
	err = cr.AddLLMClient("MyResponsesClient", "openai-responses", map[string]interface{}{
		"model":   "gpt-4.1",
		"api_key": os.Getenv("OPENAI_API_KEY"),
	})
	if err != nil {
		panic(fmt.Sprintf("Failed to add responses client: %v", err))
	}

	// Sets MyAmazingClient as the primary client
	cr.SetPrimary("MyAmazingClient")

	// ExtractResume will now use MyAmazingClient as the calling client
	res, err := baml.ExtractResume(ctx, "...", baml.WithClientRegistry(cr))
	if err != nil {
		panic(fmt.Sprintf("Failed to extract resume: %v", err))
	}
	fmt.Printf("Result: %+v\n", res)
}
```
</Tab>
<Tab title="Rust" language="rust">
```rust
use baml::ClientRegistry;
use myproject::baml_client::sync_client::B;
use std::collections::HashMap;

fn main() {
    let mut registry = ClientRegistry::new();

    // Creates a new client
    let mut options = HashMap::new();
    options.insert("model".to_string(), serde_json::json!("gpt-5-mini"));
    options.insert("temperature".to_string(), serde_json::json!(0.7));
    registry.add_llm_client("MyAmazingClient", "openai", options);

    // Sets MyAmazingClient as the primary client
    registry.set_primary_client("MyAmazingClient");

    // ExtractResume will now use MyAmazingClient as the calling client
    let res = B.ExtractResume
        .with_client_registry(&registry)
        .call("...")
        .unwrap();
    println!("Result: {:?}", res);
}
```
</Tab>
The API supports passing a client registry as the `client_registry` field of `__baml_options__` in the request body.

Example request body:

```json
{
  "resume": "Vaibhav Gupta",
  "__baml_options__": {
    "client_registry": {
      "clients": [
        {
          "name": "OpenAI",
          "provider": "openai",
          "retry_policy": null,
          "options": {
            "model": "gpt-5-mini",
            "api_key": "sk-..."
          }
        },
        {
          "name": "OpenAIResponses",
          "provider": "openai-responses",
          "retry_policy": null,
          "options": {
            "model": "gpt-4.1",
            "api_key": "sk-..."
          }
        }
      ],
      "primary": "OpenAI"
    }
  }
}
```
```bash
curl -X POST http://localhost:2024/call/ExtractResume \
  -H 'Content-Type: application/json' -d @body.json
```
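You can also build the same request body in plain Python and write it out as `body.json` for the `curl` call above. This is a minimal sketch: the field names mirror the example body, and the API key is a placeholder.

```python
import json

# Request body for POST /call/ExtractResume, mirroring the example above.
body = {
    "resume": "Vaibhav Gupta",
    "__baml_options__": {
        "client_registry": {
            "clients": [
                {
                    "name": "OpenAI",
                    "provider": "openai",
                    "retry_policy": None,  # serialized as JSON null
                    "options": {"model": "gpt-5-mini", "api_key": "sk-..."},
                }
            ],
            "primary": "OpenAI",
        }
    },
}

# Write it out for: curl ... -d @body.json
with open("body.json", "w") as f:
    json.dump(body, f, indent=2)
```

The same structure works from any language that can emit JSON; only the `clients` array and `primary` field are required by the shape shown above.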
## The `set_primary` Method

The `set_primary` method can be called with either one of the clients added to the `ClientRegistry` at runtime using `add_llm_client`, or a client defined in your BAML files.
`set_primary` itself never adds fallback behavior. You can, however, define a fallback client in BAML and then use the `ClientRegistry` to make that client the calling client for a BAML function. Here's a simple example:
```baml
function ExtractResume(input: string) -> Resume {
  client "openai/gpt-5-mini" // Uses GPT-5 Mini by default
  prompt #"
    Extract from this content: {{ input }}

    {{ ctx.output_format }}
  "#
}

// This client uses GPT-5 first and, if it fails, uses Opus 4.1 as a fallback.
client<llm> GptOpusFallback {
  provider fallback
  options {
    strategy ["openai/gpt-5", "anthropic/claude-opus-4-1-20250805"]
  }
}
```
Then, in your code, you can use the `ClientRegistry` to select `GptOpusFallback` at runtime. The function will try GPT-5 first and fall back to Opus 4.1 if it fails:
<Tabs>
<Tab title="Rust" language="rust">
```rust
let mut registry = ClientRegistry::new();
registry.set_primary_client("GptOpusFallback");

let res = B.ExtractResume
    .with_client_registry(&registry)
    .call("...")
    .unwrap();
```
</Tab>
</Tabs>
Now the calling client will be `GptOpusFallback`, **nothing else**: the fallback between GPT-5 and Opus 4.1 happens inside that client's own `strategy`, not in the `ClientRegistry`.
## ClientRegistry Interface
<Tip>
Note: `ClientRegistry` is imported from `baml_py` in Python and `@boundaryml/baml` in TypeScript, not `baml_client`.
As we mature `ClientRegistry`, we will add a more type-safe and ergonomic interface directly in `baml_client`. See [Github issue #766](https://github.com/BoundaryML/baml/issues/766).
</Tip>
Methods use `snake_case` in Python and `camelCase` in TypeScript.
### add_llm_client / addLlmClient
A function to add an LLM client to the registry.
<ParamField
path="name"
type="string"
required
>
The name of the client.
<Warning>
If you use the exact same name as a client defined in your .baml files, the runtime client overwrites the existing one whenever the `ClientRegistry` is used.
</Warning>
</ParamField>
<Markdown src="/snippets/client-constructor.mdx" />
<ParamField path="retry_policy" type="string">
The name of a retry policy that is already defined in a .baml file. See [Retry Policies](/ref/llm-client-strategies/retry-policy).
</ParamField>
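For example, a retry policy defined in a .baml file can be referenced here by name. This is a sketch: `MyRetryPolicy` is a hypothetical name, and the full syntax is in the Retry Policies reference linked above.

```baml
retry_policy MyRetryPolicy {
  max_retries 3
}
```

You would then pass `retry_policy="MyRetryPolicy"` alongside `name`, `provider`, and `options` when calling `add_llm_client`.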
### set_primary / setPrimary
This sets the client for the function to use (i.e., it replaces the `client` property of the function).
<Warning>
The name "primary" does not imply that there will be a "secondary" or fallback
client used anywhere. It simply means that you choose one client from all the
available clients in your `ClientRegistry`.
See [The `set_primary` Method](#the-set_primary-method) section for more details.
</Warning>
<ParamField
path="name"
type="string"
required
>
The name of the client to use.
This can be a new client that was added with `add_llm_client` or an existing client that is already in a .baml file.
</ParamField>
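For instance, given a client defined in your BAML files (a sketch; `MyBamlClient` is a hypothetical name):

```baml
client<llm> MyBamlClient {
  provider openai
  options {
    model "gpt-4.1"
    api_key env.OPENAI_API_KEY
  }
}
```

Calling `set_primary` with `'MyBamlClient'` routes the function to this client without any prior `add_llm_client` call.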