When BAML raises an exception, it is an instance of a subclass of `BamlError`. This allows you to catch all BAML-specific exceptions with a single `except` block.
```python Python
from baml_client import b
from baml_py import BamlError, BamlValidationError

try:
    b.CallFunctionThatRaisesError()
except BamlError as e:
    print(e)

# Catch a specific subclass to access its fields:
try:
    b.CallFunctionThatRaisesError()
except BamlValidationError as e:
    print(e.prompt)
    print(e.raw_output)
    print(e.message)
    print(e.detailed_message)
```
```typescript TypeScript
import { b } from './baml_client'
// For catching parsing errors and cancellation errors, you can import these
import { BamlValidationError, BamlClientFinishReasonError, BamlAbortError } from '@boundaryml/baml'
// The rest of the BAML errors contain a string that is prefixed with:
// "BamlError:"
// Subclasses are sequentially appended to the string.
// For example, BamlInvalidArgumentError is returned as:
// "BamlError: BamlInvalidArgumentError:"
// Or, BamlClientHttpError is returned as:
// "BamlError: BamlClientError: BamlClientHttpError:"
async function example() {
  try {
    await b.CallFunctionThatRaisesError()
  } catch (e) {
    if (e instanceof BamlAbortError) {
      // Handle cancellation
      console.log('Operation was cancelled:', e.message)
      console.log('Cancellation reason:', e.reason)
    } else if (e instanceof BamlValidationError || e instanceof BamlClientFinishReasonError) {
      // You should be lenient to these fields missing.
      // The original prompt sent to the LLM
      console.log(e.prompt)
      // The LLM response string
      console.log(e.raw_output)
      // A human-readable error message
      console.log(e.message)
      // Complete error history (includes fallback attempts)
      console.log(e.detailed_message)
    } else {
      // Handle other BAML errors
      console.log(e)
    }
  }
}
```
```go Go
// Error handling support coming soon for Go.
// Currently, Go functions return standard (non-typed) Go errors.
```

```ruby Ruby
# Example coming soon
```
```rust Rust
use myproject::baml_client::sync_client::B;

fn main() {
    match B.CallFunctionThatRaisesError.call() {
        Ok(result) => println!("Result: {:?}", result),
        Err(e) => {
            // All BAML errors implement Display and Debug
            eprintln!("Error: {}", e);
            // Use Debug format for detailed error info
            eprintln!("Details: {:?}", e);
        }
    }
}
```
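The string-prefix convention described in the TypeScript comments above can be handled generically: each subclass name is appended to the message, so the class hierarchy is recoverable by parsing the prefix. A minimal sketch with a hypothetical helper (`parseBamlErrorChain` is not part of BAML; it only relies on the `"BamlError: ..."` convention documented here):

```typescript
// Hypothetical helper: recover the error-class chain from a message like
// "BamlError: BamlClientError: BamlClientHttpError: request failed".
function parseBamlErrorChain(message: string): string[] {
  const chain: string[] = []
  for (const part of message.split(':').map((s) => s.trim())) {
    if (part.startsWith('Baml') && part.endsWith('Error')) {
      chain.push(part)
    } else {
      break // first non-class segment is the actual error text
    }
  }
  return chain
}
```

For example, `parseBamlErrorChain('BamlError: BamlClientError: BamlClientHttpError: request failed')` yields `['BamlError', 'BamlClientError', 'BamlClientHttpError']`, which you can match against from most to least specific.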
### BamlError

Base class for all BAML exceptions.

<ParamField path="message" type="string">
  A human-readable error message.
</ParamField>
### BamlInvalidArgumentError

Subclass of `BamlError`.

Raised when one or more arguments to a function are invalid.
### BamlClientError

Subclass of `BamlError`.

Raised when a client fails to return a valid response.

<Warning>
  In the case of aggregate clients like `fallback` or those with `retry_policy`, only the last client's error **type** is raised. However, the complete history of all failed attempts is preserved in the `detailed_message` field, allowing you to debug the entire fallback chain.
</Warning>

### BamlClientHttpError

Subclass of `BamlClientError`.
Raised when the HTTP request made by a client fails with a non-200 status code.

<ParamField path="status_code" type="int">
  The status code of the response. Common status codes include `400` (Bad Request), `401` (Unauthorized), `403` (Forbidden), `429` (Too Many Requests), and `500` (Internal Server Error).
</ParamField>
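In TypeScript, `BamlClientHttpError` has no dedicated importable class (see the import note above), so a retry policy has to inspect the prefixed message string rather than a `status_code` field. A sketch with a hypothetical helper; the exact message text after the class prefix is an assumption, so treat the status-code match as best-effort:

```typescript
// Hypothetical helper: decide whether an error looks like a transient
// HTTP failure, based on the "BamlError: BamlClientError: BamlClientHttpError:"
// prefix convention and a status code appearing in the message.
function isRetriableHttpError(e: unknown): boolean {
  if (!(e instanceof Error)) return false
  if (!e.message.startsWith('BamlError: BamlClientError: BamlClientHttpError:')) return false
  // Retry on rate limits and transient server errors.
  return /\b(429|500|502|503)\b/.test(e.message)
}
```

In Python you would instead catch `BamlClientHttpError` directly and branch on `e.status_code`.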
### BamlClientFinishReasonError

Subclass of `BamlClientError`.

Raised when the finish reason of the LLM response is not allowed.

<ParamField path="finish_reason" type="string">
  The finish reason of the LLM response.
</ParamField>

<ParamField path="message" type="string">
  An error message.
</ParamField>

<ParamField path="prompt" type="string">
  The original prompt that was sent to the LLM, formatted as a plain string. Images sent as base64-encoded strings are not serialized into this field.
</ParamField>

<ParamField path="raw_output" type="string">
  The raw text from the LLM that failed to parse into the expected return type of a function.
</ParamField>

<ParamField path="detailed_message" type="string">
  Comprehensive error information that includes the complete history of all failed attempts when using fallback clients or retry policies. When multiple attempts are made, this field contains formatted details about each failed attempt, making it invaluable for debugging complex client configurations.
</ParamField>
### BamlValidationError

Subclass of `BamlError`.

Raised when BAML fails to parse a string from the LLM into the specified object.

<ParamField path="raw_output" type="string">
  The raw text from the LLM that failed to parse into the expected return type of a function.
</ParamField>

<ParamField path="message" type="string">
  The parsing-related error message.
</ParamField>

<ParamField path="prompt" type="string">
  The original prompt that was sent to the LLM, formatted as a plain string. Images sent as base64-encoded strings are not serialized into this field.
</ParamField>

<ParamField path="detailed_message" type="string">
  Comprehensive error information that includes the complete history of all failed attempts when using fallback clients or retry policies. When multiple attempts are made, this field contains formatted details about each failed attempt, making it invaluable for debugging complex client configurations.
</ParamField>
### BamlAbortError

Subclass of `BamlError`.

Raised when a BAML operation is cancelled via an abort controller.

<ParamField path="message" type="string">
  A message describing why the operation was aborted.
</ParamField>

<ParamField path="reason" type="any">
  Optional additional context about the cancellation. This can be any value provided when calling the `abort()` method.
</ParamField>
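Because `reason` can be any value, you can pass structured context through `abort()`. A minimal sketch using the standard `AbortController`; that BAML copies `signal.reason` into `BamlAbortError.reason` is assumed to follow the field description above:

```typescript
const controller = new AbortController()

// Any value works as the abort reason — here, a structured object.
controller.abort({ cause: 'timeout', afterMs: 5000 })

// The reason is available on the signal, and surfaces on the
// BamlAbortError raised by a cancelled BAML call.
const reason = controller.signal.reason as { cause: string; afterMs: number }
console.log(reason.cause) // 'timeout'
```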
When operations are cancelled via abort controllers, specific errors are thrown:
<CodeGroup>
```python Python
import asyncio

from baml_client import b
from baml_py import AbortController, BamlAbortError, BamlValidationError

async def example():
    controller = AbortController()

    # Cancel after 5 seconds
    async def cancel_after_timeout():
        await asyncio.sleep(5)
        controller.abort('timeout')

    asyncio.create_task(cancel_after_timeout())

    try:
        result = await b.ExtractData(
            input_text,
            baml_options={"abort_controller": controller}
        )
    except BamlAbortError as e:
        if e.reason == 'timeout':
            print("Operation timed out after 5 seconds")
        else:
            print(f"Operation was cancelled: {e.message}")
    except BamlValidationError as e:
        print(f"Validation failed: {e.message}")
```
```typescript TypeScript
import { b } from './baml_client'
import { BamlAbortError, BamlValidationError } from '@boundaryml/baml'

async function example() {
  const controller = new AbortController()

  // Cancel after 5 seconds
  setTimeout(() => controller.abort('timeout'), 5000)

  try {
    const result = await b.ExtractData(inputText, {
      abortController: controller
    })
  } catch (e) {
    if (e instanceof BamlAbortError) {
      if (e.reason === 'timeout') {
        console.log('Operation timed out after 5 seconds')
      } else {
        console.log(`Operation was cancelled: ${e.message}`)
      }
    } else if (e instanceof BamlValidationError) {
      console.log(`Validation failed: ${e.message}`)
    }
  }
}
```
```go Go
import (
    "context"
    "errors"
    "fmt"
    "time"
)

func example() {
    // Create a context with a 5-second timeout
    ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    defer cancel()

    result, err := b.ExtractData(ctx, inputText)
    if err != nil {
        if errors.Is(err, context.DeadlineExceeded) {
            fmt.Println("Operation timed out after 5 seconds")
        } else if errors.Is(err, context.Canceled) {
            fmt.Println("Operation was cancelled")
        } else {
            // Handle other errors
            fmt.Printf("Error: %v\n", err)
        }
    }
}
```
```ruby Ruby
begin
  controller = Baml::AbortController.new

  # Cancel after 5 seconds in another thread
  Thread.new do
    sleep(5)
    controller.abort('timeout')
  end

  result = b.extract_data(
    input_text,
    baml_options: { abort_controller: controller }
  )
rescue Baml::AbortError => e
  if e.reason == 'timeout'
    puts "Operation timed out after 5 seconds"
  else
    puts "Operation was cancelled: #{e.message}"
  end
rescue Baml::ValidationError => e
  puts "Validation failed: #{e.message}"
end
```
```rust Rust
use baml::CancellationToken;
use myproject::baml_client::sync_client::B;
use std::time::Duration;

fn main() {
    // Cancel after 5 seconds
    let token = CancellationToken::new_with_timeout(Duration::from_secs(5));

    let result = B.ExtractData
        .with_cancellation_token(Some(token))
        .call(input_text);

    match result {
        Ok(data) => println!("Result: {:?}", data),
        Err(e) => {
            let error_str = format!("{:?}", e).to_lowercase();
            if error_str.contains("cancel") || error_str.contains("timeout") {
                println!("Operation timed out after 5 seconds");
            } else {
                eprintln!("Error: {}", e);
            }
        }
    }
}
```
</CodeGroup>
For more information on using abort controllers, see the Abort Controllers guide.
Our parser is very forgiving, allowing for structured data parsing even in the presence of minor errors and thought tokens in the LLM response. However, certain types of errors are too ambiguous to handle without the help of an LLM.
In cases where your LLM is having trouble producing valid data from the output schema, you can use this 'fixup' recipe to get valid data:
Say you have a function `Foo` and it returns `MyClass`:

```baml BAML
function FixupFoo(errorMessage: string) -> MyClass {
  client GPT4o
  prompt #"
    Fix this malformed JSON. Preserve the same information.

    {{ ctx.output_format }}

    Original data and parse error:
    {{ errorMessage }}
  "#
}
```
```typescript TypeScript
import { b } from './baml_client'
import { BamlValidationError } from '@boundaryml/baml'

async function example() {
  try {
    const result = await b.Foo(myData)
  } catch (e) {
    if (e instanceof BamlValidationError) {
      const result = await b.FixupFoo(JSON.stringify(e))
    }
  }
}
```
```go Go
// Example coming soon.
```
```ruby Ruby
begin
  result = b.foo(my_data)
rescue Baml::ValidationError => e
  result = b.fixup_foo(JSON.generate(e))
end
```
```rust Rust
use myproject::baml_client::sync_client::B;

let result = match B.Foo.call(my_data) {
    Ok(result) => result,
    Err(e) => {
        // Attempt fixup on errors
        B.FixupFoo.call(&format!("{:?}", e)).unwrap()
    }
};
```
LLMs are good at reconstituting data, so you can often use a less powerful model for your fixup function than the model that produced the original data. How hard it is to produce valid JSON depends on the complexity of the schema and the details of your data payload, so be sure to test your fixup function on realistic payloads before moving to a smaller model.
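One way to run that test is to replay previously captured failure payloads (e.g. the `JSON.stringify(e)` strings from earlier `BamlValidationError`s) through a candidate fixup function and count how many it recovers. The harness below is a hypothetical helper, not part of BAML; you would pass `(p) => b.FixupFoo(p)` as `fixup`:

```typescript
type FixupFn = (payload: string) => Promise<unknown>

// Replay captured failure payloads through a fixup function and
// count how many it recovers, i.e. resolves without throwing.
async function evaluateFixup(fixup: FixupFn, payloads: string[]): Promise<number> {
  let fixed = 0
  for (const payload of payloads) {
    try {
      await fixup(payload)
      fixed += 1
    } catch {
      // Still failing after fixup — keep this payload for manual inspection.
    }
  }
  return fixed
}
```

Running this once with your current model and once with the candidate smaller model gives you a direct recovery-rate comparison before you switch.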