# durable-tools

A Rust library for defining and executing tools in a durable execution environment, backed by the `durable` crate.
This crate provides abstractions for building AI agent tools with durable execution guarantees. It supports two types of tools:

- **`TaskTool`**: Durable tools that run as full durable tasks. They can call other tools, checkpoint progress, and spawn subtasks.
- **`SimpleTool`**: Lightweight tools that run inside a `TaskTool`'s `step()` checkpoint. Simpler to implement, but they cannot call other tools.

All tools implement the `ToolMetadata` trait for metadata, plus either `TaskTool` or `SimpleTool` for execution.
## `ToolMetadata`

Provides the tool's name, description, and parameter types. The parameter schema is automatically derived from `LlmParams`:
```rust
pub trait ToolMetadata: Send + Sync + 'static {
    type LlmParams: Serialize + DeserializeOwned + JsonSchema + Send + Sync + 'static;
    type SideInfo: SideInfo;
    type Output: Serialize + DeserializeOwned + Send + 'static;

    fn name(&self) -> Cow<'static, str>;
    fn description(&self) -> Cow<'static, str>;

    // Automatically derived from LlmParams - override only if needed
    fn parameters_schema(&self) -> Schema { ... }
}
```
## `TaskTool`

For complex, durable operations that may need to call other tools or checkpoint progress:
```rust
#[async_trait]
pub trait TaskTool: ToolMetadata {
    async fn execute(
        llm_params: <Self as ToolMetadata>::LlmParams,
        side_info: <Self as ToolMetadata>::SideInfo,
        ctx: &mut ToolContext,
    ) -> ToolResult<<Self as ToolMetadata>::Output>;
}
```
## `SimpleTool`

For simple, stateless operations like API calls or database queries:
```rust
#[async_trait]
pub trait SimpleTool: ToolMetadata {
    async fn execute(
        llm_params: <Self as ToolMetadata>::LlmParams,
        side_info: <Self as ToolMetadata>::SideInfo,
        ctx: SimpleToolContext<'_>,
        idempotency_key: &str,
    ) -> ToolResult<<Self as ToolMetadata>::Output>;
}
```
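The `idempotency_key` lets a tool avoid repeating a non-idempotent side effect when its step is retried: the same key is delivered again on replay. A minimal sketch of that deduplication pattern using only the standard library — the `Cache` type here is hypothetical, and a real tool would persist results durably (e.g. in a database table) rather than in memory:

```rust
use std::collections::HashMap;

// Hypothetical cache keyed by idempotency key. This stands in for
// whatever durable storage a real tool would use.
struct Cache {
    seen: HashMap<String, String>,
}

impl Cache {
    fn new() -> Self {
        Cache { seen: HashMap::new() }
    }

    // Run `op` at most once per idempotency key; on a retry with the
    // same key, replay the stored result instead of repeating the effect.
    fn run_once(&mut self, idempotency_key: &str, op: impl FnOnce() -> String) -> String {
        self.seen
            .entry(idempotency_key.to_string())
            .or_insert_with(op)
            .clone()
    }
}

fn main() {
    let mut cache = Cache::new();
    let first = cache.run_once("step-1", || "called external API".to_string());
    // A retry with the same key replays the cached result; the closure is not run.
    let retry = cache.run_once("step-1", || "called external API AGAIN".to_string());
    assert_eq!(first, retry);
    println!("{retry}");
}
```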
## Core types

| Type | Description |
|---|---|
| `ToolExecutor` | High-level orchestrator for registering and spawning tools |
| `ToolExecutorBuilder` | Builder for configuring `ToolExecutor` |
| `ToolContext` | Context passed to `TaskTool::execute()` with checkpointing and tool-calling capabilities |
| `SimpleToolContext` | Simplified context passed to `SimpleTool::execute()` with database and inference access |
| `ToolRegistry` | Registry of tools for lookup and OpenAI function-schema generation |
| `ToolAppState` | Application state passed to all tools (pool, registry, inference client) |
| `DurableClient` | Type alias for `Durable<ToolAppState>` |
| `ToolError` / `ToolResult` | Error types for tool execution |
| `SideInfo` | Marker trait for side-information types (hidden from the LLM) |
| `TensorZeroClient` | Trait for TensorZero inference backends |
| `TensorZeroClientError` | Error type for TensorZero client operations |
## Parameter types

Tools have two parameter types that serve different purposes:

- **`LlmParams`**: Parameters intended to be generated by a tool-calling LLM. The JSON schema shown to LLMs is generated from this type (via the `JsonSchema` derive). This is what the LLM sees and fills in when calling the tool.
- **`SideInfo`**: Internal context passed at spawn time, hidden from the LLM and not included in the tool's JSON schema. Use `()` if no side information is needed.
```rust
// Parameters the LLM fills in (schema generated from this)
#[derive(Serialize, Deserialize, JsonSchema)]
struct MyToolParams {
    query: String,
}

// Internal context hidden from LLM
#[derive(Serialize, Deserialize)]
struct MyToolContext {
    session_id: Uuid,
}

impl SideInfo for MyToolContext {}

impl ToolMetadata for MyTool {
    type LlmParams = MyToolParams;  // LLM sees this
    type SideInfo = MyToolContext;  // Hidden from LLM
    type Output = MyToolOutput;
    // ...
}

// At spawn time, both are provided as JSON:
executor.spawn_tool_by_name(
    "my_tool",
    serde_json::to_value(llm_params)?, // From LLM tool call
    serde_json::to_value(side_info)?,  // Internal context
    episode_id,
).await?;
```
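For illustration only: the schema the LLM is shown for `MyToolParams` is derived solely from the struct's fields, so `session_id` never appears in it. With `schemars` the generated schema looks roughly like this (exact metadata keys vary by schemars version and derive attributes):

```json
{
  "type": "object",
  "properties": {
    "query": { "type": "string" }
  },
  "required": ["query"]
}
```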
## Context management

Tools can be wrapped with context-management strategies to handle large outputs that might overwhelm LLM context windows. The `ctx` parameter namespace is reserved for context-management parameters (filtering, pagination, etc.) and should not be used by inner tools.
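As a hypothetical illustration of the reserved namespace, a context-managed wrapper might receive LLM arguments shaped like this, where the top-level `ctx` object carries the wrapper's parameters and the remaining fields belong to the inner tool (the specific keys shown are assumptions, not this crate's API):

```json
{
  "query": "rust durable execution",
  "ctx": {
    "page": 2,
    "max_output_tokens": 1000
  }
}
```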
## Re-exports

The crate re-exports commonly needed types:

- `async_trait` - for implementing tool traits
- `schemars` - for parameter schema generation
- `SpawnOptions`, `SpawnResult`, `TaskHandle`, `WorkerOptions` - from `durable`
- `http_gateway_client`, `embedded_gateway_client` - inference client constructors
- `ClientInferenceParams`, `InferenceParams`, `InferenceResponse`, `Input`, `InputMessage`, `InputMessageContent`, `Role`, `TensorZeroError` - TensorZero inference types

## Example

```rust
use durable_tools::{
    SimpleTool, TaskTool, ToolContext, SimpleToolContext, ToolMetadata,
    ToolExecutor, ToolResult, async_trait, WorkerOptions,
    http_gateway_client,
};
use schemars::JsonSchema;
use secrecy::SecretString;
use serde::{Deserialize, Serialize};
use std::borrow::Cow;
use uuid::Uuid;

// Define a SimpleTool
#[derive(Serialize, Deserialize, JsonSchema)]
struct SearchParams { query: String }

#[derive(Serialize, Deserialize)]
struct SearchResult { results: Vec<String> }

#[derive(Default)]
struct SearchTool;

impl ToolMetadata for SearchTool {
    type SideInfo = ();
    type Output = SearchResult;
    type LlmParams = SearchParams;

    fn name(&self) -> Cow<'static, str> {
        Cow::Borrowed("search")
    }

    fn description(&self) -> Cow<'static, str> {
        Cow::Borrowed("Search the web")
    }

    // parameters_schema() is automatically derived from LlmParams
}

#[async_trait]
impl SimpleTool for SearchTool {
    async fn execute(
        llm_params: <Self as ToolMetadata>::LlmParams,
        _side_info: <Self as ToolMetadata>::SideInfo,
        _ctx: SimpleToolContext<'_>,
        _idempotency_key: &str,
    ) -> ToolResult<<Self as ToolMetadata>::Output> {
        // Implementation...
        Ok(SearchResult { results: vec![] })
    }
}

// Define a TaskTool that calls the SimpleTool
#[derive(Serialize, Deserialize, JsonSchema)]
struct ResearchParams { topic: String }

#[derive(Serialize, Deserialize)]
struct ResearchResult { summary: String }

struct ResearchTool;

impl ToolMetadata for ResearchTool {
    type SideInfo = ();
    type Output = ResearchResult;
    type LlmParams = ResearchParams;

    fn name(&self) -> Cow<'static, str> {
        Cow::Borrowed("research")
    }

    fn description(&self) -> Cow<'static, str> {
        Cow::Borrowed("Research a topic")
    }

    // parameters_schema() is automatically derived from LlmParams
}

#[async_trait]
impl TaskTool for ResearchTool {
    async fn execute(
        llm_params: <Self as ToolMetadata>::LlmParams,
        _side_info: <Self as ToolMetadata>::SideInfo,
        ctx: &mut ToolContext,
    ) -> ToolResult<<Self as ToolMetadata>::Output> {
        // Call another tool
        let _search = ctx
            .call_tool("search", serde_json::json!({"query": llm_params.topic}))
            .await?;

        // Use a checkpointed step
        let summary = ctx
            .step("summarize", (), |(), _state| async {
                Ok("Summary of results".to_string())
            })
            .await?;

        Ok(ResearchResult { summary })
    }
}

// Setup and run
#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let t0_client = http_gateway_client(url::Url::parse("http://localhost:3000")?)?;

    let executor = ToolExecutor::builder()
        .database_url(std::env::var("DATABASE_URL")?.into())
        .queue_name("tools")
        .t0_client(t0_client)
        .register_simple_tool_instance(SearchTool)?
        .register_task_tool_instance(ResearchTool)?
        .build()
        .await?;

    // Create the queue (required before spawning)
    executor.durable().create_queue(None).await?;

    // Spawn a tool execution by name
    let episode_id = Uuid::now_v7();
    executor.spawn_tool_by_name(
        "research",
        serde_json::json!({"topic": "rust"}),
        serde_json::json!(null), // No side info
        episode_id,
    ).await?;

    // Start a worker to process tasks
    let worker = executor.start_worker(WorkerOptions::default()).await;

    // ... worker processes tasks until shutdown
    worker.shutdown().await;

    Ok(())
}
```
## Testing

Run the unit tests:

```shell
cargo test --lib
```

The integration tests use `#[sqlx::test]` with the `durable` migrator to automatically set up the database schema:

```shell
# Integration tests only
DATABASE_URL="postgres://postgres:postgres@localhost:5433/test" cargo test --test integration

# All tests
DATABASE_URL="postgres://postgres:postgres@localhost:5433/test" cargo test
```

Unit tests (34 tests in `src/tests.rs`) cover:

- `is_durable()` and `iter()` with `Tool::try_from`
- `ToolError` <-> `TaskError` conversions

Integration tests (5 tests in `tests/integration.rs`) cover:

- `execute_erased` serialization/deserialization