# Letta
The Letta AI SDK provider allows you to use Letta agents with the AI SDK.

See the Letta documentation for the complete list of supported models. For models not on the list, you can try configuring Letta to use an OpenAI proxy.
You can install the Letta provider with:
<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
  <Tab>
    <Snippet text="pnpm add @letta-ai/vercel-ai-sdk-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install @letta-ai/vercel-ai-sdk-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add @letta-ai/vercel-ai-sdk-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="bun add @letta-ai/vercel-ai-sdk-provider" dark />
  </Tab>
</Tabs>
You can import the provider instances `lettaCloud` or `lettaLocal` from `@letta-ai/vercel-ai-sdk-provider`:
```ts
// For cloud users
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';

// For self-hosted users
import { lettaLocal } from '@letta-ai/vercel-ai-sdk-provider';

// Create a custom Letta provider
import { createLetta } from '@letta-ai/vercel-ai-sdk-provider';

const letta = createLetta({
  baseUrl: '<your-base-url>',
  token: '<your-access-token>',
});
```
Get your API key from the Letta dashboard.
```bash
# .env
LETTA_API_KEY=your-letta-api-key
```
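The examples below call `lettaCloud()` without passing a token, so the provider is expected to read `LETTA_API_KEY` from the environment. A fail-fast check can surface a missing key early (a sketch; `requireLettaKey` is a hypothetical helper, not part of the provider):

```typescript
// Hypothetical helper: fail fast if the Letta API key is missing.
// LETTA_API_KEY matches the variable name in the .env example above.
function requireLettaKey(
  env: Record<string, string | undefined> = process.env,
): string {
  const key = env.LETTA_API_KEY;
  if (!key) {
    throw new Error('LETTA_API_KEY is not set');
  }
  return key;
}
```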
```ts
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';
import { generateText } from 'ai';

const result = await generateText({
  // Model configuration (LLM, temperature, etc.) is managed through your Letta agent
  model: lettaCloud(),
  providerOptions: {
    letta: {
      agent: { id: 'your-agent-id' },
    },
  },
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

console.log(result.text);
```
You can configure Letta-specific settings using the `providerOptions.letta` parameter:
```ts
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';
import { streamText } from 'ai';

const result = streamText({
  model: lettaCloud(),
  providerOptions: {
    letta: {
      agent: {
        id: 'your-agent-id',
        maxSteps: 100,
        includePings: true,
        streamTokens: true,
      },
      timeoutInSeconds: 300,
    },
  },
  prompt: 'Tell me a story about a robot learning to paint',
});
```
Configure agent-specific parameters for message creation with the `agent` option. These settings apply to both streaming and non-streaming operations.

Available options:

- `id` (string, required): The ID of your Letta agent
- `maxSteps` (number): Maximum number of agent execution steps
- `includePings` (boolean): Whether to include ping messages in the stream
- `streamTokens` (boolean): Enable token-by-token streaming
- `background` (boolean): Enable background execution for long-running operations

The `timeoutInSeconds` option (type: `number`, default: `1000`) sets the maximum wait time (in seconds) for agent responses. This is important for long-running agent operations or when working with complex reasoning chains.
```ts
const result = streamText({
  model: lettaCloud(),
  providerOptions: {
    letta: {
      agent: { id: 'your-agent-id' },
      timeoutInSeconds: 300, // Wait up to 5 minutes
    },
  },
  prompt: 'Process this complex task...',
});
```
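For reference, the options above can be sketched as a TypeScript shape with the documented default applied. This is informal: the provider package ships its own types, so treat these interface names as illustrative, not as official exports.

```typescript
// Informal shape mirroring the documented provider options.
interface LettaAgentOptions {
  id: string; // required: the Letta agent ID
  maxSteps?: number; // maximum agent execution steps
  includePings?: boolean; // include ping messages in the stream
  streamTokens?: boolean; // token-by-token streaming
  background?: boolean; // background execution for long-running work
}

interface LettaProviderOptions {
  agent: LettaAgentOptions;
  timeoutInSeconds?: number; // documented default: 1000
}

// Apply the documented default timeout when the caller omits it.
function withDefaults(options: LettaProviderOptions): LettaProviderOptions {
  return { timeoutInSeconds: 1000, ...options };
}
```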
Letta agents support custom tools and MCP (Model Context Protocol) servers through provider-executed tools. Once configured on your Letta agent, you can include them in your requests using `letta.tool()`, and Letta handles tool execution.
```ts
import { streamText } from 'ai';
import { z } from 'zod';
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';

// Use with streaming
const result = streamText({
  model: lettaCloud(),
  tools: {
    web_search: lettaCloud.tool('web_search'),
    memory_insert: lettaCloud.tool('memory_insert'),
    memory_replace: lettaCloud.tool('memory_replace'),
    core_memory_append: lettaCloud.tool('core_memory_append'),
    my_custom_tool: lettaCloud.tool('my_custom_tool'),
    // Optionally provide description and schema (placeholders only - execution handled by Letta)
    typed_query: lettaCloud.tool('typed_query', {
      description: 'Query with typed parameters',
      inputSchema: z.object({
        query: z.string(),
      }),
    }),
  },
  providerOptions: {
    letta: {
      agent: { id: 'your-agent-id' },
    },
  },
  prompt: 'Tell me a story about a robot learning to paint',
});
```
The `vercel-ai-sdk-provider` extends `@letta-ai/letta-client`, so you can access client operations directly via `lettaCloud.client`, `lettaLocal.client`, or the `client` property of your custom `createLetta` instance.
```ts
// With Letta Cloud
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';

lettaCloud.client.agents.list();

// With Letta Local
import { lettaLocal } from '@letta-ai/vercel-ai-sdk-provider';

lettaLocal.client.agents.list();
```
For more information on the Letta API, please refer to the Letta API documentation.