docs/agentic-tools/ai-sdk/getting-started.mdx
`@upstash/context7-tools-ai-sdk` provides Vercel AI SDK-compatible tools and agents that give your AI applications access to up-to-date library documentation.
When building AI-powered applications with the Vercel AI SDK, your models often need accurate information about libraries and frameworks. Instead of relying on potentially outdated training data, Context7 tools let your AI fetch current documentation on-demand, ensuring responses include correct API usage, current best practices, and working code examples.
The package gives you two ways to integrate:
1. **Individual tools** (`resolveLibraryId` and `queryDocs`) that you add to your existing `generateText` or `streamText` calls
2. **A pre-configured agent** (`Context7Agent`) that handles the entire documentation lookup workflow automatically

Both approaches work with any AI provider supported by the Vercel AI SDK, including OpenAI, Anthropic, Google, and others.
```bash
pnpm add @upstash/context7-tools-ai-sdk
yarn add @upstash/context7-tools-ai-sdk
bun add @upstash/context7-tools-ai-sdk
```
You'll need:

- A Context7 API key
- An AI provider package (e.g., `@ai-sdk/openai`, `@ai-sdk/anthropic`)

Set your Context7 API key as an environment variable:

```bash
CONTEXT7_API_KEY=ctx7sk-...
```

The tools and agents will automatically use this key.
The simplest way to add documentation lookup to your AI application:
```typescript
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { generateText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-5.2"),
  prompt: "How do I create a server action in Next.js?",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

console.log(text);
```
For a more streamlined experience, use the pre-configured agent:
```typescript
import { Context7Agent } from "@upstash/context7-tools-ai-sdk";
import { anthropic } from "@ai-sdk/anthropic";

const agent = new Context7Agent({
  model: anthropic("claude-sonnet-4-20250514"),
});

const { text } = await agent.generate({
  prompt: "How do I use React Server Components?",
});

console.log(text);
```
For streaming responses:
```typescript
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { streamText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const { textStream } = streamText({
  model: openai("gpt-5.2"),
  prompt: "Explain how to use TanStack Query for data fetching",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```
You can also pass the API key directly if needed:
```typescript
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";

const tools = {
  resolveLibraryId: resolveLibraryId({ apiKey: "your-api-key" }),
  queryDocs: queryDocs({ apiKey: "your-api-key" }),
};
```
The tools follow a two-step workflow:
1. `resolveLibraryId` - Searches Context7's database to find the correct library ID for a given query (e.g., "react" → `/reactjs/react.dev`)
2. `queryDocs` - Fetches documentation for the resolved library, using the user's query to retrieve relevant content
The AI model orchestrates these tools automatically based on the user's prompt, fetching relevant documentation before generating a response.
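The two-step flow can be sketched with plain stand-in functions. Note that the function names, signatures, and return shapes below are illustrative only, not the package's actual API; the real tools call Context7's service over the network, while these stubs just mimic the data flow between the two steps.

```typescript
// Hypothetical stand-ins illustrating the two-step lookup the model performs.
type LibraryMatch = { id: string; name: string };

function resolveLibraryIdStub(query: string): LibraryMatch {
  // Step 1: map a loose library name to a canonical Context7 library ID.
  const index: Record<string, string> = { react: "/reactjs/react.dev" };
  return { id: index[query.toLowerCase()] ?? `/unknown/${query}`, name: query };
}

function queryDocsStub(libraryId: string, question: string): string {
  // Step 2: fetch documentation for the resolved library,
  // filtered by the user's question.
  return `docs for ${libraryId} relevant to "${question}"`;
}

const match = resolveLibraryIdStub("react");
const docs = queryDocsStub(match.id, "How do I use Server Components?");
console.log(docs);
```

In a real run, the model decides when to call each tool: it resolves the library ID first, then passes that ID into the docs query, and only then writes its answer.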