content/providers/01-ai-sdk-providers/30-deepseek.mdx
The DeepSeek provider offers access to powerful language models through the DeepSeek API.
API keys can be obtained from the DeepSeek Platform.
The DeepSeek provider is available via the `@ai-sdk/deepseek` module. You can install it with:
<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
  <Tab>
    <Snippet text="pnpm add @ai-sdk/deepseek" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install @ai-sdk/deepseek" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add @ai-sdk/deepseek" dark />
  </Tab>
  <Tab>
    <Snippet text="bun add @ai-sdk/deepseek" dark />
  </Tab>
</Tabs>
You can import the default provider instance `deepseek` from `@ai-sdk/deepseek`:

```ts
import { deepseek } from '@ai-sdk/deepseek';
```
For custom configuration, you can import `createDeepSeek` and create a provider instance with your settings:

```ts
import { createDeepSeek } from '@ai-sdk/deepseek';

const deepseek = createDeepSeek({
  apiKey: process.env.DEEPSEEK_API_KEY ?? '',
});
```
You can use the following optional settings to customize the DeepSeek provider instance:
- **baseURL** _string_

  Use a different URL prefix for API calls.
  The default prefix is `https://api.deepseek.com`.

- **apiKey** _string_

  API key that is sent using the `Authorization` header. It defaults to
  the `DEEPSEEK_API_KEY` environment variable.

- **headers** _Record&lt;string,string&gt;_

  Custom headers to include in the requests.

- **fetch** _(input: RequestInfo, init?: RequestInit) =&gt; Promise&lt;Response&gt;_

  Custom fetch implementation.
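As a sketch, these settings can be combined when creating a custom instance. The `X-Request-Source` header and the logging fetch wrapper below are illustrative, not required by the SDK:

```ts
import { createDeepSeek } from '@ai-sdk/deepseek';

// Illustrative fetch wrapper that logs each request URL before
// delegating to the global fetch.
const loggingFetch = async (
  input: RequestInfo | URL,
  init?: RequestInit,
): Promise<Response> => {
  const url =
    typeof input === 'string'
      ? input
      : input instanceof URL
        ? input.toString()
        : input.url;
  console.log('DeepSeek request:', url);
  return fetch(input, init);
};

const deepseek = createDeepSeek({
  baseURL: 'https://api.deepseek.com', // the default; point at a proxy if needed
  apiKey: process.env.DEEPSEEK_API_KEY ?? '',
  headers: { 'X-Request-Source': 'docs-example' }, // hypothetical header
  fetch: loggingFetch,
});
```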
You can create language models using a provider instance:

```ts
import { deepseek } from '@ai-sdk/deepseek';
import { generateText } from 'ai';

const { text } = await generateText({
  model: deepseek('deepseek-chat'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
You can also use the `.chat()` or `.languageModel()` factory methods:

```ts
const model = deepseek.chat('deepseek-chat');
// or
const model = deepseek.languageModel('deepseek-chat');
```
DeepSeek language models can also be used in the `streamText` function (see AI SDK Core).
The following optional provider options are available for DeepSeek models:

- **thinking** _object_

  Optional. Controls thinking mode (chain-of-thought reasoning). You can enable thinking mode either by using the `deepseek-reasoner` model or by setting this option.

  - `type`: `'enabled' | 'disabled'` - Enable or disable thinking mode.

```ts
import { deepseek, type DeepSeekLanguageModelOptions } from '@ai-sdk/deepseek';
import { generateText } from 'ai';

const { text, reasoning } = await generateText({
  model: deepseek('deepseek-chat'),
  prompt: 'How many "r"s are in the word "strawberry"?',
  providerOptions: {
    deepseek: {
      thinking: { type: 'enabled' },
    } satisfies DeepSeekLanguageModelOptions,
  },
});
```
DeepSeek supports reasoning for the `deepseek-reasoner` model. The reasoning is exposed through streaming:

```ts
import { deepseek } from '@ai-sdk/deepseek';
import { streamText } from 'ai';

const result = streamText({
  model: deepseek('deepseek-reasoner'),
  prompt: 'How many "r"s are in the word "strawberry"?',
});

for await (const part of result.fullStream) {
  if (part.type === 'reasoning') {
    // This is the reasoning text
    console.log('Reasoning:', part.text);
  } else if (part.type === 'text') {
    // This is the final answer
    console.log('Answer:', part.text);
  }
}
```
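When rendering in a UI you often want the reasoning and the answer accumulated separately. A minimal sketch, using a hypothetical helper (not part of the AI SDK) over parts shaped like the ones handled above:

```ts
// Hypothetical helper: accumulate reasoning text and answer text from an
// array of stream parts with the same shape as the loop above consumes.
function collectParts(parts: Array<{ type: string; text?: string }>): {
  reasoning: string;
  answer: string;
} {
  let reasoning = '';
  let answer = '';
  for (const part of parts) {
    if (part.type === 'reasoning') reasoning += part.text ?? '';
    else if (part.type === 'text') answer += part.text ?? '';
    // other part types (e.g. tool calls, finish events) are ignored here
  }
  return { reasoning, answer };
}

const collected = collectParts([
  { type: 'reasoning', text: 'Spell it out: s-t-r-a-w-b-e-r-r-y. ' },
  { type: 'reasoning', text: 'The letter "r" appears 3 times. ' },
  { type: 'text', text: 'There are 3 "r"s in "strawberry".' },
]);
console.log(collected.answer);
```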
See AI SDK UI: Chatbot for more details on how to integrate reasoning into your chatbot.
DeepSeek provides context caching on disk, which can significantly reduce token costs for repeated content. You can access the cache hit/miss metrics through the `providerMetadata` property of the response:

```ts
import { deepseek } from '@ai-sdk/deepseek';
import { generateText } from 'ai';

const result = await generateText({
  model: deepseek('deepseek-chat'),
  prompt: 'Your prompt here',
});

console.log(result.providerMetadata);
// Example output: { deepseek: { promptCacheHitTokens: 1856, promptCacheMissTokens: 5 } }
```
The metrics include:

- `promptCacheHitTokens`: Number of input tokens that were cached
- `promptCacheMissTokens`: Number of input tokens that were not cached

| Model | Text Generation | Object Generation | Image Input | Tool Usage | Tool Streaming |
| --- | --- | --- | --- | --- | --- |
| `deepseek-chat` | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Check size={18} /> | <Check size={18} /> |
| `deepseek-reasoner` | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Check size={18} /> | <Check size={18} /> |
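The cache metrics above can feed simple cost tracking. A sketch, using a hypothetical helper (not part of the SDK) to compute the share of prompt tokens served from cache:

```ts
// Hypothetical helper: fraction of prompt tokens served from DeepSeek's
// on-disk context cache, given the providerMetadata token counts.
function cacheHitRate(metrics: {
  promptCacheHitTokens: number;
  promptCacheMissTokens: number;
}): number {
  const total = metrics.promptCacheHitTokens + metrics.promptCacheMissTokens;
  return total === 0 ? 0 : metrics.promptCacheHitTokens / total;
}

// Using the example metadata from above:
const rate = cacheHitRate({ promptCacheHitTokens: 1856, promptCacheMissTokens: 5 });
console.log(`${(rate * 100).toFixed(1)}% of prompt tokens were cached`);
// → "99.7% of prompt tokens were cached"
```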