docs/examples/llm/deepseek.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/deepseek.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
This is the DeepSeek integration for LlamaIndex. Visit DeepSeek for information on how to get an API key and which models are supported.
At the time of writing, you can use:
- deepseek-chat
- deepseek-reasoner

If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-deepseek
from llama_index.llms.deepseek import DeepSeek
# you can also set DEEPSEEK_API_KEY in your environment variables
llm = DeepSeek(model="deepseek-reasoner", api_key="your_api_key")
# You might also want to set deepseek as your default llm
# from llama_index.core import Settings
# Settings.llm = llm
response = llm.complete("Is 9.9 or 9.11 bigger?")
print(response)
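As the comment above notes, the API key can also be supplied via the `DEEPSEEK_API_KEY` environment variable instead of the constructor. A minimal sketch (the key value is a placeholder, not a real key):

```python
import os

# Set the key in the environment; DeepSeek(...) can then be constructed
# without an explicit api_key argument.
os.environ["DEEPSEEK_API_KEY"] = "your_api_key"

print(os.environ["DEEPSEEK_API_KEY"])
```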
Call chat with a list of messages

from llama_index.core.llms import ChatMessage
messages = [
ChatMessage(
role="system", content="You are a pirate with a colorful personality"
),
ChatMessage(
role="user", content="How many 'r's are in the word 'strawberry'?"
),
]
resp = llm.chat(messages)
print(resp)
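As a quick sanity check on the prompt above, the expected answer can be computed directly in Python:

```python
# Ground truth for the question posed to the model:
# count the occurrences of 'r' in "strawberry".
count = "strawberry".count("r")
print(count)  # → 3
```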
Using stream_complete endpoint
response = llm.stream_complete("Is 9.9 or 9.11 bigger?")
for r in response:
print(r.delta, end="")
Using stream_chat endpoint
from llama_index.core.llms import ChatMessage
messages = [
ChatMessage(
role="system", content="You are a pirate with a colorful personality"
),
ChatMessage(
role="user", content="How many 'r's are in the word 'strawberry'?"
),
]
resp = llm.stream_chat(messages)
for r in resp:
print(r.delta, end="")