DeepSeek

docs/examples/llm/deepseek.ipynb


LlamaIndex Llms Integration: DeepSeek

This is the DeepSeek integration for LlamaIndex. Visit DeepSeek for information on how to get an API key and which models are supported.

At the time of writing, you can use:

  • deepseek-chat
  • deepseek-reasoner

Setup

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

```python
%pip install llama-index-llms-deepseek
```
```python
from llama_index.llms.deepseek import DeepSeek

# you can also set DEEPSEEK_API_KEY in your environment variables
llm = DeepSeek(model="deepseek-reasoner", api_key="your_api_key")

# You might also want to set deepseek as your default llm
# from llama_index.core import Settings
# Settings.llm = llm
```
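As the comment above notes, the key can also come from the environment instead of being hard-coded. A minimal sketch of that pattern (assuming the integration picks up `DEEPSEEK_API_KEY` when `api_key` is not passed):

```python
import os

# Set the key once (in practice, export it in your shell rather than in code).
os.environ.setdefault("DEEPSEEK_API_KEY", "your_api_key")

# With the variable set, the constructor can be called without api_key:
# llm = DeepSeek(model="deepseek-reasoner")
print(os.environ["DEEPSEEK_API_KEY"])
```

Keeping the key out of source files makes notebooks safe to share and commit.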
```python
response = llm.complete("Is 9.9 or 9.11 bigger?")
```
```python
print(response)
```

Call chat with a list of messages

```python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(
        role="user", content="How many 'r's are in the word 'strawberry'?"
    ),
]
resp = llm.chat(messages)
```
```python
print(resp)
```
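Under the hood, requests go to DeepSeek's OpenAI-compatible chat completions API, so the `ChatMessage` list above maps onto a plain role/content payload. A hedged sketch of that shape (not the integration's actual serialization code):

```python
import json

# Each ChatMessage becomes a {"role": ..., "content": ...} dict in the request.
messages = [
    {"role": "system", "content": "You are a pirate with a colorful personality"},
    {"role": "user", "content": "How many 'r's are in the word 'strawberry'?"},
]
payload = json.dumps({"model": "deepseek-reasoner", "messages": messages})
print(payload)
```

This is why the `system`/`user` role names mirror the OpenAI message format.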

Streaming

Using stream_complete endpoint

```python
response = llm.stream_complete("Is 9.9 or 9.11 bigger?")
```
```python
for r in response:
    print(r.delta, end="")
```

Using stream_chat endpoint

```python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(
        role="user", content="How many 'r's are in the word 'strawberry'?"
    ),
]
resp = llm.stream_chat(messages)
```
```python
for r in resp:
    print(r.delta, end="")
```
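In both streaming loops, each chunk's `delta` holds only the newly generated text, so concatenating the deltas reconstructs the full response. A small stdlib sketch of that accumulation pattern (simulated deltas, not a live call):

```python
def fake_stream():
    # Stand-in for llm.stream_complete(...): yields incremental text deltas.
    for delta in ["9.9 ", "is ", "bigger."]:
        yield delta

full = ""
for delta in fake_stream():
    full += delta  # accumulate instead of (or in addition to) printing
print(full)  # 9.9 is bigger.
```

The same pattern lets you show tokens as they arrive while still keeping the complete text for later use.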