DashScope LLMs

docs/examples/llm/dashscope.ipynb


<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/dashscope.ipynb" target="_parent">Open in Colab</a>


In this notebook, we show how to use the DashScope LLM models in LlamaIndex. Check out the DashScope site or the official documentation for more details.

If you're opening this Notebook on colab, you will need to install LlamaIndex 🦙 and the DashScope Python SDK.

python
!pip install llama-index-llms-dashscope

Basic Usage

You will need to log in to DashScope and create an API key. Once you have one, you can either pass it explicitly when constructing the model, or use the `DASHSCOPE_API_KEY` environment variable.

python
%env DASHSCOPE_API_KEY=YOUR_DASHSCOPE_API_KEY
python
import os

os.environ["DASHSCOPE_API_KEY"] = "YOUR_DASHSCOPE_API_KEY"

Initialize DashScope Object

python
from llama_index.llms.dashscope import DashScope, DashScopeGenerationModels

dashscope_llm = DashScope(model_name=DashScopeGenerationModels.QWEN_MAX)
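If you prefer not to rely on the environment variable, you can read the key yourself and pass it in explicitly. A minimal sketch, assuming the `DashScope` constructor accepts an `api_key` argument (the construction line is commented out so the snippet runs without credentials):

```python
import os

# Read the key from the environment, falling back to a placeholder.
api_key = os.getenv("DASHSCOPE_API_KEY", "YOUR_DASHSCOPE_API_KEY")

# Assumed signature: pass the key directly instead of using the env var.
# dashscope_llm = DashScope(
#     model_name=DashScopeGenerationModels.QWEN_MAX, api_key=api_key
# )
```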

Call `complete` with a prompt

python
resp = dashscope_llm.complete("How to make cake?")
print(resp)

Call `stream_complete` with a prompt

python
responses = dashscope_llm.stream_complete("How to make cake?")
for response in responses:
    print(response.delta, end="")
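Each streamed chunk carries only the newly generated text in its `delta` attribute, so concatenating the deltas reconstructs the full completion. A minimal sketch of that accumulation pattern, using a stub generator (hypothetical, for illustration) in place of the DashScope stream:

```python
# Stub standing in for dashscope_llm.stream_complete(...); each item
# mimics the incremental text a real response.delta would carry.
def fake_stream():
    for delta in ["How ", "to ", "make ", "cake"]:
        yield delta

# Accumulate the deltas into the full completion text.
full_text = ""
for delta in fake_stream():
    full_text += delta

print(full_text)  # How to make cake
```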

Call `chat` with a list of messages

python
from llama_index.core.base.llms.types import MessageRole, ChatMessage

messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]
resp = dashscope_llm.chat(messages)
print(resp)

Using `stream_chat`

python
responses = dashscope_llm.stream_chat(messages)
for response in responses:
    print(response.delta, end="")

Multi-round conversation

python
messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]
# first round
resp = dashscope_llm.chat(messages)
print(resp)

# add response to messages.
messages.append(
    ChatMessage(role=MessageRole.ASSISTANT, content=resp.message.content)
)

messages.append(
    ChatMessage(role=MessageRole.USER, content="How to make it without sugar")
)
# second round
resp = dashscope_llm.chat(messages)
print(resp)
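The multi-round pattern above boils down to appending each assistant reply back onto the message history before sending the next user turn. A minimal sketch of that bookkeeping, using plain dicts and a stub LLM (both hypothetical, for illustration) instead of `ChatMessage` and the real DashScope client:

```python
# Hypothetical stub standing in for dashscope_llm; a real LLM would
# generate a reply from the full message history.
class StubLLM:
    def chat(self, messages):
        return f"(reply to: {messages[-1]['content']})"

def ask(llm, history, user_text):
    """Append the user turn, get a reply, and record it in the history."""
    history.append({"role": "user", "content": user_text})
    reply = llm.chat(history)
    history.append({"role": "assistant", "content": reply})
    return reply

llm = StubLLM()
history = [{"role": "system", "content": "You are a helpful assistant."}]

ask(llm, history, "How to make cake?")
ask(llm, history, "How to make it without sugar")

# system + 2 user turns + 2 assistant turns = 5 messages
print(len(history))
```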