# llama-index-llms-dashscope
## Installation

Install the required Python package:

```bash
pip install llama-index-llms-dashscope
```
## Setup

Set the DashScope API key as an environment variable:

```bash
export DASHSCOPE_API_KEY=YOUR_DASHSCOPE_API_KEY
```
Alternatively, you can set it in your Python script:

```python
import os

os.environ["DASHSCOPE_API_KEY"] = "YOUR_DASHSCOPE_API_KEY"
```
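If the key is missing, API calls fail later with an authentication error. A quick sanity check at startup (a minimal sketch using only the standard library) surfaces the problem immediately:

```python
import os

# Warn early if the key is missing, so authentication failures
# don't show up later as confusing API errors.
api_key = os.environ.get("DASHSCOPE_API_KEY", "")
if not api_key:
    print("Warning: DASHSCOPE_API_KEY is not set")
```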
## Basic Usage

To generate a basic completion:

```python
from llama_index.llms.dashscope import DashScope, DashScopeGenerationModels

# Initialize the DashScope LLM
dashscope_llm = DashScope(model_name=DashScopeGenerationModels.QWEN_MAX)

# Generate a completion
resp = dashscope_llm.complete("How to make cake?")
print(resp)
```
For real-time streamed responses:

```python
responses = dashscope_llm.stream_complete("How to make cake?")
for response in responses:
    print(response.delta, end="")
```
To have a multi-turn conversation with the assistant:

```python
from llama_index.core.base.llms.types import MessageRole, ChatMessage

messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]

# Get the first-round response
resp = dashscope_llm.chat(messages)
print(resp)

# Continue the conversation
messages.append(
    ChatMessage(role=MessageRole.ASSISTANT, content=resp.message.content)
)
messages.append(
    ChatMessage(role=MessageRole.USER, content="How to make it without sugar?")
)

# Get the second-round response
resp = dashscope_llm.chat(messages)
print(resp)
```
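The append-and-resend pattern above generalizes to any number of turns. Here is a minimal standalone sketch of that loop, using plain role/content tuples instead of `ChatMessage` so it runs without the SDK; with DashScope you would append `ChatMessage` objects and call `dashscope_llm.chat(messages)` each round:

```python
def add_turn(history, user_text, assistant_text):
    """Append one user/assistant exchange to the running history."""
    history.append(("user", user_text))
    history.append(("assistant", assistant_text))
    return history

# Start from a system prompt, then record each round of the conversation.
# The full history is resent to the model every round, which is what
# gives the assistant context for follow-up questions.
history = [("system", "You are a helpful assistant.")]
add_turn(history, "How to make cake?", "Here is a basic recipe...")
add_turn(history, "How to make it without sugar?", "Try a sugar substitute...")
```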
To ask for a sugar-free cake recipe in a single completion:

```python
resp = dashscope_llm.complete("How to make cake without sugar?")
print(resp)
```
## LLM Implementation Example

For a complete walkthrough, see the example notebook:
https://docs.llamaindex.ai/en/stable/examples/llm/dashscope/