
OpenRouter


OpenRouter provides a standardized API for accessing many LLMs at competitive prices. You can learn more on their homepage.

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

```python
%pip install llama-index-llms-openrouter
```

```python
!pip install llama-index
```

```python
from llama_index.llms.openrouter import OpenRouter
from llama_index.core.llms import ChatMessage
```

Call chat with ChatMessage List

You need to either set the OPENROUTER_API_KEY environment variable or pass api_key to the class constructor.
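A common pattern is to prefer an explicitly passed key and fall back to the environment, failing fast when neither is set. A minimal sketch of that logic (the resolve_api_key helper below is illustrative, not part of llama-index):

```python
import os


def resolve_api_key(explicit_key=None):
    # Prefer an explicitly passed key, then fall back to the environment.
    key = explicit_key or os.environ.get("OPENROUTER_API_KEY")
    if key is None:
        raise ValueError("Set OPENROUTER_API_KEY or pass api_key explicitly")
    return key
```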

```python
# import os
# os.environ["OPENROUTER_API_KEY"] = "<your-api-key>"

llm = OpenRouter(
    api_key="<your-api-key>",
    max_tokens=256,
    context_window=4096,
    model="gryphe/mythomax-l2-13b",
)
```

```python
message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)
```

Streaming

```python
message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")
```
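Each chunk yielded by the stream carries only the newly generated text in its delta; concatenating the deltas reproduces the full response. A minimal sketch of that accumulation (the deltas here are hard-coded stand-ins, not real model output):

```python
# Stand-in deltas, in the order a stream_chat() loop would yield them.
deltas = ["Once ", "upon ", "a ", "time..."]

full_text = ""
for delta in deltas:
    full_text += delta  # same pattern as printing r.delta with end=""

print(full_text)
```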

Call complete with Prompt

```python
resp = llm.complete("Tell me a joke")
print(resp)
```

```python
resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
    print(r.delta, end="")
```

Model Configuration

```python
# View available models at https://openrouter.ai/models
# This example uses Mistral's mixture-of-experts model, Mixtral:
llm = OpenRouter(model="mistralai/mixtral-8x7b-instruct")
```

```python
resp = llm.complete("Write a story about a dragon who can code in Rust")
print(resp)
```
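When setting context_window and max_tokens in the constructor, the prompt plus the requested completion must fit within the model's context window. A rough budgeting sketch (the ~4 characters per token heuristic and the fits_in_context helper are assumptions for illustration, not part of the OpenRouter or llama-index APIs):

```python
def fits_in_context(prompt: str, max_tokens: int, context_window: int) -> bool:
    # Rough heuristic: ~4 characters per token for English text.
    est_prompt_tokens = len(prompt) // 4 + 1
    # The prompt and the reserved completion budget must both fit.
    return est_prompt_tokens + max_tokens <= context_window


# A short prompt easily fits a 4096-token window with 256 tokens reserved.
print(fits_in_context("Tell me a joke", 256, 4096))
```

For accurate counts, use the tokenizer of the specific model rather than a character heuristic.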