docs/examples/llm/openrouter.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/openrouter.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
OpenRouter provides a standardized API for accessing many LLMs, routing each request to the best available price. You can find out more on their homepage.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-openrouter
!pip install llama-index
from llama_index.llms.openrouter import OpenRouter
from llama_index.core.llms import ChatMessage
Call chat with a list of ChatMessage
You need to either set the environment variable OPENROUTER_API_KEY or pass api_key to the class constructor.
# import os
# os.environ['OPENROUTER_API_KEY'] = '<your-api-key>'
llm = OpenRouter(
api_key="<your-api-key>",
max_tokens=256,
context_window=4096,
model="gryphe/mythomax-l2-13b",
)
message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)
message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
print(r.delta, end="")
Call complete with a prompt
resp = llm.complete("Tell me a joke")
print(resp)
resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
print(r.delta, end="")
# View options at https://openrouter.ai/models
# This example uses Mistral's mixture-of-experts model, Mixtral:
llm = OpenRouter(model="mistralai/mixtral-8x7b-instruct")
resp = llm.complete("Write a story about a dragon who can code in Rust")
print(resp)
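Under the hood, OpenRouter exposes an OpenAI-compatible chat-completions endpoint, which is what the OpenRouter LLM class wraps. As a rough sketch (the endpoint and payload shape follow OpenRouter's public API; the helper function, model name, and prompt are just illustrative), the request can be assembled with only the standard library:

```python
# Sketch: build an OpenRouter chat-completions request by hand.
# The URL and JSON shape follow OpenRouter's OpenAI-compatible API;
# build_chat_request is an illustrative helper, not part of llama-index.
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(api_key: str, model: str, prompt: str, max_tokens: int = 256):
    """Return (url, headers, body) for a single-turn chat completion."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return OPENROUTER_URL, headers, body

url, headers, body = build_chat_request(
    "<your-api-key>", "gryphe/mythomax-l2-13b", "Tell me a joke"
)
# To actually send it (requires a valid key):
# req = urllib.request.Request(url, data=body, headers=headers)
# resp = json.loads(urllib.request.urlopen(req).read())
# print(resp["choices"][0]["message"]["content"])
```

Because the wire format is OpenAI-compatible, the same messages list you pass as ChatMessage objects above maps directly onto the "messages" field here.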