
Databricks

docs/examples/llm/databricks.ipynb



Integrate with the Databricks LLM APIs.

Prerequisites

You will need a Databricks personal access token and the URL of a model serving endpoint in your workspace.

Setup

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

```python
%pip install llama-index-llms-databricks
```

```python
!pip install llama-index
```

```python
from llama_index.llms.databricks import Databricks
```

Export your API key and serving endpoint as environment variables:

```bash
export DATABRICKS_TOKEN=<your api key>
export DATABRICKS_SERVING_ENDPOINT=<your api serving endpoint>
```

Alternatively, you can pass your API key and serving endpoint to the LLM when you initialize it:

```python
llm = Databricks(
    model="databricks-dbrx-instruct",
    api_key="your_api_key",
    api_base="https://[your-work-space].cloud.databricks.com/serving-endpoints/",
)
```

A list of available foundation models can be found in the Databricks documentation.

Call complete with a prompt

```python
response = llm.complete("Explain the importance of open source LLMs")
```

```python
print(response)
```

Call chat with a list of messages

```python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.chat(messages)
```

```python
print(resp)
```
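Under the hood, `Databricks` talks to an OpenAI-compatible chat endpoint, so each `ChatMessage` corresponds to a `{"role": ..., "content": ...}` dict in the request body. A minimal sketch of that mapping with plain dicts (no network call; the exact payload fields are an assumption here):

```python
# Sketch (assumption): the serving endpoint receives an
# OpenAI-style chat payload built from the message list.
messages = [
    {"role": "system", "content": "You are a pirate with a colorful personality"},
    {"role": "user", "content": "What is your name"},
]
payload = {"model": "databricks-dbrx-instruct", "messages": messages}
print([m["role"] for m in payload["messages"]])  # ['system', 'user']
```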

Streaming

Using stream_complete endpoint

```python
response = llm.stream_complete("Explain the importance of open source LLMs")
```

```python
for r in response:
    print(r.delta, end="")
```

Using stream_chat endpoint

```python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.stream_chat(messages)
```

```python
for r in resp:
    print(r.delta, end="")
```
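In both streaming variants, each chunk's `delta` holds only the newly generated text, so the full response is the concatenation of all deltas. A minimal stand-in without any network call (`Chunk` and `fake_stream` are stand-ins for the real response objects):

```python
from dataclasses import dataclass

# Stand-in for a streaming response chunk: .delta holds only the
# newly generated text, as in stream_complete / stream_chat.
@dataclass
class Chunk:
    delta: str

def fake_stream():
    for piece in ["Open source ", "LLMs ", "matter."]:
        yield Chunk(delta=piece)

# The full response is the concatenation of the deltas.
text = "".join(chunk.delta for chunk in fake_stream())
print(text)  # Open source LLMs matter.
```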