Clarifai LLM

docs/examples/llm/clarifai.ipynb


<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/clarifai.ipynb" target="_parent">Open in Colab</a>


An example notebook showing how to call different LLM models using Clarifai.

If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.

python
%pip install llama-index-llms-clarifai
python
!pip install llama-index

Install the Clarifai package

python
!pip install clarifai

Set your Clarifai PAT (Personal Access Token) as an environment variable.

python
import os

os.environ["CLARIFAI_PAT"] = "<YOUR CLARIFAI PAT>"
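Hard-coding the PAT is fine for a quick demo, but a small helper that reads it from the environment and fails fast avoids accidentally committing the token. A minimal sketch (the `get_clarifai_pat` helper is illustrative, not part of the Clarifai SDK):

```python
import os


def get_clarifai_pat() -> str:
    """Fetch the Clarifai PAT from the environment, failing fast if unset."""
    pat = os.environ.get("CLARIFAI_PAT")
    if not pat:
        raise EnvironmentError(
            "CLARIFAI_PAT is not set; create a PAT in your Clarifai "
            "account's security settings and export it first."
        )
    return pat
```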

Import the Clarifai LLM class

python
from llama_index.llms.clarifai import Clarifai

Explore various models according to your preference on the Clarifai Models page.

python
# Example parameters
params = dict(
    user_id="clarifai",
    app_id="ml",
    model_name="llama2-7b-alternative-4k",
    model_url=(
        "https://clarifai.com/clarifai/ml/models/llama2-7b-alternative-4k"
    ),
)

Initialize the LLM

python
# Method 1: using the model_url parameter
llm_model = Clarifai(model_url=params["model_url"])
python
# Method 2: using the model_name, app_id and user_id parameters
llm_model = Clarifai(
    model_name=params["model_name"],
    app_id=params["app_id"],
    user_id=params["user_id"],
)
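The `model_url` encodes the same three identifiers used by the second method, so the two initialization styles are interchangeable. A small sketch that recovers them from a URL of the form `https://clarifai.com/<user_id>/<app_id>/models/<model_name>` (the `parse_clarifai_model_url` helper is hypothetical, not part of the SDK):

```python
from urllib.parse import urlparse


def parse_clarifai_model_url(url: str) -> dict:
    """Split a Clarifai model URL into user_id, app_id, and model_name."""
    parts = urlparse(url).path.strip("/").split("/")
    if len(parts) != 4 or parts[2] != "models":
        raise ValueError(f"Unexpected Clarifai model URL: {url}")
    user_id, app_id, _, model_name = parts
    return dict(user_id=user_id, app_id=app_id, model_name=model_name)
```

For example, parsing the URL above yields `user_id="clarifai"`, `app_id="ml"`, and `model_name="llama2-7b-alternative-4k"`.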

Call complete function

python
llm_response = llm_model.complete(
    prompt="write a 10 line rhyming poem about science"
)
python
print(llm_response)

Call chat function

python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="user", content="write about climate change in 50 lines")
]
response = llm_model.chat(messages)
python
print(response)

Using Inference parameters

Alternatively, you can call models with custom inference parameters.

python
# Example inference parameters (note that temperature is passed as a string)
inference_params = dict(temperature=str(0.3), max_tokens=20)
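Since the dict above passes `temperature` as a string, a small factory that validates the values and applies that convention can catch mistakes before a network call is made. A sketch (the `make_inference_params` helper and its validation ranges are assumptions, not part of the Clarifai SDK):

```python
def make_inference_params(temperature: float = 0.7, max_tokens: int = 256) -> dict:
    """Build a Clarifai inference-parameter dict, passing temperature
    as a string to match the convention used in this notebook."""
    if temperature < 0.0:
        raise ValueError("temperature must be non-negative")
    if max_tokens <= 0:
        raise ValueError("max_tokens must be positive")
    return dict(temperature=str(temperature), max_tokens=max_tokens)
```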
python
llm_response = llm_model.complete(
    prompt="What is nuclear fission and fusion?",
    inference_params=inference_params,
)
python
messages = [ChatMessage(role="user", content="Explain about the big bang")]
response = llm_model.chat(messages, inference_params=inference_params)