CometAPI

docs/examples/llm/cometapi.ipynb

CometAPI provides access to various state-of-the-art LLM models including GPT series, Claude series, Gemini series, and more through a unified OpenAI-compatible interface. You can find out more on their homepage.

Visit https://api.cometapi.com/console/token to sign up and get an API key.

If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.

python
%pip install llama-index-llms-cometapi
python
%pip install llama-index
python
from llama_index.llms.cometapi import CometAPI

Call chat with a ChatMessage List

You need to either set the COMETAPI_API_KEY environment variable or pass api_key to the class constructor. The cells below store the key under COMETAPI_KEY and pass it to the constructor explicitly.

python
import os

os.environ["COMETAPI_KEY"] = "<your-cometapi-key>"

api_key = os.getenv("COMETAPI_KEY")
llm = CometAPI(
    api_key=api_key,
    max_tokens=256,
    context_window=4096,
    model="gpt-5-chat-latest",
)
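If you prefer to fail fast when the key is missing, the environment lookup above can be wrapped in a small helper. This is just a sketch; get_cometapi_key is a hypothetical name, not part of the package:

```python
import os


def get_cometapi_key() -> str:
    # Hypothetical helper: read the key from the environment and
    # raise a clear error if it is absent, instead of failing later
    # on the first API call.
    key = os.getenv("COMETAPI_KEY")
    if not key:
        raise RuntimeError("COMETAPI_KEY is not set; export it before creating CometAPI")
    return key


os.environ["COMETAPI_KEY"] = "<your-cometapi-key>"  # placeholder value for illustration
api_key = get_cometapi_key()
```

You can then pass api_key to the CometAPI constructor as in the cell above.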
python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="system", content="You are a helpful assistant"),
    ChatMessage(role="user", content="Say 'Hi' only!"),
]
resp = llm.chat(messages)
print(resp)
python
resp = llm.complete("Who is Kaiming He?")
python
print(resp)

Streaming

Using the stream_chat and stream_complete endpoints

python
message = ChatMessage(role="user", content="Tell me what ResNet is")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")
python
resp = llm.stream_complete("Tell me about Large Language Models")
python
for r in resp:
    print(r.delta, end="")
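The delta-accumulation pattern used in the loops above can be sketched offline. Here a simple generator stands in for the streaming response (no network call; FakeChunk and fake_stream are illustrative stand-ins, not CometAPI objects):

```python
from dataclasses import dataclass


@dataclass
class FakeChunk:
    delta: str  # each streamed chunk carries only the newly generated text


def fake_stream():
    # Stand-in for llm.stream_complete(...): yields incremental deltas.
    for piece in ["Large ", "Language ", "Models"]:
        yield FakeChunk(delta=piece)


# Accumulate deltas into the full response, as the print loops above do.
full_text = "".join(chunk.delta for chunk in fake_stream())
print(full_text)  # Large Language Models
```

The same join-over-deltas idea works for stream_chat, whose chunks also expose a delta attribute.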

Using Different Models

CometAPI supports various AI models including GPT, Claude, and Gemini series.

python
# Using Claude model
claude_llm = CometAPI(
    api_key=api_key, model="claude-3-7-sonnet-latest", max_tokens=200
)

resp = claude_llm.complete("Explain deep learning briefly")
print(resp)