
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/friendli.ipynb" target="_parent"></a>

Friendli

Basic Usage

If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.

```python
%pip install llama-index-llms-friendli
```
```python
!pip install llama-index
```
```python
%env FRIENDLI_TOKEN=...
```
```python
from llama_index.llms.friendli import Friendli

# To pass your Friendli token explicitly, uncomment the line below;
# otherwise the client reads it from the FRIENDLI_TOKEN environment variable.
# llm = Friendli(friendli_token="Your personal access token")

llm = Friendli()
```
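
Outside a notebook you can also set the variable from Python itself before constructing the client; a minimal sketch using the standard library:

```python
import os

from llama_index.llms.friendli import Friendli

# Set the token in the environment before constructing the client
# (replace the placeholder with your personal access token).
os.environ["FRIENDLI_TOKEN"] = "<your-personal-access-token>"

llm = Friendli()
```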

Call chat with a list of messages

```python
from llama_index.core.llms import ChatMessage, MessageRole

message = ChatMessage(role=MessageRole.USER, content="Tell me a joke.")
resp = llm.chat([message])

print(resp)
```
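
`chat` takes the full message history, so you can also include a system instruction and earlier turns; a minimal sketch (the message contents here are illustrative):

```python
from llama_index.core.llms import ChatMessage, MessageRole

# A multi-turn history: a system instruction followed by the user turn.
messages = [
    ChatMessage(role=MessageRole.SYSTEM, content="You are a concise assistant."),
    ChatMessage(role=MessageRole.USER, content="Tell me a joke."),
]
resp = llm.chat(messages)
print(resp)
```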

Streaming

```python
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")
```
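
Each streamed chunk carries an incremental `delta`; if you also want the complete reply once the stream finishes, you can accumulate the deltas yourself. A minimal sketch:

```python
# Stream the reply while collecting the full text.
parts = []
for r in llm.stream_chat([message]):
    print(r.delta, end="")
    parts.append(r.delta or "")
full_reply = "".join(parts)
```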

Async

```python
resp = await llm.achat([message])

print(resp)
```

Async Streaming

```python
resp = await llm.astream_chat([message])
async for r in resp:
    print(r.delta, end="")
```
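
Top-level `await` works in notebooks; in a plain Python script you would drive the same coroutines with `asyncio`, for example:

```python
import asyncio


async def main():
    # Works the same way for achat, astream_chat, acomplete, and astream_complete.
    resp = await llm.achat([message])
    print(resp)


asyncio.run(main())
```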

Call complete with a prompt

```python
prompt = "Draft a cover letter for a role in software engineering."
resp = llm.complete(prompt)

print(resp)
```

Streaming

```python
resp = llm.stream_complete(prompt)
for r in resp:
    print(r.delta, end="")
```

Async

```python
resp = await llm.acomplete(prompt)

print(resp)
```

Async Streaming

```python
resp = await llm.astream_complete(prompt)
async for r in resp:
    print(r.delta, end="")
```

Configure Model

```python
from llama_index.llms.friendli import Friendli

llm = Friendli(model="llama-2-70b-chat")
```
```python
resp = llm.chat([message])

print(resp)
```
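
Generation settings are usually passed to the constructor as well. The parameter names below (`max_tokens`, `temperature`) are an assumption to verify against the `llama-index-llms-friendli` reference:

```python
from llama_index.llms.friendli import Friendli

# max_tokens and temperature are assumed constructor parameters here;
# check the llama-index-llms-friendli package reference for the exact names.
llm = Friendli(
    model="llama-2-70b-chat",
    max_tokens=256,
    temperature=0.7,
)

resp = llm.complete("Say hello in one sentence.")
print(resp)
```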