Using LiteLLM with Petals

cookbook/LiteLLM_Petals.ipynb


```python
# install litellm (0.1.715 and upwards)
!pip install litellm

# install petals
!pip install git+https://github.com/bigscience-workshop/petals
```

petals-team/StableBeluga2

```python
from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    max_tokens=50,
)

print(response)
```

huggyllama/llama-65b

```python
response = completion(
    model="petals/huggyllama/llama-65b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    temperature=0.2,
    max_tokens=10,
)

print(response)
```
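LiteLLM normalizes every provider's output into the OpenAI response format, so the generated text is read the same way regardless of which Petals model served the request. As a minimal sketch, the dict below mocks the response shape (its contents are illustrative assumptions, not real Petals output) to show where the reply text lives:

```python
# Mock of an OpenAI-format response as returned by LiteLLM.
# The field values here are assumptions for illustration only.
mock_response = {
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "I'm doing well, thank you!",
                "role": "assistant",
            },
        }
    ],
    "model": "petals/petals-team/StableBeluga2",
    "usage": {"prompt_tokens": 9, "completion_tokens": 8, "total_tokens": 17},
}

def extract_text(response: dict) -> str:
    """Pull the assistant's reply out of an OpenAI-format response."""
    return response["choices"][0]["message"]["content"]

print(extract_text(mock_response))
```

The same `choices[0].message.content` path applies to the `response` objects returned by the `completion()` calls above.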