cookbook/liteLLM_VertextAI_Example.ipynb
!pip install litellm==0.1.388
Vertex AI requires the following:
- `vertex_project` - your Google Cloud project ID
- `vertex_location` - your Vertex AI region

Both can be found on https://console.cloud.google.com/

Vertex AI uses Application Default Credentials (ADC); see https://cloud.google.com/docs/authentication/external/set-up-adc for more information on setting this up.

NOTE: Vertex AI requires `application_default_credentials.json`; this can be set by running `gcloud auth application-default login` in your terminal
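As a quick sanity check before calling Vertex AI, you can verify that ADC credentials exist on disk. This is a minimal sketch: the path below is the conventional default location on Linux/macOS, and `GOOGLE_APPLICATION_CREDENTIALS` overrides it if set; your system may differ.

```python
import os

def adc_credentials_path() -> str:
    # GOOGLE_APPLICATION_CREDENTIALS overrides the conventional default
    # path written by `gcloud auth application-default login`.
    return os.environ.get(
        "GOOGLE_APPLICATION_CREDENTIALS",
        os.path.expanduser("~/.config/gcloud/application_default_credentials.json"),
    )

def adc_available() -> bool:
    # True if a credentials file exists at the resolved path.
    return os.path.exists(adc_credentials_path())

print("ADC credentials found:", adc_available())
```

If this prints `False`, run `gcloud auth application-default login` and try again.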
# set your Vertex AI configs
import litellm
from litellm import completion
litellm.vertex_project = "hardy-device-386718"
litellm.vertex_location = "us-central1"
user_message = "what is liteLLM "
messages = [{ "content": user_message,"role": "user"}]
# chat-bison or chat-bison@001 supported by Vertex AI (As of Aug 2023)
response = completion(model="chat-bison", messages=messages)
print(response)
print(litellm.vertex_text_models)
user_message = "what is liteLLM "
messages = [{ "content": user_message,"role": "user"}]
# text-bison or text-bison@001 supported by Vertex AI (As of Aug 2023)
response = completion(model="text-bison@001", messages=messages)
print(response)
response = completion(model="text-bison", messages=messages)
print(response)
response = completion(model="text-bison@001", messages=messages, temperature=0.4, top_k=10, top_p=0.2)
print(response['choices'][0]['message']['content'])
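To build intuition for what `temperature`, `top_k`, and `top_p` do, here is a toy, pure-Python illustration of top-k followed by nucleus (top-p) filtering over a next-token distribution. This is a sketch of the general sampling technique, not Vertex AI's actual decoder.

```python
def filter_top_k_top_p(probs: dict, top_k: int, top_p: float) -> dict:
    # Keep only the k most likely tokens...
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    # ...then keep the smallest prefix whose cumulative mass reaches top_p.
    kept, total = [], 0.0
    for tok, p in ranked:
        kept.append((tok, p))
        total += p
        if total >= top_p:
            break
    # Renormalize the surviving tokens so they sum to 1.
    z = sum(p for _, p in kept)
    return {tok: p / z for tok, p in kept}

probs = {"the": 0.5, "a": 0.3, "cat": 0.15, "zzz": 0.05}
# A small top_p is very restrictive: only the single most likely token survives.
print(filter_top_k_top_p(probs, top_k=10, top_p=0.2))  # → {'the': 1.0}
```

Lower `top_p`/`top_k` (and lower `temperature`) make output more deterministic; higher values allow more diverse completions.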