Google (Vertex AI)

Pre-requisites

  • Run pip install google-cloud-aiplatform
  • Authentication:
    • Run gcloud auth application-default login (see the Google Cloud Docs)
    • Alternatively, you can supply an application_default_credentials.json file

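Before launching, you can verify that Application Default Credentials are actually in place. The sketch below checks the standard gcloud ADC path and the `GOOGLE_APPLICATION_CREDENTIALS` override; the helper names (`adc_path`, `adc_available`) are introduced here for illustration and are not part of Open Interpreter:

```python
import os

def adc_path() -> str:
    """Return the default location gcloud writes Application Default
    Credentials to on Linux/macOS."""
    return os.path.join(
        os.path.expanduser("~"),
        ".config", "gcloud", "application_default_credentials.json",
    )

def adc_available() -> bool:
    """True if GOOGLE_APPLICATION_CREDENTIALS points at an existing file,
    or the default ADC file exists."""
    override = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if override:
        return os.path.isfile(override)
    return os.path.isfile(adc_path())
```

If this returns False, run the gcloud login command above before starting Open Interpreter.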
To use Open Interpreter with Google's Vertex AI API, set the model flag:

<CodeGroup>

```bash Terminal
interpreter --model gemini-pro
interpreter --model gemini-pro-vision
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "gemini-pro"
# or, for image input: interpreter.llm.model = "gemini-pro-vision"
interpreter.chat()
```

</CodeGroup>

Required Environment Variables

Set the following environment variables to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `VERTEXAI_PROJECT` | The Google Cloud project ID. | Google Cloud Console |
| `VERTEXAI_LOCATION` | The location of your Vertex AI resources. | Google Cloud Console |
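If you prefer to configure these from Python rather than your shell, they can be exported with `os.environ` before starting a chat. This is a minimal sketch; the project ID and region below are placeholders, not real values:

```python
import os

# Placeholder values -- substitute your own project ID and region.
os.environ["VERTEXAI_PROJECT"] = "my-gcp-project"
os.environ["VERTEXAI_LOCATION"] = "us-central1"
```

Environment variables set this way apply only to the current process, so set them before the first model call.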

Supported Models

  • gemini-pro
  • gemini-pro-vision
  • chat-bison-32k
  • chat-bison
  • chat-bison@001
  • codechat-bison
  • codechat-bison-32k
  • codechat-bison@001
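When wiring a model name into a script, a small guard can catch typos against the list above before any API call is made. This is a sketch; `SUPPORTED_MODELS` and `validate_model` are hypothetical helpers, not part of Open Interpreter:

```python
# Model names taken from the supported list above.
SUPPORTED_MODELS = {
    "gemini-pro",
    "gemini-pro-vision",
    "chat-bison-32k",
    "chat-bison",
    "chat-bison@001",
    "codechat-bison",
    "codechat-bison-32k",
    "codechat-bison@001",
}

def validate_model(name: str) -> str:
    """Raise early if the model name is not in the supported list."""
    if name not in SUPPORTED_MODELS:
        raise ValueError(f"Unsupported Vertex AI model: {name!r}")
    return name
```

For example, `validate_model("gemini-pro")` returns the name unchanged, while a misspelled name raises a `ValueError` immediately instead of failing later at request time.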