Perplexity


To use Open Interpreter with the Perplexity API, set the model flag:

<CodeGroup>

```bash
interpreter --model perplexity/<perplexity-model>
```

```python
from interpreter import interpreter

interpreter.llm.model = "perplexity/<perplexity-model>"
interpreter.chat()
```

</CodeGroup>

Supported Models

We support the following completion models from the Perplexity API:

  • pplx-7b-chat
  • pplx-70b-chat
  • pplx-7b-online
  • pplx-70b-online
  • codellama-34b-instruct
  • llama-2-13b-chat
  • llama-2-70b-chat
  • mistral-7b-instruct
  • openhermes-2-mistral-7b
  • openhermes-2.5-mistral-7b
  • pplx-7b-chat-alpha
  • pplx-70b-chat-alpha
<CodeGroup>

```bash
interpreter --model perplexity/pplx-7b-chat
interpreter --model perplexity/pplx-70b-chat
interpreter --model perplexity/pplx-7b-online
interpreter --model perplexity/pplx-70b-online
interpreter --model perplexity/codellama-34b-instruct
interpreter --model perplexity/llama-2-13b-chat
interpreter --model perplexity/llama-2-70b-chat
interpreter --model perplexity/mistral-7b-instruct
interpreter --model perplexity/openhermes-2-mistral-7b
interpreter --model perplexity/openhermes-2.5-mistral-7b
interpreter --model perplexity/pplx-7b-chat-alpha
interpreter --model perplexity/pplx-70b-chat-alpha
```

```python
interpreter.llm.model = "perplexity/pplx-7b-chat"
interpreter.llm.model = "perplexity/pplx-70b-chat"
interpreter.llm.model = "perplexity/pplx-7b-online"
interpreter.llm.model = "perplexity/pplx-70b-online"
interpreter.llm.model = "perplexity/codellama-34b-instruct"
interpreter.llm.model = "perplexity/llama-2-13b-chat"
interpreter.llm.model = "perplexity/llama-2-70b-chat"
interpreter.llm.model = "perplexity/mistral-7b-instruct"
interpreter.llm.model = "perplexity/openhermes-2-mistral-7b"
interpreter.llm.model = "perplexity/openhermes-2.5-mistral-7b"
interpreter.llm.model = "perplexity/pplx-7b-chat-alpha"
interpreter.llm.model = "perplexity/pplx-70b-chat-alpha"
```

</CodeGroup>

Required Environment Variables

Set the following environment variables (click here to learn how) to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `PERPLEXITYAI_API_KEY` | The Perplexity API key from pplx-api | Perplexity API Settings |
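For example, on macOS or Linux you could export the key in your shell before launching Open Interpreter. The key value below is a placeholder — substitute your own from the Perplexity API settings page:

```shell
# Placeholder value — replace with your actual Perplexity API key
export PERPLEXITYAI_API_KEY="pplx-your-key-here"
```

Once the variable is set in the same session, `interpreter --model perplexity/<perplexity-model>` should pick it up automatically.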