
vLLM

To use Open Interpreter with vLLM, you will need to:

  1. pip install vllm
  2. Set the api_base flag:
<CodeGroup>

```bash
interpreter --api_base <https://your-hosted-vllm-server>
```

```python
from interpreter import interpreter

interpreter.llm.api_base = "<https://your-hosted-vllm-server>"
interpreter.chat()
```

</CodeGroup>
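
The api_base should point at a running vLLM server that exposes the OpenAI-compatible API. If you are hosting the server yourself, a minimal launch sketch looks like this; the model name and port are placeholders, so substitute your own:

```bash
# Starts vLLM's OpenAI-compatible server; point api_base at this address.
# Model name and port are placeholders -- substitute your own.
python -m vllm.entrypoints.openai.api_server \
  --model mistralai/Mistral-7B-Instruct-v0.2 \
  --port 8000
```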
  3. Set the model flag:
<CodeGroup>

```bash
interpreter --model vllm/<vllm-model>
```

```python
from interpreter import interpreter

interpreter.llm.model = "vllm/<vllm-model>"
interpreter.chat()
```

</CodeGroup>
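
Putting both flags together, a single terminal invocation might look like the following sketch; the server address and model name are placeholders for your own deployment:

```bash
# Combines the api_base and model flags from the steps above.
# Address and model name are placeholders -- replace them with your own.
interpreter --api_base http://localhost:8000 --model vllm/mistralai/Mistral-7B-Instruct-v0.2
```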

Supported Models

All models that vLLM can serve should be supported.
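
If you are unsure which model names your server exposes, one way to check is to query the OpenAI-compatible /v1/models endpoint that vLLM serves; the address below is a placeholder:

```bash
# Lists the model IDs the vLLM server is serving; prefix one with "vllm/"
# when setting the model flag. The address is a placeholder.
curl http://localhost:8000/v1/models
```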