
# Guide: How to use OpenAI-compatible embedding models (e.g. Ollama) with TensorZero

examples/guides/embeddings/providers/openai-compatible-ollama/README.md


## Running the Example

1. Launch the Ollama server:

   ```bash
   ollama serve
   ```

2. Download an embedding model:

   ```bash
   ollama pull nomic-embed-text
   ```

3. Launch the TensorZero Gateway:

   ```bash
   docker compose up
   ```

4. Run the example (in a separate terminal):
<details open> <summary><b>Python (OpenAI SDK)</b></summary>

a. Install the Python dependencies. We recommend using uv:

```bash
uv sync
```

b. Run the example:

```bash
uv run openai_sdk.py
```

</details>
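For reference, here is a minimal sketch of what a script like `openai_sdk.py` might contain. It is not the actual example source; it assumes the gateway serves its OpenAI-compatible API at `http://localhost:3000/openai/v1` (a common default in these examples) and that embedding models are addressed with a `tensorzero::embedding_model_name::<name>` model id. Verify both details against the example source and your `docker-compose.yml`.

```python
# Hypothetical sketch (not the actual example script): send an embedding
# request for the nomic-embed-text model through the TensorZero Gateway
# using the OpenAI Python SDK.


def embedding_request_params(model_name: str, text: str) -> dict:
    """Build the payload for an OpenAI-compatible embeddings call.

    The "tensorzero::embedding_model_name::" prefix is an assumption about
    how the gateway routes embedding model ids; check the TensorZero docs.
    """
    return {
        "model": f"tensorzero::embedding_model_name::{model_name}",
        "input": text,
    }


if __name__ == "__main__":
    # Imported here so the helper above stays usable without the SDK installed.
    from openai import OpenAI

    # The gateway address is an assumption; adjust to your compose setup.
    client = OpenAI(base_url="http://localhost:3000/openai/v1", api_key="not-used")
    response = client.embeddings.create(
        **embedding_request_params("nomic-embed-text", "Hello, TensorZero!")
    )
    print(f"Embedding has {len(response.data[0].embedding)} dimensions")
```

Pointing the stock OpenAI client at the gateway's `base_url` is the whole trick: the application code stays unchanged while TensorZero handles routing to Ollama behind the scenes.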