# @chroma-core/ollama
This package provides an Ollama embedding provider for Chroma, allowing you to generate embeddings with locally hosted Ollama models.
## Installation

```bash
npm install @chroma-core/ollama
```
## Usage

```typescript
import { ChromaClient } from 'chromadb';
import { OllamaEmbeddingFunction } from '@chroma-core/ollama';

// Initialize the embedder
const embedder = new OllamaEmbeddingFunction({
  url: 'http://localhost:11434', // Default Ollama server URL
  model: 'chroma/all-minilm-l6-v2-f32', // Default model
});

// Create a new ChromaClient
const client = new ChromaClient({
  path: 'http://localhost:8000',
});

// Create a collection with the embedder
const collection = await client.createCollection({
  name: 'my-collection',
  embeddingFunction: embedder,
});

// Add documents
await collection.add({
  ids: ["1", "2", "3"],
  documents: ["Document 1", "Document 2", "Document 3"],
});

// Query documents
const results = await collection.query({
  queryTexts: ["Sample query"],
  nResults: 2,
});
```
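Query results are ranked by how close each stored embedding is to the query embedding. As an illustration of the idea (this helper is not part of `@chroma-core/ollama` or `chromadb`; Chroma performs this comparison server-side), cosine similarity between two embedding vectors can be sketched as:

```typescript
// Cosine similarity between two embedding vectors.
// Illustrative helper only — not an API of this package.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error('Vector dimensions must match');
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // identical vectors → 1
console.log(cosineSimilarity([1, 0], [0, 1])); // orthogonal vectors → 0
```

Identical vectors score 1, unrelated (orthogonal) vectors score 0, which is why the nearest results returned by `query` are the most semantically similar documents.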
## Ollama Setup

Start the Ollama server and pull the default embedding model:

```bash
ollama serve
ollama pull chroma/all-minilm-l6-v2-f32
```

## Configuration

- `url`: Ollama server URL (default: `http://localhost:11434`)
- `model`: embedding model to use (default: `chroma/all-minilm-l6-v2-f32`)

## Supported Models

Popular embedding models available through Ollama:

- `chroma/all-minilm-l6-v2-f32` (default, 384 dimensions)
- `nomic-embed-text` (768 dimensions)
- `mxbai-embed-large` (1024 dimensions)
- `snowflake-arctic-embed`

Pull models using:

```bash
ollama pull <model-name>
```
This package works in both Node.js and browser environments, automatically detecting the runtime and using the appropriate Ollama client.
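The package's exact detection logic may differ, but the general pattern for telling Node.js apart from a browser is to check for browser-only globals. A minimal sketch (the function name `detectRuntime` is hypothetical, not an export of this package):

```typescript
// Hypothetical sketch of browser-vs-Node runtime detection.
// A browser exposes a global `window` object; Node.js does not.
function detectRuntime(): 'browser' | 'node' {
  return typeof (globalThis as any).window !== 'undefined' ? 'browser' : 'node';
}

console.log(detectRuntime()); // 'node' when run under Node.js
```

A library can use a check like this at import time to load the HTTP client appropriate for the environment.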