LlamaIndex Embeddings Integration: Fireworks

This integration provides support for Fireworks AI embedding models with LlamaIndex.

Fireworks AI offers fast and efficient inference for embedding models. Sign up at fireworks.ai to get an API key.

Installation

```bash
pip install llama-index-embeddings-fireworks
```

Usage

```python
from llama_index.embeddings.fireworks import FireworksEmbedding

# Initialize the embedding model
embed_model = FireworksEmbedding(
    api_key="your-api-key",  # or set FIREWORKS_API_KEY env var
    model_name="nomic-ai/nomic-embed-text-v1.5",
)

# Get embedding for a single text
embedding = embed_model.get_text_embedding("Hello, world!")

# Get embeddings for multiple texts
embeddings = embed_model.get_text_embedding_batch(
    ["Hello, world!", "How are you?"]
)

# Get query embedding
query_embedding = embed_model.get_query_embedding("What is machine learning?")
```
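The embeddings returned above are plain lists of floats. In a typical retrieval step you rank documents by cosine similarity between the query embedding and each document embedding. A minimal, dependency-free sketch of that comparison (illustrative only, not part of this integration; LlamaIndex ships its own similarity utilities):

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy vectors stand in for real query/document embeddings here
score = cosine_similarity([1.0, 0.0], [1.0, 1.0])
```

With real output, you would compute `cosine_similarity(query_embedding, doc_embedding)` for each document and sort by score descending.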

Configuration

| Parameter | Description | Default |
| --- | --- | --- |
| `api_key` | Fireworks API key (or set `FIREWORKS_API_KEY` env var) | `None` |
| `model_name` | Embedding model to use | `nomic-ai/nomic-embed-text-v1.5` |
| `dimensions` | Output embedding dimensions (if supported by model) | `None` |
| `embed_batch_size` | Batch size for embedding requests | `10` |
| `timeout` | Request timeout in seconds | `60.0` |
| `max_retries` | Maximum number of retries | `10` |
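The `embed_batch_size` parameter controls how many texts are sent per request: a larger list is split into consecutive chunks of at most that size. Conceptually, the chunking works like this (a sketch of the idea, not the library's internal implementation):

```python
def batched(texts, batch_size=10):
    """Split texts into consecutive chunks of at most batch_size items."""
    return [texts[i:i + batch_size] for i in range(0, len(texts), batch_size)]


# 25 texts with the default batch size of 10 -> 3 requests (10, 10, 5)
batches = batched([f"doc {i}" for i in range(25)], batch_size=10)
```

Raising `embed_batch_size` reduces the number of HTTP round trips at the cost of larger request payloads.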

Environment Variables

  • FIREWORKS_API_KEY: Your Fireworks API key
  • FIREWORKS_API_BASE: Custom API base URL (optional)
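Both variables can be exported before starting your application so that no key appears in code. A sketch (the key value is a placeholder, and the base URL shown is an assumption about the default Fireworks endpoint):

```shell
# Credentials via environment instead of the api_key argument
export FIREWORKS_API_KEY="fw-your-key"

# Optional: custom endpoint, e.g. when routing through a proxy
export FIREWORKS_API_BASE="https://api.fireworks.ai/inference/v1"
```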