# LlamaIndex Embeddings Integration: Aleph Alpha
This README provides an overview of integrating Aleph Alpha's semantic embeddings with LlamaIndex. Aleph Alpha's API generates semantic embeddings from text, which can be used for downstream tasks such as semantic similarity search and as input features for models like classifiers.
Choose between symmetric, document, and query embeddings based on your use case.

## Installation

```bash
pip install llama-index-embeddings-alephalpha
```
## Usage

```python
from llama_index.embeddings.alephalpha import AlephAlphaEmbedding
```
**Request Parameters:**

- `model`: Model name (e.g., `luminous-base`). The latest model version is used.
- `representation`: Type of embedding (`symmetric`, `document`, or `query`).
- `prompt`: Text or multimodal prompt to embed. Supports text strings or an array of multimodal items.
- `compress_to_size`: Optional compression of the embedding to 128 dimensions.
- `normalize`: Set to `true` for normalized embeddings.

**Advanced Parameters:**

- `hosting`: Datacenter processing option (`aleph-alpha` for maximal data privacy).
- `contextual_control_threshold`, `control_log_additive`: Attention-control parameters for advanced use cases.

**Response:**

- `model_version`: Model name and version used for inference.
- `embedding`: List of floats representing the generated embedding.
- `num_tokens_prompt_total`: Total number of tokens in the input prompt.

See the example notebook for a detailed walkthrough of using Aleph Alpha embeddings with LlamaIndex.
For more detailed API documentation and available models, visit Aleph Alpha's API Docs.