Vectara Query Tool

This tool connects to a Vectara corpus and allows agents to make semantic search or retrieval augmented generation (RAG) queries.

Usage

Please note that this usage example requires version >=0.3.0 of this package.

A more extensive usage example for this tool is documented in a Jupyter notebook here.

To use this tool, you'll need a Vectara account (if you don't have an account, you can create one here) and the following information in your environment:

  • VECTARA_CORPUS_KEY: The key of the Vectara corpus that you want your tool to search. If you need help creating a corpus with your data, follow this Quick Start guide.
  • VECTARA_API_KEY: An API key that can perform queries on this corpus.
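Both variables can also be set from Python before constructing the tool spec. The values below are placeholders, not real credentials — substitute your own corpus key and API key:

```python
import os

# Placeholder credentials — replace with your own Vectara corpus key and API key.
os.environ["VECTARA_CORPUS_KEY"] = "your-corpus-key"
os.environ["VECTARA_API_KEY"] = "your-api-key"
```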

Here's an example usage of the VectaraQueryToolSpec.

```python
from llama_index.tools.vectara_query import VectaraQueryToolSpec
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI

# Connecting to a Vectara corpus about Electric Vehicles
tool_spec = VectaraQueryToolSpec()

agent = FunctionAgent(
    tools=tool_spec.to_tool_list(),
    llm=OpenAI(model="gpt-4.1"),
)

print(await agent.run("What are the different types of electric vehicles?"))
```

The available tools are:

  • semantic_search: A tool that accepts a query and uses semantic search to obtain the top search results.
  • rag_query: A tool that accepts a query and uses RAG to obtain a generative response grounded in the search results.