
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-vectara-query/examples/vectara_query.ipynb" target="_parent">Open in Colab</a>

Vectara Query Tool

Please note that this example notebook applies only to Vectara Query tool versions >=0.3.0.

To get started with Vectara, sign up (if you haven't already) and follow our quickstart guide to create a corpus and an API key.

Once you have done this, add the following variables to your environment:

VECTARA_CORPUS_KEY: The corpus key for the Vectara corpus that you want your tool to search.

VECTARA_API_KEY: An API key that can perform queries on this corpus.
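If you prefer, you can set these variables from within Python instead of your shell. A minimal sketch, using placeholder values that you should replace with your own credentials:

```python
import os

# Placeholder values; substitute your own Vectara corpus key and API key
os.environ["VECTARA_CORPUS_KEY"] = "<YOUR_CORPUS_KEY>"
os.environ["VECTARA_API_KEY"] = "<YOUR_API_KEY>"
```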

You are now ready to use the Vectara query tool.

To initialize the tool, provide your Vectara information and any query parameters that you want to adjust, such as the reranker, summarizer prompt, etc. To see the entire list of parameters, see the VectaraQueryToolSpec class definition.

```python
# Install and import our tool spec
# %pip install -U llama-index-tools-vectara-query

from llama_index.tools.vectara_query.base import VectaraQueryToolSpec

tool_spec = VectaraQueryToolSpec()
```

After initializing the tool spec, we can provide it to our agent. For this notebook, we will use a FunctionAgent with an OpenAI LLM, but our tool can be used with any type of agent. You will need your own OpenAI API key to run this notebook.

```python
# Set up an agent that can call the Vectara query tools
import os

from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI

# Assumes OPENAI_API_KEY is set in your environment, e.g.:
# os.environ["OPENAI_API_KEY"] = "<YOUR_OPENAI_API_KEY>"

agent = FunctionAgent(
    tools=tool_spec.to_tool_list(),
    llm=OpenAI(model="gpt-4.1"),
)

# Top-level await works inside a notebook
print(await agent.run("What are the different types of electric vehicles?"))
```
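Top-level `await` is available in notebooks because one already runs an event loop. If you move this code into a plain Python script, wrap the call in an async function and drive it with `asyncio.run`. A minimal sketch of the pattern, where the `main` body stands in for the agent call:

```python
import asyncio


async def main() -> str:
    # In a real script, this would be:
    # response = await agent.run("What are the different types of electric vehicles?")
    response = "response"
    return response


# asyncio.run creates an event loop, runs the coroutine, and returns its result
result = asyncio.run(main())
print(result)
```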