llama-index-integrations/tools/llama-index-tools-desearch/examples/desearch.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-desearch/examples/desearch.ipynb" target="_parent"></a>
This tutorial walks through using the LLM tools provided by the Desearch API to allow LLMs to use semantic queries to search for and retrieve rich web content from the internet.
To get started, you will need a Desearch API key.
We will import the tools and pass them our keys here:
# Install the relevant LlamaIndex packages, including core and the Desearch tool
!pip install llama-index llama-index-core llama-index-tools-desearch
from llama_index.tools.desearch import DesearchToolSpec
import os
# Instantiate
desearch_tool = DesearchToolSpec(
    api_key=os.environ["DESEARCH_API_KEY"],
)
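The tool spec reads the key from the `DESEARCH_API_KEY` environment variable. If it is not already set in your session, a minimal sketch for setting it (the placeholder value is, of course, hypothetical):

```python
import os

# Set the key for this session only if it isn't already present.
# Replace the placeholder with your real Desearch API key.
os.environ.setdefault("DESEARCH_API_KEY", "<your-desearch-api-key>")
assert "DESEARCH_API_KEY" in os.environ
```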
# Get the list of tools to see what Desearch offers
desearch_tool_list = desearch_tool.to_tool_list()
for tool in desearch_tool_list:
    print(tool.metadata.name)
ai_search_tool
twitter_search_tool
web_search_tool
We've installed the packages, set up the API key, and initialized our tool, checking the methods it has available. Let's test out the tool before setting up our agent.
All of the Desearch search tools make use of the AutoPrompt option, where Desearch passes the query through an LLM to refine it in line with Desearch query best practices.
The Desearch API allows you to perform AI-powered web searches, gathering relevant information from multiple sources, including web pages, research papers, and social media discussions.
desearch_tool.ai_search_tool(
    prompt="Bittensor",
    tool=["web"],
    model="NOVA",
    date_filter="PAST_24_HOURS",
)
The X Search API enables users to retrieve relevant links and tweets based on specified search queries without utilizing AI-driven models. It analyzes links from X posts that align with the provided search criteria.
desearch_tool.twitter_search_tool(
    query="bittensor",
    sort="Top",
    count=20,
)
This API replicates a typical search engine experience, letting users search the web for any information they need.
desearch_tool.web_search_tool(
    query="bittensor",
    num=10,
    start=0,
)
We are now ready to create an agent that can use Desearch's services to their full potential. We will pass it the Desearch tool list and test it out below:
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI
# Pass the Desearch tools to the agent
agent = FunctionAgent(
    tools=desearch_tool.to_tool_list(),
    llm=OpenAI(model="gpt-4.1"),
)
print(
    await agent.run(
        "Can you summarize everything published in the last month regarding news on superconductors"
    )
)
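`agent.run` is a coroutine, so it can be `await`ed directly in a notebook. In a plain Python script you need an event loop instead; a minimal sketch of that pattern, with the agent call replaced by a stand-in coroutine so it runs without an API key:

```python
import asyncio


# Stand-in for agent.run(...) so this sketch is self-contained;
# swap in the real agent call in practice.
async def run_query(prompt: str) -> str:
    return f"(agent response for: {prompt})"


# asyncio.run drives the coroutine to completion outside a notebook.
result = asyncio.run(run_query("superconductor news from the last month"))
print(result)
```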