llama-index-integrations/tools/llama-index-tools-tavily-research/examples/tavily.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-tavily-research/examples/tavily.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
This tutorial walks through using the LLM tools provided by the Tavily API to allow LLMs to easily search and retrieve relevant content from the Internet.
To get started, you will need an OpenAI API key and a Tavily API key.
We will import the relevant agents and tools and pass them our keys here:
%pip install llama-index-tools-tavily-research llama-index
# set your openai key, if using openai
import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["TAVILY_API_KEY"] = "..."
# Set up Tavily tool
from llama_index.tools.tavily_research.base import TavilyToolSpec
tavily_tool = TavilyToolSpec(
    api_key=os.environ["TAVILY_API_KEY"],
)
tavily_tool_list = tavily_tool.to_tool_list()
for tool in tavily_tool_list:
    print(tool.metadata.name)
We've set our API keys and initialized the Tavily tool. Let's test out the tool before setting up our agent.
tavily_tool.search("What happened in the latest Burning Man festival?", max_results=3)
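The search call returns a list of llama_index `Document` objects, each with a text snippet and the source URL in its metadata. A minimal, hedged sketch of folding such results into a single context string for prompting — the dictionaries below are hypothetical stand-ins for the real documents:

```python
# Sketch: join search snippets with their source URLs into one
# context block. The result shape (text plus a url) mirrors what
# TavilyToolSpec.search returns; the sample data is made up.


def build_context(docs: list[dict]) -> str:
    """Join search snippets with their source URLs."""
    parts = []
    for doc in docs:
        parts.append(f"Source: {doc['url']}\n{doc['text']}")
    return "\n\n".join(parts)


sample = [
    {"url": "https://example.com/a", "text": "Snippet one."},
    {"url": "https://example.com/b", "text": "Snippet two."},
]
print(build_context(sample))
```

In a real pipeline you would pass the joined string into a prompt rather than printing it.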
The extract function allows you to extract raw content from specific URLs. This is useful when you have specific URLs you want to extract content from, rather than searching for content.
# Extract content from specific URLs
urls_to_extract = [
    "https://en.wikipedia.org/wiki/Burning_Man",
    "https://burningman.org/about/",
]
extracted_docs = tavily_tool.extract(
    urls=urls_to_extract,
    include_images=False,
    include_favicon=True,
    extract_depth="basic",
    format="markdown",
)
print(f"Extracted {len(extracted_docs)} documents:")
for i, doc in enumerate(extracted_docs):
    print(f"\nDocument {i+1}:")
    print(f"URL: {doc.extra_info.get('url', 'N/A')}")
    print(f"Content preview: {doc.text[:300]}...")
    print(f"Has favicon: {doc.extra_info.get('favicon') is not None}")
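Slicing `doc.text[:300]` always appends `"..."` even when the document is shorter than the preview width. A small hedged sketch of a safer preview helper — the `Doc` dataclass here is a stand-in for llama_index's `Document`, not the real class:

```python
# Sketch: preview a document without a trailing "..." on short
# texts and without KeyErrors on missing metadata. Doc is a
# hypothetical stand-in for llama_index's Document.
from dataclasses import dataclass, field


@dataclass
class Doc:
    text: str
    extra_info: dict = field(default_factory=dict)


def preview(doc: Doc, width: int = 300) -> str:
    url = doc.extra_info.get("url", "N/A")
    snippet = doc.text[:width]
    suffix = "..." if len(doc.text) > width else ""
    return f"{url}: {snippet}{suffix}"


print(preview(Doc("Short body", {"url": "https://example.com"})))
# → https://example.com: Short body
```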
We can create an agent with access to the Tavily tools and start testing it out:
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI
agent = FunctionAgent(
    tools=tavily_tool_list,
    llm=OpenAI(model="gpt-4o"),
)
print(
    await agent.run(
        "Write a deep analysis in markdown syntax about the latest burning man floods"
    )
)