
Function Calling Anthropic Agent

docs/examples/agent/anthropic_agent.ipynb




This notebook shows you how to use our Anthropic agent, powered by function calling capabilities.

NOTE: Only claude-3* models support function calling using Anthropic's API.

Initial Setup

Let's start by importing some simple building blocks.

The main things we need are:

  1. the Anthropic API (using our own `llama_index` LLM class)
  2. a place to keep conversation history
  3. a definition for tools that our agent can use

If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.

python
%pip install llama-index
%pip install llama-index-llms-anthropic
%pip install llama-index-embeddings-openai

Let's define some very simple calculator tools for our agent.

python
def multiply(a: int, b: int) -> int:
    """Multiple two integers and returns the result integer"""
    return a * b


def add(a: int, b: int) -> int:
    """Add two integers and returns the result integer"""
    return a + b

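Before wiring these into an agent, it can help to sanity-check them directly — when answering a compound question, the agent chains the same calls you would make by hand:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b


def add(a: int, b: int) -> int:
    """Add two integers and return the result integer"""
    return a + b


# The agent decomposes "(121 + 2) * 5" into the same two tool calls:
print(multiply(add(121, 2), 5))  # 615
```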
Make sure your ANTHROPIC_API_KEY environment variable is set; otherwise, explicitly pass the api_key parameter.

python
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(model="claude-3-opus-20240229", api_key="sk-...")

Initialize Anthropic Agent

Here we initialize a simple Anthropic agent with calculator functions.

python
from llama_index.core.agent.workflow import FunctionAgent

agent = FunctionAgent(
    tools=[multiply, add],
    llm=llm,
)
python
from llama_index.core.agent.workflow import ToolCallResult


async def run_agent_verbose(query: str):
    handler = agent.run(query)
    async for event in handler.stream_events():
        if isinstance(event, ToolCallResult):
            print(
                f"Called tool {event.tool_name} with args {event.tool_kwargs}\nGot result: {event.tool_output}"
            )

    return await handler
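The pattern above — iterate over streamed events, then await the handler for the final answer — can be illustrated with plain asyncio. The `ToyHandler` below is a hypothetical stand-in for the real workflow handler, not the llama_index API:

```python
import asyncio


class ToyHandler:
    """Hypothetical stand-in for the workflow handler: it yields
    intermediate events and then resolves to a final result."""

    def __init__(self, events, result):
        self._events = events
        self._result = result

    async def stream_events(self):
        # Yield each intermediate event, like tool calls and tool results.
        for ev in self._events:
            yield ev

    def __await__(self):
        # Awaiting the handler itself produces the final response.
        async def _final():
            return self._result

        return _final().__await__()


async def main():
    handler = ToyHandler(["tool_call", "tool_result"], "615")
    async for event in handler.stream_events():
        print("event:", event)
    return await handler


print(asyncio.run(main()))
```

This mirrors why `run_agent_verbose` can both print tool activity as it happens and still return the final response at the end.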

Chat

python
response = await run_agent_verbose("What is (121 + 2) * 5?")
print(str(response))
python
# inspect sources
print(response.tool_calls)

Managing Context/Memory

By default, .run() is stateless. If you want to maintain state, you can pass in a context object.

python
from llama_index.core.workflow import Context

ctx = Context(agent)

response = await agent.run("My name is John Doe", ctx=ctx)
response = await agent.run("What is my name?", ctx=ctx)

print(str(response))

Anthropic Agent over RAG Pipeline

Build an Anthropic agent over a simple 10-K document. We use OpenAI embeddings and claude-3-haiku-20240307 to construct the RAG pipeline, and pass it to the Anthropic Opus agent as a tool.

python
!mkdir -p 'data/10k/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/10k/uber_2021.pdf' -O 'data/10k/uber_2021.pdf'
python
from llama_index.core.tools import QueryEngineTool
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.anthropic import Anthropic

embed_model = OpenAIEmbedding(
    model_name="text-embedding-3-large", api_key="sk-proj-..."
)
query_llm = Anthropic(model="claude-3-haiku-20240307", api_key="sk-...")

# load data
uber_docs = SimpleDirectoryReader(
    input_files=["./data/10k/uber_2021.pdf"]
).load_data()

# build index
uber_index = VectorStoreIndex.from_documents(
    uber_docs, embed_model=embed_model
)
uber_engine = uber_index.as_query_engine(similarity_top_k=3, llm=query_llm)
query_engine_tool = QueryEngineTool.from_defaults(
    query_engine=uber_engine,
    name="uber_10k",
    description=(
        "Provides information about Uber financials for year 2021. "
        "Use a detailed plain text question as input to the tool."
    ),
)
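Under the hood, `similarity_top_k=3` means the retriever embeds the question and keeps the 3 chunks whose vectors are closest to it (typically by cosine similarity). A minimal sketch with made-up chunk embeddings — the strings and vectors here are purely illustrative:

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


# Toy "chunk" embeddings; a real index stores one vector per document chunk.
chunks = {
    "revenue grew 57% in 2021": [0.9, 0.1, 0.0],
    "driver supply constraints": [0.1, 0.8, 0.2],
    "covid-19 impact on mobility": [0.2, 0.7, 0.3],
    "board of directors bios": [0.0, 0.1, 0.9],
}
query_vec = [0.8, 0.3, 0.1]  # embedding of the user's question

# Keep the 3 most similar chunks, as similarity_top_k=3 would.
top_3 = sorted(chunks, key=lambda c: cosine(query_vec, chunks[c]), reverse=True)[:3]
print(top_3)
```

Only those top chunks are handed to `query_llm` to synthesize an answer, which is why a small `similarity_top_k` keeps the prompt short but can miss relevant passages.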
python
from llama_index.core.agent.workflow import FunctionAgent

agent = FunctionAgent(tools=[query_engine_tool], llm=llm, verbose=True)
python
response = await agent.run(
    "Tell me both the risk factors and tailwinds for Uber?"
)
print(str(response))