docs/examples/agent/mistral_agent.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/agent/mistral_agent.ipynb" target="_parent">Open In Colab</a>
This notebook shows you how to use our Mistral agent, powered by function calling capabilities.
Let's start by importing some simple building blocks.
The main thing we need is the Mistral API, accessed via our llama_index LLM class.
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index
%pip install llama-index-llms-mistralai
%pip install llama-index-embeddings-mistralai
Let's define some very simple calculator tools for our agent.
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer."""
    return a * b


def add(a: int, b: int) -> int:
    """Add two integers and return the result integer."""
    return a + b
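Note that FunctionAgent (used below) accepts plain Python callables and infers each tool's schema from its type hints and docstring. If you want explicit control over a tool's name or description, you can wrap the functions yourself; a minimal sketch using FunctionTool:
from llama_index.core.tools import FunctionTool

# optional: wrap the callables explicitly to customize the tool name/description
multiply_tool = FunctionTool.from_defaults(fn=multiply)
add_tool = FunctionTool.from_defaults(fn=add)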
Make sure your MISTRAL_API_KEY is set. Otherwise, explicitly specify the api_key parameter.
from llama_index.llms.mistralai import MistralAI
llm = MistralAI(model="mistral-large-latest", api_key="...")
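Alternatively, you can read the key from the environment rather than hard-coding it. A minimal sketch (assumes MISTRAL_API_KEY is exported in your shell):
import os

llm = MistralAI(
    model="mistral-large-latest",
    api_key=os.environ["MISTRAL_API_KEY"],  # read from the environment
)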
Here we initialize a simple Mistral agent with calculator functions.
from llama_index.core.agent.workflow import FunctionAgent
agent = FunctionAgent(
tools=[multiply, add],
llm=llm,
)
response = await agent.run("What is (121 + 2) * 5?")
print(str(response))
# inspect the tool calls the agent made
print(response.tool_calls)
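To print the calls more readably, you can iterate over them. A minimal sketch, assuming each entry exposes tool_name and tool_kwargs attributes:
# print each tool call's name and arguments
for tool_call in response.tool_calls:
    print(tool_call.tool_name, tool_call.tool_kwargs)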
By default, .run() is stateless. If you want to maintain state, you can pass in a context object.
from llama_index.core.workflow import Context
ctx = Context(agent)
response = await agent.run("My name is John Doe", ctx=ctx)
response = await agent.run("What is my name?", ctx=ctx)
print(str(response))
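If you need the conversation state to survive the process (e.g. stored between requests), the context can be serialized. A minimal sketch using Context.to_dict / Context.from_dict with a JsonSerializer (assumes the stored state is JSON-serializable):
from llama_index.core.workflow import JsonSerializer

# serialize the context, e.g. to persist it between sessions
ctx_dict = ctx.to_dict(serializer=JsonSerializer())

# later: rebuild the context against the same agent and keep chatting
restored_ctx = Context.from_dict(agent, ctx_dict, serializer=JsonSerializer())
response = await agent.run("What is my name?", ctx=restored_ctx)
print(str(response))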
Now let's build a Mistral agent over a simple 10-K filing (Uber's 2021 annual report). We use both Mistral embeddings and mistral-medium to construct the RAG pipeline, and pass it to the Mistral agent as a tool.
!mkdir -p 'data/10k/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/10k/uber_2021.pdf' -O 'data/10k/uber_2021.pdf'
from llama_index.core.tools import QueryEngineTool
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.mistralai import MistralAIEmbedding
from llama_index.llms.mistralai import MistralAI
embed_model = MistralAIEmbedding(api_key="...")
query_llm = MistralAI(model="mistral-medium", api_key="...")
# load data
uber_docs = SimpleDirectoryReader(
input_files=["./data/10k/uber_2021.pdf"]
).load_data()
# build index
uber_index = VectorStoreIndex.from_documents(
uber_docs, embed_model=embed_model
)
uber_engine = uber_index.as_query_engine(similarity_top_k=3, llm=query_llm)
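Before wiring the query engine into the agent, it can help to sanity-check the RAG pipeline on its own (the question here is just an illustrative example):
# quick sanity check of the query engine by itself
print(uber_engine.query("What was Uber's revenue for 2021?"))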
query_engine_tool = QueryEngineTool.from_defaults(
query_engine=uber_engine,
name="uber_10k",
description=(
"Provides information about Uber financials for year 2021. "
"Use a detailed plain text question as input to the tool."
),
)
from llama_index.core.agent.workflow import FunctionAgent
agent = FunctionAgent(tools=[query_engine_tool], llm=llm)
response = await agent.run(
"Tell me both the risk factors and tailwinds for Uber? Do two parallel tool calls."
)
print(str(response))
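For longer agentic runs you may want to stream output as it is generated instead of waiting for the final response. A minimal sketch; assumes the AgentStream event type from the agent workflow module:
from llama_index.core.agent.workflow import AgentStream

handler = agent.run("Tell me both the risk factors and tailwinds for Uber?")

# print token deltas as the agent streams them
async for event in handler.stream_events():
    if isinstance(event, AgentStream):
        print(event.delta, end="", flush=True)

response = await handler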