Phi 3 tool calling example with Duck Duck Go

examples/server/phi3_duckduckgo_mistral.rs.ipynb

7/8/24

Credit to @joshuasundance-swca

This notebook is a first attempt at getting Phi-3 to use tools, served with mistral.rs.

Start the OpenAI-compatible server (with in-situ quantization to Q4K):

```bash
mistralrs serve -p 8099 --isq Q4K -m microsoft/Phi-3-mini-128k-instruct
```

Then install the Python dependencies:

```bash
python -m pip install openai duckduckgo-search langchain langchain_community langchain_openai
```

It's pretty easy because mistral.rs is OpenAI-compatible, and `langchain_openai.ChatOpenAI` doesn't actually care where you send the requests.
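Under the hood, `ChatOpenAI` simply POSTs a standard chat-completions request to `http://localhost:8099/v1/chat/completions`. As a sketch (field names follow the OpenAI chat API that mistral.rs mimics; the message contents here are purely illustrative), this is roughly the body it sends:

```python
import json

# Illustrative request body for POST http://localhost:8099/v1/chat/completions.
# Field names follow the OpenAI chat-completions API that mistral.rs mimics.
payload = {
    "model": "microsoft/Phi-3-mini-128k-instruct",
    "temperature": 0.50,
    "stream": True,
    "messages": [
        {"role": "system", "content": "You are Phi, a helpful assistant."},
        {"role": "user", "content": "What is `mistral.rs`?"},
    ],
}
body = json.dumps(payload)
```

Anything that can produce this payload — langchain, the `openai` package, or plain HTTP — can therefore talk to the local server.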

````python
from langchain.agents import AgentExecutor, create_json_chat_agent
from langchain_community.tools import DuckDuckGoSearchResults
from langchain_community.utilities import DuckDuckGoSearchAPIWrapper
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI


llm = ChatOpenAI(
    openai_api_base="http://localhost:8099/v1",
    openai_api_key="EMPTY",
    streaming=True,
    temperature=0.50,
    verbose=True,
)


ddg_wrapper = DuckDuckGoSearchAPIWrapper(max_results=5)
ddg_search = DuckDuckGoSearchResults(api_wrapper=ddg_wrapper)
tools = [ddg_search]


system_message_content = """You are Phi, a powerful AI agent built by Microsoft. In order to effectively serve the user, you have access to the following tools:

# TOOLS

{tools}

# RESPONSE INSTRUCTIONS

## RESPONSE OPTION 1 (TOOL USE)

Use this format to call tools.

Markdown code snippet formatted in the following schema:

```json
{{
    "action": string, \\ The action to take. Must be one of {tool_names}
    "action_input": string \\ The input to the action
}}
```

## RESPONSE OPTION 2 (FINAL ANSWER)

Use this if you want to respond directly to the human. Markdown code snippet formatted in the following schema:

```json
{{
    "action": "Final Answer",
    "action_input": string \\ You should put what you want to return to the user here
}}
```"""

human_message_content = """# USER INPUT

Here is the user's input (remember to respond with a markdown code snippet of a json blob with a single action, and NOTHING else):

----------------

{input}"""

prompt_template = ChatPromptTemplate.from_messages(
    [
        ("system", system_message_content),
        ("human", human_message_content),
        (
            "human",
            "Use `duckduckgo_results_json` to gather information before answering the question.",
        ),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)


agent = create_json_chat_agent(llm, tools, prompt_template)

agent_executor = AgentExecutor(
    agent=agent, tools=tools, verbose=True, handle_parsing_errors=True
)

ch = {"input": RunnablePassthrough()} | agent_executor | (lambda x: x["output"])
````
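`create_json_chat_agent` works because the prompt forces the model to reply with a fenced JSON blob. Conceptually, the agent's output parser does something like this hypothetical helper (the real parser lives in langchain; this sketch only shows the idea):

```python
import json
import re

# Hypothetical helper: extract the first JSON object from a (possibly fenced)
# model reply and decode it into an action dict.
def parse_action(reply: str) -> dict:
    match = re.search(r"\{.*\}", reply, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model reply")
    return json.loads(match.group(0))

# Example reply in the format the prompt demands.
reply = '```json\n{"action": "duckduckgo_results_json", "action_input": "mistral.rs"}\n```'
action = parse_action(reply)
# action == {"action": "duckduckgo_results_json", "action_input": "mistral.rs"}
```

If the blob's `action` names a tool, `AgentExecutor` runs it and feeds the result back into the scratchpad; `"Final Answer"` ends the loop.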
```python
%%time

question = "What is `mistral.rs`?"
ch.invoke(question)
```
```python
%%time

question = "What is Starcoder-2?"
ch.invoke(question)
```
```python
%%time

question = "Today is July 8, 2024. Who won the most recent Super Bowl?"
ch.invoke(question)
```
```python
%%time

question = "Who wrote the song, `If You've Got the Money, I've Got the Time`?"
ch.invoke(question)
```
```python
%%time

questions = [
    f"What is the current weather in {place}?"
    for place in (
        "Phoenix, AZ",
        "Portland, OR",
        "Raleigh, NC",
        "Orlando, FL",
        "Washington, DC",
    )
]

ch.batch(questions)
```
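`ch.batch` fans the questions out concurrently and returns the answers in input order. Conceptually it behaves like this plain-Python sketch (with a stand-in for the chain, so it runs without the server):

```python
from concurrent.futures import ThreadPoolExecutor

def fake_chain(question: str) -> str:
    # Stand-in for ch.invoke so this sketch needs no running server.
    return f"answer to: {question}"

questions = ["q1", "q2", "q3"]

# Like ch.batch: run invocations concurrently, keep results in input order.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(fake_chain, questions))
# results == ["answer to: q1", "answer to: q2", "answer to: q3"]
```

With the real chain, LangChain also accepts `config={"max_concurrency": n}` on `batch` to cap how many requests hit the local server at once.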