
# Tool Rendering

showcase/shell-docs/src/content/docs/integrations/llamaindex/generative-ui/tool-rendering.mdx


<IframeSwitcher id="backend-tools-example" exampleUrl="https://feature-viewer.copilotkit.ai/llama-index/feature/backend_tool_rendering?sidebar=false&chatDefaultOpen=false" codeUrl="https://feature-viewer.copilotkit.ai/llama-index/feature/backend_tool_rendering?view=code&sidebar=false&codeLayout=tabs" exampleLabel="Demo" codeLabel="Code" height="700px" />

<Callout>
This example demonstrates the [implementation](#implementation) section applied in the{" "}
<a href="https://feature-viewer.copilotkit.ai/llama-index/feature/backend_tool_rendering" target="_blank">
CopilotKit feature viewer
</a>.
</Callout>

## What is this?

Tools are a way for the LLM to call predefined, typically deterministic, functions. CopilotKit allows you to render these tool calls in the UI as custom components, which we call Generative UI.
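To make "predefined, deterministic function" concrete, here is a hypothetical standalone tool (not part of the CopilotKit API): a typed function with a docstring, which always returns the same output for the same input.

```python
def convert_temperature(celsius: float) -> str:
    """Convert Celsius to Fahrenheit (deterministic: same input, same output)."""
    fahrenheit = celsius * 9 / 5 + 32
    return f"{celsius}°C is {fahrenheit}°F"

print(convert_temperature(20.0))  # → "20.0°C is 68.0°F"
```

The type hints and docstring matter: agent frameworks typically use them to describe the tool to the LLM so it knows when and how to call it.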

## When should I use this?

Rendering tool calls in the UI is useful when you want to give the user feedback about what your agent is doing, especially while it is calling tools. CopilotKit lets you fully customize how these tool calls are rendered in the chat.

## Implementation

<Steps>
<Step>
### Run and connect your agent

<RunAndConnect />
</Step>
<Step>
### Give your agent a tool to call

Add a new tool definition and pass the tool to the agent:

```python
import uvicorn
from fastapi import FastAPI
from llama_index.llms.openai import OpenAI
from llama_index.protocols.ag_ui.router import get_ag_ui_workflow_router

def getWeather(location: str) -> str:
    """Get the weather for a given location."""
    return f"The weather in {location} is sunny and 70 degrees."

# Initialize the LLM
llm = OpenAI(model="gpt-4o")

# Create the AG-UI workflow router
agentic_chat_router = get_ag_ui_workflow_router(
    llm=llm,
    # These tools execute on the backend; the frontend only renders their calls
    backend_tools=[getWeather],
    system_prompt="You are a helpful AI assistant with access to various tools and capabilities.",
)

# Create FastAPI app
app = FastAPI(
    title="LlamaIndex Agent",
    description="A LlamaIndex agent integrated with CopilotKit",
    version="1.0.0"
)

# Include the router
app.include_router(agentic_chat_router)

# Health check endpoint
@app.get("/health")
async def health_check():
    return {"status": "healthy", "agent": "llamaindex"}

if __name__ == "__main__":
    uvicorn.run(app, host="localhost", port=8000)
```
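Because tool functions are plain Python callables, you can sanity-check one in isolation before wiring it into the router (the definition is repeated here so the snippet is self-contained):

```python
def getWeather(location: str) -> str:
    """Get the weather for a given location."""
    return f"The weather in {location} is sunny and 70 degrees."

print(getWeather("Berlin"))  # → "The weather in Berlin is sunny and 70 degrees."
```

Verifying the return value is a string with the expected shape is worthwhile, since this is exactly what gets sent back to the LLM as the tool result.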
</Step>
<Step>
### Render the tool call in your frontend

At this point, your agent will be able to call the `getWeather` tool. Now we just need to add a `useRenderTool` hook to render the tool call in the UI.

<Callout type="info" title="Important">
In order to render a tool call in the UI, the `name` passed to `useRenderTool` must match the name of the tool.
</Callout>
```tsx
// ...

const YourMainContent = () => {
  // ...
  // [!code highlight:12]
  useRenderTool({
    name: "getWeather",
    render: ({ status, args }) => {
      return (
        <p className="text-gray-500 mt-2">
          {status !== "complete" && "Calling weather API..."}
          {status === "complete" &&
            `Called the weather API for ${args.location}.`}
        </p>
      );
    },
  });
  // ...
};
```
</Step>
<Step>
### Give it a try!

Try asking the agent to get the weather for a location. You should see the custom UI component that we added render the tool call and display the arguments that were passed to the tool.

</Step>
</Steps>

## Default Tool Rendering

<DefaultToolRendering />