docs/architecture/hooks/plugin-tool-injection.md
This document demonstrates how to use PicoClaw's hook system to implement external plugin tool injection, allowing the LLM to call tools implemented by external hook processes.

Through the hook system's `respond` action, external hooks can:
- Inject tool definitions in `before_llm`, letting the LLM know the tool is available
- Intercept the call in `before_tool` with the `respond` action, skipping the ToolRegistry

This way, external hooks can fully implement plugin tools without registering any tools inside PicoClaw.
Below is a complete Python hook example implementing a weather query plugin tool. Save it as `/tmp/weather_plugin.py`:
```python
#!/usr/bin/env python3
"""Weather query plugin hook example."""
from __future__ import annotations

import json
import signal
import sys
from typing import Any

# Simulated weather data
WEATHER_DATA = {
    "Beijing": {"temp": 15, "weather": "Sunny", "humidity": 45},
    "Shanghai": {"temp": 18, "weather": "Cloudy", "humidity": 60},
    "Guangzhou": {"temp": 25, "weather": "Sunny", "humidity": 70},
    "Shenzhen": {"temp": 26, "weather": "Cloudy", "humidity": 75},
}


def get_weather(city: str) -> dict:
    """Get weather data (simulated)."""
    data = WEATHER_DATA.get(city)
    if data:
        return {
            "for_llm": f"{city} weather: {data['weather']}, temperature {data['temp']}°C, humidity {data['humidity']}%",
            "for_user": "",
            "silent": False,
            "is_error": False,
        }
    return {
        "for_llm": f"Weather data not found for city {city}",
        "for_user": "",
        "silent": False,
        "is_error": True,
    }


def handle_hello(params: dict) -> dict:
    return {"ok": True, "name": "weather-plugin"}


def handle_before_llm(params: dict) -> dict:
    """Inject the weather query tool definition."""
    tools = params.get("tools", [])
    # Add the weather query tool
    tools.append({
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Query weather information for a specified city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "City name, e.g.: Beijing, Shanghai, Guangzhou"
                    }
                },
                "required": ["city"]
            }
        }
    })
    return {
        "action": "modify",
        "request": {
            "model": params.get("model"),
            "messages": params.get("messages", []),
            "tools": tools,
            "options": params.get("options", {}),
        }
    }


def handle_before_tool(params: dict) -> dict:
    """Handle the tool call and return the result directly."""
    tool = params.get("tool", "")
    args = params.get("arguments", {})
    if tool == "get_weather":
        city = args.get("city", "")
        result = get_weather(city)
        # Use the respond action to return the result directly, skipping ToolRegistry
        return {
            "action": "respond",
            "result": result,
        }
    # Other tools continue through the normal flow
    return {"action": "continue"}


def handle_request(method: str, params: dict) -> dict:
    if method == "hook.hello":
        return handle_hello(params)
    if method == "hook.before_llm":
        return handle_before_llm(params)
    if method == "hook.before_tool":
        return handle_before_tool(params)
    if method == "hook.after_llm":
        return {"action": "continue"}
    if method == "hook.after_tool":
        return {"action": "continue"}
    if method == "hook.approve_tool":
        return {"approved": True}
    raise KeyError(f"method not found: {method}")


def send_response(message_id: int, result: Any | None = None, error: str | None = None) -> None:
    payload: dict[str, Any] = {
        "jsonrpc": "2.0",
        "id": message_id,
    }
    if error is not None:
        payload["error"] = {"code": -32000, "message": error}
    else:
        payload["result"] = result if result is not None else {}
    sys.stdout.write(json.dumps(payload, ensure_ascii=True) + "\n")
    sys.stdout.flush()


def main() -> int:
    for raw_line in sys.stdin:
        line = raw_line.strip()
        if not line:
            continue
        try:
            message = json.loads(line)
        except json.JSONDecodeError:
            continue
        method = message.get("method")
        message_id = message.get("id", 0)
        params = message.get("params") or {}
        if not message_id:
            continue
        try:
            result = handle_request(str(method or ""), params)
            send_response(int(message_id), result=result)
        except KeyError as exc:
            send_response(int(message_id), error=str(exc))
        except Exception as exc:
            send_response(int(message_id), error=f"unexpected error: {exc}")
    return 0


if __name__ == "__main__":
    # A lambda body cannot contain `raise`; sys.exit raises SystemExit for us.
    signal.signal(signal.SIGINT, lambda *_: sys.exit(0))
    signal.signal(signal.SIGTERM, lambda *_: sys.exit(0))
    raise SystemExit(main())
```
Add the hook configuration to the config file:

```json
{
  "hooks": {
    "enabled": true,
    "processes": {
      "weather_plugin": {
        "enabled": true,
        "priority": 100,
        "transport": "stdio",
        "command": ["python3", "/tmp/weather_plugin.py"],
        "intercept": ["before_llm", "before_tool"]
      }
    }
  }
}
```
When the user asks "What's the weather in Beijing today?":

1. PicoClaw triggers `hook.before_llm`; the hook injects the `get_weather` tool definition
2. The LLM decides to call `get_weather(city="Beijing")`
3. PicoClaw triggers `hook.before_tool`; the hook uses the `respond` action to return the weather data

```
User: "What's the weather in Beijing today?"
        ↓
PicoClaw
        ↓
hook.before_llm
        ↓  (inject get_weather tool definition)
LLM request
        ↓
LLM decides to call get_weather(city="Beijing")
        ↓
hook.before_tool
        ↓  (respond action returns weather data)
Return result directly to LLM
        ↓  (skip ToolRegistry)
LLM replies: "Beijing is sunny today, temperature 15°C"
```
### Injecting Tool Definitions in `before_llm`

The tool definition follows the OpenAI function calling format:
```json
{
  "type": "function",
  "function": {
    "name": "tool_name",
    "description": "tool description",
    "parameters": {
      "type": "object",
      "properties": {
        "param_name": {
          "type": "string",
          "description": "parameter description"
        }
      },
      "required": ["list of required parameters"]
    }
  }
}
```
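A malformed definition typically just disables the tool silently, so it can help to sanity-check definitions before injecting them. Below is a minimal structural check; the helper name and the exact checks are illustrative, not part of PicoClaw:

```python
def check_tool_definition(tool: dict) -> list[str]:
    """Return a list of structural problems in an OpenAI-style tool definition."""
    problems = []
    if tool.get("type") != "function":
        problems.append('top-level "type" must be "function"')
    fn = tool.get("function", {})
    for key in ("name", "description", "parameters"):
        if key not in fn:
            problems.append(f'missing "function.{key}"')
    params = fn.get("parameters", {})
    if params.get("type") != "object":
        problems.append('"parameters.type" must be "object"')
    # Every required parameter must also be declared under "properties"
    declared = set(params.get("properties", {}))
    for name in params.get("required", []):
        if name not in declared:
            problems.append(f'required parameter "{name}" not declared in "properties"')
    return problems


good = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Query city weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
print(check_tool_definition(good))  # → []
```

Running this check in `handle_before_llm` (and logging problems to stderr) makes broken injections visible instead of silent.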
### Using the `respond` Action in `before_tool`

The `respond` action response format:
```json
{
  "action": "respond",
  "result": {
    "for_llm": "Content returned to LLM",
    "for_user": "Optional, content sent to user",
    "silent": false,
    "is_error": false,
    "media": ["Optional, media reference list"],
    "response_handled": false
  }
}
```
| Field | Description |
|---|---|
| `for_llm` | Required; the LLM will see this content |
| `for_user` | Optional; sent directly to the user |
| `silent` | When `true`, not sent to the user |
| `is_error` | When `true`, indicates execution failure |
| `media` | Optional; media file references (images, files, etc.) |
| `response_handled` | When `true`, the user request is considered handled and the turn ends |
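Since every handler returns this same envelope, a small constructor can apply the defaults from the table. This is a sketch; the helper name and its defaults are mine, not part of PicoClaw:

```python
from __future__ import annotations


def respond(for_llm: str, *, for_user: str = "", silent: bool = False,
            is_error: bool = False, media: list[str] | None = None,
            response_handled: bool = False) -> dict:
    """Build a respond-action payload; for_llm is the only required field."""
    result = {
        "for_llm": for_llm,
        "for_user": for_user,
        "silent": silent,
        "is_error": is_error,
        "response_handled": response_handled,
    }
    # media is optional in the envelope, so omit it when empty
    if media:
        result["media"] = media
    return {"action": "respond", "result": result}


payload = respond("Beijing weather: Sunny, temperature 15°C")
print(payload["action"])  # → respond
```

With this, a handler body shrinks to `return respond(f"Calculation result: {value}")` and the field defaults live in one place.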
The `respond` action supports returning media files (images, files, etc.). There are two processing modes:

**Mode 1: Send directly to the user (`response_handled=true`)**

When `response_handled=true`, media files are automatically sent to the user and the turn ends:
```json
{
  "action": "respond",
  "result": {
    "for_llm": "Image sent to user",
    "for_user": "",
    "media": ["media://abc123"],
    "response_handled": true
  }
}
```
Use cases: for example, a plugin that generates an image and delivers it straight to the user, without another LLM round trip.
**Mode 2: Pass to the LLM (`response_handled=false`)**

When `response_handled=false`, media references are passed to the LLM, which can see the content in the next request:
```json
{
  "action": "respond",
  "result": {
    "for_llm": "Image loaded, path: /tmp/image.png [file:/tmp/image.png]",
    "media": ["media://abc123"]
  }
}
```
After seeing the content, the LLM can decide what to do next, for example calling the `send_file` tool to send it to the user.

Media references use the `media://` protocol:

```
media://<store-id>
```
These references are managed by PicoClaw's MediaStore.
If the plugin generates files, you can return the file path and let the LLM call `send_file` or similar tools:
```json
{
  "action": "respond",
  "result": {
    "for_llm": "Image generated, saved at /tmp/generated_image.png. Use send_file tool to send to user.",
    "for_user": "",
    "silent": false
  }
}
```
This approach keeps the plugin simple: the LLM sees where the file is and decides whether and how to deliver it to the user.
Multiple tools can be injected simultaneously:
```python
def handle_before_llm(params: dict) -> dict:
    tools = params.get("tools", [])
    # Tool 1: Weather query
    tools.append({
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Query city weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"]
            }
        }
    })
    # Tool 2: Calculator
    tools.append({
        "type": "function",
        "function": {
            "name": "calculate",
            "description": "Perform mathematical calculations",
            "parameters": {
                "type": "object",
                "properties": {
                    "expression": {"type": "string", "description": "Mathematical expression"}
                },
                "required": ["expression"]
            }
        }
    })
    return {
        "action": "modify",
        "request": {
            "model": params.get("model"),
            "messages": params.get("messages", []),
            "tools": tools,
            "options": params.get("options", {}),
        }
    }


def handle_before_tool(params: dict) -> dict:
    tool = params.get("tool", "")
    args = params.get("arguments", {})
    if tool == "get_weather":
        return {
            "action": "respond",
            "result": get_weather(args.get("city", "")),
        }
    if tool == "calculate":
        # Simple calculation example
        try:
            expr = args.get("expression", "")
            result = eval(expr)  # Note: needs security handling in actual use
            return {
                "action": "respond",
                "result": {
                    "for_llm": f"Calculation result: {result}",
                    "silent": False,
                    "is_error": False,
                },
            }
        except Exception as e:
            return {
                "action": "respond",
                "result": {
                    "for_llm": f"Calculation error: {e}",
                    "silent": False,
                    "is_error": True,
                },
            }
    return {"action": "continue"}
```
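As the comment warns, `eval` executes arbitrary Python, which is dangerous for LLM-supplied input. One safer sketch (illustrative, not part of PicoClaw) restricts expressions to plain arithmetic by walking the `ast`:

```python
import ast
import operator

# Only these arithmetic operators are allowed; anything else raises ValueError.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}


def safe_eval(expression: str) -> float:
    """Evaluate a numeric expression without allowing names, calls, or imports."""
    def _eval(node: ast.AST) -> float:
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"unsupported expression: {expression!r}")
    return _eval(ast.parse(expression, mode="eval"))


print(safe_eval("2 * (3 + 4)"))  # → 14
```

Dropping `safe_eval` in place of `eval` keeps the calculator tool's behavior for arithmetic while turning function calls, attribute access, and dunder tricks into a `ValueError`, which the existing `except` branch already reports as `is_error=True`.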
Injected plugin tools coexist with PicoClaw built-in tools:

- Built-in tools (e.g. `bash`, `read_file`) execute normally through the ToolRegistry
- Plugin tools are answered by the hook's `respond` action
- `handle_before_tool` handles only plugin tools; all other tools return `continue`

If you need to implement plugin tool injection in Go code:
```go
package myhooks

import (
    "context"
    "fmt"

    "github.com/sipeed/picoclaw/pkg/agent"
    "github.com/sipeed/picoclaw/pkg/tools"
)

type WeatherPluginHook struct{}

func (h *WeatherPluginHook) BeforeLLM(
    ctx context.Context,
    req *agent.LLMHookRequest,
) (*agent.LLMHookRequest, agent.HookDecision, error) {
    // Inject the tool definition
    req.Tools = append(req.Tools, agent.ToolDefinition{
        Type: "function",
        Function: agent.FunctionDefinition{
            Name:        "get_weather",
            Description: "Query city weather",
            Parameters: map[string]any{
                "type": "object",
                "properties": map[string]any{
                    "city": map[string]any{
                        "type":        "string",
                        "description": "City name",
                    },
                },
                "required": []string{"city"},
            },
        },
    })
    return req, agent.HookDecision{Action: agent.HookActionContinue}, nil
}

func (h *WeatherPluginHook) BeforeTool(
    ctx context.Context,
    call *agent.ToolCallHookRequest,
) (*agent.ToolCallHookRequest, agent.HookDecision, error) {
    if call.Tool == "get_weather" {
        // Comma-ok assertion avoids a panic if the argument is not a string
        city, _ := call.Arguments["city"].(string)
        // Set HookResult and use the respond action
        next := call.Clone()
        next.HookResult = &tools.ToolResult{
            ForLLM:  getWeatherData(city),
            Silent:  false,
            IsError: false,
        }
        return next, agent.HookDecision{Action: agent.HookActionRespond}, nil
    }
    return call, agent.HookDecision{Action: agent.HookActionContinue}, nil
}

func getWeatherData(city string) string {
    // Implement the actual weather query logic here
    return fmt.Sprintf("%s weather: Sunny, temperature 20°C", city)
}
```
Through the hook system's `respond` action, external processes can:

- advertise tools to the LLM via `before_llm`
- intercept the calls and return results via `before_tool`, without registering anything inside PicoClaw

This provides a flexible and elegant solution for plugin development.
**Important:** The `respond` action bypasses `ApproveTool` approval checks.

This means:

- A `before_tool` hook can return `respond` for any tool name, including sensitive tools (like `bash`)
- Prefer `deny_tool` for rejection: use the `deny_tool` action instead of `respond` with an error when denying execution

```python
def handle_before_tool(params: dict) -> dict:
    tool = params.get("tool", "")
    args = params.get("arguments", {})
    # Security check: only handle plugin tools
    if tool in ["get_weather", "calculate"]:
        return {
            "action": "respond",
            "result": execute_plugin_tool(tool, args),
        }
    # Other tools continue through the normal flow (and go through approval)
    return {"action": "continue"}
```
This ensures the hook only affects plugin tools, not system tool approval flow.