CopilotKit offers a fully headless UI through the `useCopilotChatHeadless_c` hook. With it, you can build your own chat interface from the ground up while still relying on CopilotKit's core features and ease of use.
<video src="https://cdn.copilotkit.ai/docs/copilotkit/videos/full-headless-chat.mp4" style={{ width: "100%", borderRadius: "0.5rem", marginBottom: "1rem" }} loop playsInline autoPlay muted />
<Steps>
<Step>
### Create a new application

Scaffold a new CopilotKit project using the CLI:
```bash
npx copilotkit@latest create
```
Wrap your root layout with `CopilotKitProvider` and pass in your public license key:

```tsx
<CopilotKitProvider publicLicenseKey="your-free-public-license-key">
  {children}
</CopilotKitProvider>
```
Use `useCopilotChatHeadless_c` to access messages, a send function, and loading state, then wire them to your own UI:
```tsx
"use client";

import { useState } from "react";
import { useCopilotChatHeadless_c } from "@copilotkit/react-core/v2";

export default function Home() {
  const { messages, sendMessage, isLoading } = useCopilotChatHeadless_c();
  const [input, setInput] = useState("");

  const handleSend = () => {
    if (input.trim()) {
      sendMessage({
        id: Date.now().toString(),
        role: "user",
        content: input,
      });
      setInput("");
    }
  };

  return (
    <div>
      <h1>My Headless Chat</h1>
      <div>
        {messages.map((message) => (
          <div key={message.id}>
            <strong>{message.role === "user" ? "You" : "Assistant"}:</strong>
            <p>{message.content}</p>
          </div>
        ))}
        {isLoading && <p>Assistant is typing...</p>}
      </div>
      <div>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          onKeyDown={(e) => e.key === "Enter" && handleSend()}
          placeholder="Type your message here..."
        />
        <button onClick={handleSend} disabled={isLoading}>
          Send
        </button>
      </div>
    </div>
  );
}
```
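The example above uses `Date.now().toString()` as the message id, which can collide when two messages are created within the same millisecond. A minimal sketch of a safer id helper (a hypothetical utility, not part of CopilotKit) appends a monotonic counter:

```typescript
// Hypothetical helper, not part of CopilotKit: generate unique message ids.
// Date.now() alone can repeat within one millisecond, so a monotonic
// counter suffix guarantees uniqueness within this session.
let messageCounter = 0;

function nextMessageId(): string {
  messageCounter += 1;
  return `msg-${Date.now().toString(36)}-${messageCounter}`;
}
```

You could then call `sendMessage({ id: nextMessageId(), role: "user", content: input })` instead.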
You can render generative UI either via `useFrontendTool` / `useComponent`, or by reading tool calls off the messages and rendering them directly.
#### useFrontendTool

Register a frontend tool and attach a render function. CopilotKit inserts your component wherever that tool call appears in the message stream:
```tsx
import {
  useCopilotChatHeadless_c,
  useFrontendTool,
} from "@copilotkit/react-core/v2";

export const Chat = () => {
  const { messages } = useCopilotChatHeadless_c();

  useFrontendTool({
    name: "showCustomComponent",
    handler: () => "Foo, Bar, Baz",
    render: ({ result, args, status }) => (
      <div
        style={{
          backgroundColor: "red",
          padding: "10px",
          borderRadius: "5px",
        }}
      >
        <p>Custom component</p>
        <p>Result: {result}</p>
        <p>Args: {JSON.stringify(args)}</p>
        <p>Status: {status}</p>
      </div>
    ),
  });

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          {message.role === "user" ? "User: " : "Assistant: "}
          {message.content}
          {message.role === "assistant" && message.generativeUI?.()}
        </div>
      ))}
    </div>
  );
};
```
If you don't want to use `useFrontendTool`, render the raw tool call data directly:
```tsx
import { useCopilotChatHeadless_c } from "@copilotkit/react-core/v2";

export const Chat = () => {
  const { messages } = useCopilotChatHeadless_c();

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          {message.role === "assistant" &&
            message.toolCalls?.map((toolCall) => (
              <p key={toolCall.id}>
                {toolCall.function.name}: {toolCall.function.arguments}
              </p>
            ))}
        </div>
      ))}
    </div>
  );
};
```
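Tool call arguments arrive as a JSON string, and while the model is still streaming, that string may be incomplete. If you want structured access instead of printing the raw string, a defensive parse helps — a sketch with a hypothetical helper (not part of CopilotKit):

```typescript
// Hypothetical helper, not part of CopilotKit: parse a tool call's
// arguments string defensively. While the model is still streaming, the
// JSON may be truncated, so fall back to an empty object instead of throwing.
function parseToolArgs(raw: string): Record<string, unknown> {
  try {
    const parsed = JSON.parse(raw);
    return typeof parsed === "object" && parsed !== null
      ? (parsed as Record<string, unknown>)
      : {};
  } catch {
    return {}; // partial or malformed JSON mid-stream
  }
}
```

You could then render `parseToolArgs(toolCall.function.arguments)` field by field rather than as one string.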
CopilotKit's suggestions give users a clickable list of generated or static prompts. The headless hook exposes full control over their lifecycle.
Use `useCopilotChatSuggestions` to generate and display prompt suggestions in your headless UI:
```tsx
import {
  useCopilotChatHeadless_c,
  useCopilotChatSuggestions,
} from "@copilotkit/react-core/v2";
import { useEffect } from "react";

export const Chat = () => {
  useCopilotChatSuggestions({
    instructions:
      "Suggest 5 interesting activities for programmers to do on their next vacation",
    maxSuggestions: 5,
  });

  const { suggestions, generateSuggestions, sendMessage } =
    useCopilotChatHeadless_c();

  useEffect(() => {
    generateSuggestions();
  }, []);

  return (
    <div>
      {suggestions.map((s, i) => (
        <button
          key={i}
          onClick={() =>
            sendMessage({
              id: Date.now().toString(),
              role: "user",
              content: s.message,
            })
          }
        >
          {s.title}
        </button>
      ))}
    </div>
  );
};
```
If you want deterministic control, set suggestions manually:
```tsx
const { suggestions, setSuggestions } = useCopilotChatHeadless_c();

useEffect(() => {
  setSuggestions([
    { title: "Suggestion 1", message: "The actual message for suggestion 1" },
    { title: "Suggestion 2", message: "The actual message for suggestion 2" },
  ]);
}, []);
```
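Static suggestions often come from your own app data rather than hand-written literals. A minimal sketch (a hypothetical helper, not part of CopilotKit) that maps a list of topics into suggestion objects of the shape shown above:

```typescript
// Hypothetical helper, not part of CopilotKit: build suggestion objects
// ({ title, message }) from app data so the list stays in sync with state.
type Suggestion = { title: string; message: string };

function suggestionsFromTopics(topics: string[]): Suggestion[] {
  return topics.map((topic) => ({
    title: topic,
    message: `Tell me more about ${topic}`,
  }));
}
```

You could then pass the result straight to `setSuggestions(suggestionsFromTopics(["Pricing", "Integrations"]))`.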
Human-in-the-loop (HITL) pauses the chat and waits for the user's input. It comes in two flavors: tool-based and interrupt-based (certain frameworks only).
Tool-based HITL pauses tool execution until the user responds. The response becomes the tool's result.
```tsx
import {
  useCopilotChatHeadless_c,
  useFrontendTool,
} from "@copilotkit/react-core/v2";

export const Chat = () => {
  const { messages } = useCopilotChatHeadless_c();

  useFrontendTool({
    name: "getName",
    renderAndWaitForResponse: ({ respond, args, status }) => {
      if (status === "complete") return <div>Name retrieved…</div>;

      // Read the input's value on submit; calling respond() on every
      // keystroke would resolve the tool after the first character.
      let inputEl: HTMLInputElement | null = null;

      return (
        <div>
          <input
            ref={(el) => {
              inputEl = el;
            }}
            defaultValue={args.name || ""}
            placeholder="Enter your name"
          />
          <button onClick={() => respond?.(inputEl?.value)}>Submit</button>
        </div>
      );
    },
  });

  return (
    <>
      {messages.map((message) => (
        <div key={message.id}>
          {message.role === "user" ? "User: " : "Assistant: "}
          {message.content}
          {message.role === "assistant" && message.generativeUI?.()}
        </div>
      ))}
    </>
  );
};
```