To build with the branching chat starter kit, run this command in your terminal:

```bash
npm create tldraw@latest -- --template branching-chat
```
Use the branching chat starter kit to build:
<StarterKitBento type="branching" />

Each conversation message appears as a draggable node on the infinite canvas. The NodeShapeUtil class extends tldraw's shape system to create custom chat message containers. These containers dynamically resize based on content length and provide input fields for user messages, areas for AI responses, and connection ports for linking conversations. You can also create nodes directly from the toolbar.
The ConnectionBindingUtil manages relationships between message nodes and creates visual lines that represent conversation flow. You create connections by dragging between node ports to branch conversations and link context. When users send a message, the system traces backwards through connected nodes to build complete conversation history. This gives AI responses the full dialogue context.
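The backward trace can be sketched as a graph walk. This is a minimal, self-contained illustration of the idea; the `ChatNode`, `Connection`, and `traceHistory` names are assumptions for this sketch, not the starter kit's actual types:

```typescript
// Illustrative data model for chat nodes and the connections between them.
interface ChatNode {
  id: string
  role: 'user' | 'assistant'
  text: string
}

interface Connection {
  from: string // id of the upstream node
  to: string   // id of the downstream node
}

// Walk upstream from a node, visiting every ancestor exactly once, and
// return the history in root-to-leaf order, ready to send as AI context.
function traceHistory(
  startId: string,
  nodes: Map<string, ChatNode>,
  connections: Connection[]
): ChatNode[] {
  const visited = new Set<string>()
  const ordered: ChatNode[] = []

  function visit(id: string) {
    if (visited.has(id)) return
    visited.add(id)
    // Recurse into parents first so ancestors land before descendants.
    for (const c of connections) {
      if (c.to === id) visit(c.from)
    }
    const node = nodes.get(id)
    if (node) ordered.push(node)
  }

  visit(startId)
  return ordered
}
```

Because the walk deduplicates visited nodes, it also handles merge points where several branches feed the same message.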
The backend uses Cloudflare Workers with the Vercel AI SDK to stream responses from Google's Gemini API. As the AI generates text, the streaming fetch implementation decodes response chunks and updates the tldraw document state in real time, so users see responses appear progressively within the connected node.
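The chunk-decoding step looks roughly like this. It's a simplified sketch, not the starter kit's actual handler; the `decodeChunks` and `onUpdate` names are illustrative, and in the real app the update callback would write into the tldraw document:

```typescript
// Incrementally decode a sequence of byte chunks into text, reporting
// progress after each chunk, the way a streaming fetch loop would.
function decodeChunks(
  chunks: Uint8Array[],
  onUpdate: (textSoFar: string) => void
): string {
  const decoder = new TextDecoder()
  let text = ''
  for (const chunk of chunks) {
    // `stream: true` holds back incomplete multi-byte characters so a
    // character split across two chunks still decodes correctly.
    text += decoder.decode(chunk, { stream: true })
    onUpdate(text) // e.g. update the node's shape props here
  }
  text += decoder.decode() // flush any buffered trailing bytes
  return text
}
```

The `{ stream: true }` flag matters: without it, a UTF-8 character split across two network chunks would decode as garbage.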
Nodes feature input and output ports that let users create branching dialogue structures by dragging connections between messages. You can build complex conversation trees where multiple messages feed into a single AI response or diverge into parallel branches. Connected nodes automatically establish context relationships across different paths in the dialogue graph.
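The dialogue graph structure described above can be made concrete with a small helper. This is an illustrative sketch (not part of the starter kit) that identifies where a conversation branches into parallel paths and where several messages merge into one response:

```typescript
// A directed edge between two nodes in the dialogue graph.
interface Edge {
  from: string
  to: string
}

// Count outgoing and incoming edges per node to find branch and merge points.
function graphStructure(edges: Edge[]) {
  const childCount = new Map<string, number>()
  const parentCount = new Map<string, number>()
  for (const e of edges) {
    childCount.set(e.from, (childCount.get(e.from) ?? 0) + 1)
    parentCount.set(e.to, (parentCount.get(e.to) ?? 0) + 1)
  }
  return {
    // A branch point diverges into parallel conversation paths.
    branchPoints: [...childCount].filter(([, n]) => n > 1).map(([id]) => id),
    // A merge point feeds multiple messages into a single AI response.
    mergePoints: [...parentCount].filter(([, n]) => n > 1).map(([id]) => id),
  }
}
```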
This starter kit is built on top of tldraw's extensible architecture. You can customize everything. The canvas renders using React DOM, so you can use familiar React patterns, components, and state management across your conversation interface. Let's have a look at some ways to change this starter kit.
To create new types of conversation nodes beyond basic messages, you can extend the node system with custom node definitions. The system uses a pluggable architecture where each node type defines its own behavior, rendering, and port configuration.
See client/nodes/types/MessageNode.tsx as an example. This file shows how to define a complete node type with TypeScript validation, React component rendering, AI streaming integration, and dynamic sizing based on content length.
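A pluggable node definition might look something like the sketch below. The `NodeDefinition` interface, its fields, and the sizing formula are assumptions loosely modeled on the pattern in `client/nodes/types/MessageNode.tsx`, not the file's actual API:

```typescript
// Hypothetical shape of a pluggable node definition: each node type bundles
// its runtime validation, dynamic sizing, and port configuration.
interface NodeDefinition<T> {
  type: string
  validate: (data: unknown) => T             // runtime validation of node data
  getHeight: (data: T) => number             // dynamic sizing from content
  ports: { inputs: number; outputs: number } // connection port configuration
}

const messageNode: NodeDefinition<{ text: string }> = {
  type: 'message',
  validate: (data) => {
    if (
      typeof data !== 'object' ||
      data === null ||
      typeof (data as { text?: unknown }).text !== 'string'
    ) {
      throw new Error('invalid message node data')
    }
    return data as { text: string }
  },
  // Grow the node with its content: a base height plus one line per ~40 chars.
  getHeight: (data) => 48 + Math.ceil(data.text.length / 40) * 20,
  ports: { inputs: 1, outputs: 1 },
}

// A registry keyed by type lets new node kinds plug in without core changes.
const nodeRegistry = new Map<string, NodeDefinition<any>>([
  [messageNode.type, messageNode],
])
```

Registering a new node type is then just another entry in the registry, which is the essence of the pluggable architecture.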
To integrate with different AI providers or modify response behavior, you can customize the streaming implementation and API endpoints. The system uses the Vercel AI SDK, which supports multiple providers including OpenAI, Anthropic, and Google.
See worker/worker.ts as an example. This file demonstrates how to configure AI providers, handle streaming responses, and process conversation context from connected nodes.
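Swapping providers with the AI SDK is mostly a matter of changing one import and one model id. This is a rough sketch of the wiring, not the starter kit's actual worker code; the model id and the `handleChat` function are illustrative, so check the AI SDK docs for current provider packages:

```typescript
// Sketch of provider configuration with the Vercel AI SDK.
import { streamText } from 'ai'
import { createGoogleGenerativeAI } from '@ai-sdk/google'
// To switch providers, swap in e.g. createOpenAI from '@ai-sdk/openai'
// or createAnthropic from '@ai-sdk/anthropic' and change the model id.

export function handleChat(
  apiKey: string,
  messages: { role: 'user' | 'assistant'; content: string }[]
) {
  const google = createGoogleGenerativeAI({ apiKey })
  const result = streamText({
    model: google('gemini-2.0-flash'), // illustrative model id
    messages, // the traced conversation history from connected nodes
  })
  // Stream the generated text back to the client as it arrives.
  return result.toTextStreamResponse()
}
```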
To modify how conversation nodes look and behave, you can override the node rendering and styling system. Each node type has complete control over its visual presentation while maintaining integration with tldraw's interaction system.
See client/nodes/NodeShapeUtil.tsx as an example. This file shows how the shape utility defines node geometry, interaction behavior, and visual indicators including port positioning and selection bounds.
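Port positioning is ultimately simple geometry over the node's bounds. This standalone sketch (illustrative types and names, not tldraw's API) shows one way to anchor an input port on the top edge and an output port on the bottom edge:

```typescript
// Axis-aligned bounds of a node on the canvas.
interface Bounds {
  x: number
  y: number
  w: number
  h: number
}

interface Point {
  x: number
  y: number
}

// Center the input port on the top edge and the output port on the bottom
// edge, so connection lines enter above a message and leave below it.
function portPositions(bounds: Bounds): { input: Point; output: Point } {
  return {
    input: { x: bounds.x + bounds.w / 2, y: bounds.y },
    output: { x: bounds.x + bounds.w / 2, y: bounds.y + bounds.h },
  }
}
```

Because the positions derive from the bounds, ports stay attached as nodes resize with their content.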
To modify how conversation flows connect and interact, you can customize the connection and binding system. This controls how nodes link together, how context flows between connected messages, and how the visual connections appear.
See client/connection/ConnectionBindingUtil.tsx as an example. This file demonstrates how to define binding behavior between shapes, including automatic cleanup when nodes are deleted and visual feedback during connection creation.
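The cleanup idea reduces to: when a node is deleted, drop every connection bound to it so no dangling lines remain. Here is a minimal standalone sketch of that rule; in the real file this happens through the binding lifecycle rather than a plain function, and the types below are illustrative:

```typescript
// A connection bound to two nodes by id.
interface NodeConnection {
  id: string
  fromNode: string
  toNode: string
}

// Keep only the connections that do not touch the deleted node.
function removeConnectionsFor(
  deletedNodeId: string,
  connections: NodeConnection[]
): NodeConnection[] {
  return connections.filter(
    (c) => c.fromNode !== deletedNodeId && c.toNode !== deletedNodeId
  )
}
```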
If you build something great, please share it with us in our #show-and-tell channel on Discord. We want to see what you've built!