.agents/skills/testing/references/agent-runtime-e2e.md
Only mock these external dependencies:

| Dependency | Mock | Description |
|---|---|---|
| Database | PGLite | In-memory database from @lobechat/database/test-utils |
| Redis | InMemoryAgentStateManager | Memory implementation |
| Redis | InMemoryStreamEventManager | Memory implementation |
NOT mocked:

- **model-bank** - uses the real model configuration
- **Mecha engines** (AgentToolsEngine, ContextEngineering)
- **AgentRuntimeService**
- **AgentRuntimeCoordinator**

Different tests need different LLM responses, so mock `fetch` with `vi.spyOn`: each test queues its own streaming response.
Database setup with the in-memory PGLite test DB:

```ts
import { LobeChatDatabase } from '@lobechat/database';
import { getTestDB } from '@lobechat/database/test-utils';

let testDB: LobeChatDatabase;

beforeEach(async () => {
  testDB = await getTestDB();
});
```
A factory helper builds OpenAI-compatible SSE responses:

```ts
export const createOpenAIStreamResponse = (options: {
  content?: string;
  toolCalls?: Array<{ id: string; name: string; arguments: string }>;
  finishReason?: 'stop' | 'tool_calls';
}) => {
  const { content, toolCalls, finishReason = 'stop' } = options;

  return new Response(
    new ReadableStream({
      start(controller) {
        const encoder = new TextEncoder();

        if (content) {
          const chunk = {
            id: 'chatcmpl-mock',
            object: 'chat.completion.chunk',
            model: 'gpt-5',
            choices: [{ index: 0, delta: { content }, finish_reason: null }],
          };
          controller.enqueue(encoder.encode(`data: ${JSON.stringify(chunk)}\n\n`));
        }

        // ... tool_calls handling
        // ... finish chunk
        controller.enqueue(encoder.encode('data: [DONE]\n\n'));
        controller.close();
      },
    }),
    { headers: { 'content-type': 'text/event-stream' } },
  );
};
```
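As a sanity check on the factory, you can read the mocked `Response` back and reassemble the content deltas. This consumer is an illustrative sketch (`readSSEContent` is not a real runtime API); it mirrors how an SSE stream is split on `data:` lines and terminated by `[DONE]`:

```typescript
// Sketch (illustrative, not part of the runtime): read an SSE Response
// body and join the content deltas from each chunk.
const readSSEContent = async (res: Response): Promise<string> => {
  const text = await res.text();
  let content = '';
  for (const line of text.split('\n')) {
    // SSE payload lines look like `data: {...}`; skip everything else.
    if (!line.startsWith('data: ')) continue;
    const payload = line.slice('data: '.length);
    if (payload === '[DONE]') break;
    const chunk = JSON.parse(payload);
    content += chunk.choices?.[0]?.delta?.content ?? '';
  }
  return content;
};
```

This is handy for asserting that the helper emits well-formed SSE before pointing the real runtime at it.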
Wire the in-memory managers into the service:

```ts
import {
  InMemoryAgentStateManager,
  InMemoryStreamEventManager,
} from '@/server/modules/AgentRuntime';

const stateManager = new InMemoryAgentStateManager();
const streamEventManager = new InMemoryStreamEventManager();

const service = new AgentRuntimeService(serverDB, userId, {
  coordinatorOptions: { stateManager, streamEventManager },
  queueService: null,
  streamEventManager,
});
```
Stub `fetch` so each test supplies its own LLM response:

```ts
const fetchSpy = vi.spyOn(globalThis, 'fetch');

it('should handle text response', async () => {
  fetchSpy.mockResolvedValueOnce(createOpenAIStreamResponse({ content: 'Response text' }));
  // ... execute test
});

it('should handle tool calls', async () => {
  fetchSpy.mockResolvedValueOnce(
    createOpenAIStreamResponse({
      toolCalls: [
        {
          id: 'call_123',
          name: 'lobe-web-browsing____search____builtin',
          arguments: JSON.stringify({ query: 'weather' }),
        },
      ],
      finishReason: 'tool_calls',
    }),
  );
  // ... execute test
});
```
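The tool-call name in the mock follows a `____`-delimited pattern (`<identifier>____<api>____<type>`, inferred from the example above). A hypothetical helper to split it for assertions (`parseToolName` is not part of the runtime API) could look like:

```typescript
// Hypothetical helper (not a runtime API): split a builtin tool name such as
// 'lobe-web-browsing____search____builtin' on its '____' separator.
const parseToolName = (name: string) => {
  const [identifier, apiName, type] = name.split('____');
  return { identifier, apiName, type };
};
```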
Tips:

- Recreate (or reset) `InMemoryAgentStateManager` and `InMemoryStreamEventManager` after each test so state does not leak between tests.
- Run with `DEBUG=lobe-server:*` for detailed logs.