docs/docs/en/ai-employees/workflow/nodes/llm/structured-output.md
In some scenarios, users may want the model to respond with structured content in JSON format. This can be achieved by configuring "Structured Output".
How a model generates structured content depends on the model used and its Response format configuration:

- Models whose Response format only supports text
- Models whose Response format supports JSON mode (`json_object`)
- Models whose Response format supports JSON Schema (`json_schema`)
- Ollama local models
For Ollama local models, structured output is requested by passing a `format` parameter to the model when it is called.

The structured content of the model's response is saved as a JSON object in the node's Structured content field and can be used by subsequent nodes.
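As a rough illustration of how these capability levels differ on the wire, the sketch below builds the request fragment each mode would send, assuming an OpenAI-style `response_format` field and an Ollama-style `format` parameter. The schema name, field names, and model name are illustrative assumptions, not part of this product's configuration.

```python
import json

# Hypothetical JSON Schema for the structured reply we want.
# The schema name and fields are illustrative only.
WEATHER_SCHEMA = {
    "name": "weather_report",
    "schema": {
        "type": "object",
        "properties": {
            "city": {"type": "string"},
            "temperature_c": {"type": "number"},
        },
        "required": ["city", "temperature_c"],
        "additionalProperties": False,
    },
}

def response_format_for(mode: str) -> dict:
    """Build an OpenAI-style response_format block for each capability level."""
    if mode == "text":
        return {"type": "text"}          # plain text: no structure enforced
    if mode == "json_object":
        return {"type": "json_object"}   # JSON mode: valid JSON, shape not enforced
    if mode == "json_schema":
        # JSON Schema mode: the reply must conform to the given schema
        return {"type": "json_schema", "json_schema": WEATHER_SCHEMA}
    raise ValueError(f"unknown mode: {mode}")

# Ollama-style APIs accept the schema via the `format` parameter instead.
ollama_request = {
    "model": "llama3",                   # assumed local model name
    "prompt": "Report the weather in Paris as JSON.",
    "format": WEATHER_SCHEMA["schema"],
}

# The node parses the model's raw reply into the Structured content field.
raw_reply = '{"city": "Paris", "temperature_c": 18.5}'  # simulated model output
structured_content = json.loads(raw_reply)
```

Note the practical difference: `json_object` only guarantees syntactically valid JSON, while `json_schema` (and Ollama's `format`) additionally constrains the reply to a declared shape.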