docs/gateway/create-a-prompt-template.mdx
Prompt templates and schemas simplify engineering iteration, experimentation, and optimization, especially as application complexity and team size grow.
<Tip>
You can also find a complete runnable example for this guide on GitHub.
</Tip>

Create a file with your MiniJinja template:

```minijinja
Share a fun fact about: {{ topic }}
```
<Note>
TensorZero uses the MiniJinja templating language. MiniJinja is mostly compatible with Jinja2, which is used by many popular projects like Flask and Django.
</Note>

<Tip>
MiniJinja provides a browser playground where you can test your templates.
</Tip>
</Step>

<Step title="Configure a template">
Next, you must declare the template in the variant configuration.

You can do this by adding the field `templates.your_template_name.path` to your variant with a path to your template file.

For example, let's configure a template called `fun_fact_topic` for our variant:
```toml
[functions.fun_fact]
type = "chat"

[functions.fun_fact.variants.gpt_5_mini]
type = "chat_completion"
model = "openai::gpt-5-mini"
templates.fun_fact_topic.path = "functions/fun_fact/gpt_5_mini/fun_fact_topic_template.minijinja" # relative to this file
```
<Tip>
You can configure multiple templates for a variant.
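For example, a variant could declare a second template alongside the first (an illustrative sketch; the `fun_fact_style` template name and its path are hypothetical):

```toml
[functions.fun_fact.variants.gpt_5_mini]
type = "chat_completion"
model = "openai::gpt-5-mini"
templates.fun_fact_topic.path = "functions/fun_fact/gpt_5_mini/fun_fact_topic_template.minijinja"
templates.fun_fact_style.path = "functions/fun_fact/gpt_5_mini/fun_fact_style_template.minijinja"
```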
</Tip>
</Step>

<Step title="Use your template during inference">
Use your template during inference by sending a `tensorzero::template` content block with the template name and arguments.
```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3000/openai/v1", api_key="not-used")

result = client.chat.completions.create(
    model="tensorzero::function_name::fun_fact",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "tensorzero::template",  # type: ignore
                    "name": "fun_fact_topic",
                    "arguments": {"topic": "artificial intelligence"},
                }
            ],
        },
    ],
)
```
When you have multiple variants for a function, it becomes challenging to ensure all templates use consistent variable names and types. Schemas solve this by defining a contract that validates template variables and catches configuration errors before they reach production. Defining a schema is optional but recommended.
<Steps>
<Step title="Create a schema">
Create a JSON Schema for the variables used by your templates.

Let's define a schema for our previous example, which includes only a single variable, `topic`:
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "topic": {
      "type": "string"
    }
  },
  "required": ["topic"],
  "additionalProperties": false
}
```
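Conceptually, this schema enforces a check like the following simplified Python sketch (an illustration of what validation catches, not TensorZero's actual implementation):

```python
def validate_arguments(arguments: dict) -> list[str]:
    """Sketch of the checks the schema above implies: `topic` is required,
    must be a string, and no other properties are allowed."""
    errors = []
    if "topic" not in arguments:
        errors.append("missing required property: topic")
    elif not isinstance(arguments["topic"], str):
        errors.append("topic must be a string")
    for key in arguments:
        if key != "topic":
            errors.append(f"additional property not allowed: {key}")
    return errors

print(validate_arguments({"topic": "artificial intelligence"}))  # → []
print(validate_arguments({"subject": "AI"}))  # two errors: missing `topic`, extra `subject`
```

With a schema in place, the gateway can reject malformed template arguments at inference time instead of silently rendering a broken prompt.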
<Tip>
LLMs are great at generating JSON Schemas. For example, the schema above was generated with the following request:

```text
Generate a JSON schema with a single field: `topic`.
The `topic` field is required. No additional fields are allowed.
```
You can also export JSON Schemas from Pydantic models and Zod schemas.
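For example, a Pydantic v2 model can export an equivalent schema (a sketch; the model name is hypothetical, and `extra="forbid"` is assumed so the output includes `additionalProperties: false`):

```python
import json

from pydantic import BaseModel, ConfigDict


class FunFactArguments(BaseModel):
    """Mirrors the schema above: one required string field, no extras."""

    model_config = ConfigDict(extra="forbid")

    topic: str


# Produces a JSON Schema with `topic` required and additional properties forbidden.
schema = FunFactArguments.model_json_schema()
print(json.dumps(schema, indent=2))
```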
</Tip>
</Step>

<Step title="Configure a schema">
Then, declare your schema in your function definition using `schemas.your_schema_name.path`.

This will ensure that every variant for the function has a template named `your_schema_name`.
In our example above, this would mean updating the function definition to:
```toml
[functions.fun_fact]
type = "chat"
schemas.fun_fact_topic.path = "functions/fun_fact/fun_fact_topic_schema.json" # relative to this file // [!code ++]

[functions.fun_fact.variants.gpt_5_mini]
type = "chat_completion"
model = "openai::gpt-5-mini"
templates.fun_fact_topic.path = "functions/fun_fact/gpt_5_mini/fun_fact_topic_template.minijinja" # relative to this file
```
You can enable template file system access to reuse shared snippets in your prompts.
To use the MiniJinja directives `{% include %}` and `{% import %}`, set `gateway.template_filesystem_access.base_path` in your configuration.
See Organize your configuration for details.
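For instance (an illustrative sketch; the `base_path` value and the `shared/disclaimer.minijinja` snippet are hypothetical):

```toml
[gateway.template_filesystem_access]
base_path = "config"
```

A template can then pull in a shared snippet:

```minijinja
{% include "shared/disclaimer.minijinja" %}
Share a fun fact about: {{ topic }}
```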
In earlier versions of TensorZero, prompt templates were defined as `system_template`, `user_template`, and `assistant_template`.
Similarly, template schemas were defined as `system_schema`, `user_schema`, and `assistant_schema`.
This legacy approach limited the flexibility of prompt templates by restricting the ability to define multiple templates per role.
As you create new functions and templates, you should use the new `templates.your_template_name.path` format.
Historical observability data stored in your database still uses the legacy format. If you want to keep this data forward-compatible (e.g. for fine-tuning), you can update your configuration as follows:
| Legacy Configuration | Updated Configuration |
| --- | --- |
| `system_template` | `templates.system.path` |
| `system_schema` | `schemas.system.path` |
| `user_template` | `templates.user.path` |
| `user_schema` | `schemas.user.path` |
| `assistant_template` | `templates.assistant.path` |
| `assistant_schema` | `schemas.assistant.path` |
As we deprecate the legacy format, TensorZero will automatically look for templates and schemas in the new format for your historical data.
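For instance, migrating a legacy `system_template` might look like this (an illustrative sketch; the file path is hypothetical):

```toml
[functions.fun_fact.variants.gpt_5_mini]
type = "chat_completion"
model = "openai::gpt-5-mini"
# Legacy format:
# system_template = "functions/fun_fact/system_template.minijinja"
# Updated format:
templates.system.path = "functions/fun_fact/system_template.minijinja"
```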