docs/static/reference/php/Dagger/LLM.html
class LLM extends AbstractObject implements IdAble
Properties

$lastQuery (from AbstractObject)

Methods

__construct(AbstractClient $client, QueryBuilderChain $queryBuilderChain)
    No description. (from AbstractObject)
protected queryLeaf(QueryBuilder $leafQueryBuilder, string $leafKey): null|array|string|int|float|bool
    No description. (from AbstractObject)
attempt(int $number): LLM
    Create a branch in the LLM's history.
bindResult(string $name): Binding
    Return the type of the current state.
env(): Env
    Return the LLM's current environment.
hasPrompt(): bool
    Indicate whether there are any queued prompts or tool results to send to the model.
history(): array
    Return the LLM message history.
historyJSON(): Json
    Return the raw LLM message history as JSON.
id(): AbstractId
    A unique identifier for this LLM.
lastReply(): string
    Return the last LLM reply from the history.
loop(): LLM
    Submit the queued prompt, evaluate any tool calls, queue their results, and keep going until the model ends its turn.
model(): string
    Return the model used by the LLM.
provider(): string
    Return the provider used by the LLM.
step(): LLMId
    Submit the queued prompt or tool-call results, evaluate any tool calls, and queue their results.
sync(): LLMId
    Synchronize the LLM state.
tokenUsage(): LLMTokenUsage
    Return the token usage of the current state.
tools(): string
    Print documentation for the available tools.
withBlockedFunction(string $typeName, string $function): LLM
    Return a new LLM with the specified function no longer exposed as a tool.
withEnv(Env $env): LLM
    Allow the LLM to interact with an environment via MCP.
withMCPServer(string $name, Service $service): LLM
    Add an external MCP server to the LLM.
withModel(string $model): LLM
    Swap out the LLM model.
withPrompt(string $prompt): LLM
    Append a prompt to the LLM context.
withPromptFile(File $file): LLM
    Append the contents of a file to the LLM context.
withStaticTools(): LLM
    Use a static set of tools for method calls, e.g. for MCP clients that do not support dynamic tool registration.
withSystemPrompt(string $prompt): LLM
    Add a system prompt to the LLM's environment.
withoutDefaultSystemPrompt(): LLM
    Disable the default system prompt.
withoutMessageHistory(): LLM
    Clear the message history, leaving only the system prompts.
withoutSystemPrompts(): LLM
    Clear the system prompts, leaving only the default system prompt.
Method Details

__construct(AbstractClient $client, QueryBuilderChain $queryBuilderChain)
No description.
Parameters: AbstractClient $client; QueryBuilderChain $queryBuilderChain

protected queryLeaf(QueryBuilder $leafQueryBuilder, string $leafKey): null|array|string|int|float|bool
No description.
Parameters: QueryBuilder $leafQueryBuilder; string $leafKey
Returns: null|array|string|int|float|bool
attempt(int $number): LLM
Create a branch in the LLM's history.
Parameters: int $number
Returns: LLM

bindResult(string $name): Binding
Return the type of the current state.
Parameters: string $name
Returns: Binding

env(): Env
Return the LLM's current environment.
Returns: Env

hasPrompt(): bool
Indicate whether there are any queued prompts or tool results to send to the model.
Returns: bool

history(): array
Return the LLM message history.
Returns: array

historyJSON(): Json
Return the raw LLM message history as JSON.
Returns: Json

id(): AbstractId
A unique identifier for this LLM.
Returns: AbstractId

lastReply(): string
Return the last LLM reply from the history.
Returns: string

loop(): LLM
Submit the queued prompt, evaluate any tool calls, queue their results, and keep going until the model ends its turn.
Returns: LLM
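The prompt-then-loop workflow above can be sketched as follows. This is a minimal sketch, not a definitive implementation: it assumes the `Dagger\dag()` client helper from the Dagger PHP SDK, a running Dagger engine, and credentials configured for the chosen provider; the model name is an illustration.

```php
<?php
// Hypothetical sketch: queue a prompt, run the agent loop, read the reply.
// Requires a running Dagger engine and provider credentials.

use function Dagger\dag;

$llm = dag()
    ->llm()
    ->withModel('gpt-4o')  // illustrative model name; swap for your provider's
    ->withPrompt('Summarize what the Dagger LLM type is for, in one sentence.');

// Submit the queued prompt, evaluate any tool calls,
// and keep going until the model ends its turn.
$reply = $llm->loop()->lastReply();

echo $reply, PHP_EOL;
```

Each `with*` call returns a new immutable `LLM` value, so the chain builds a query lazily; work is only sent to the engine when a leaf field such as `lastReply()` is resolved.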
model(): string
Return the model used by the LLM.
Returns: string

provider(): string
Return the provider used by the LLM.
Returns: string

step(): LLMId
Submit the queued prompt or tool-call results, evaluate any tool calls, and queue their results.
Returns: LLMId

sync(): LLMId
Synchronize the LLM state.
Returns: LLMId

tokenUsage(): LLMTokenUsage
Return the token usage of the current state.
Returns: LLMTokenUsage

tools(): string
Print documentation for the available tools.
Returns: string

withBlockedFunction(string $typeName, string $function): LLM
Return a new LLM with the specified function no longer exposed as a tool.
Parameters: string $typeName; string $function
Returns: LLM

withEnv(Env $env): LLM
Allow the LLM to interact with an environment via MCP.
Parameters: Env $env
Returns: LLM
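An environment gives the LLM typed inputs to read and declared outputs to fill in. A minimal sketch, assuming the `Dagger\dag()` helper and the `withStringInput` / `withStringOutput` binding methods of Dagger's `Env` type (the binding names and values here are illustrations):

```php
<?php
// Hypothetical sketch: let the LLM interact with a typed Env via MCP.

use function Dagger\dag;

// Declare what the model may read ("topic") and must produce ("summary").
$env = dag()->env()
    ->withStringInput('topic', 'container builds', 'the subject to write about')
    ->withStringOutput('summary', 'a one-paragraph summary of the topic');

$done = dag()->llm()
    ->withEnv($env)
    ->withPrompt('Write the requested summary into the "summary" output.')
    ->loop();

// Read the declared output back from the LLM's resulting environment.
echo $done->env()->output('summary')->asString(), PHP_EOL;
```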
withMCPServer(string $name, Service $service): LLM
Add an external MCP server to the LLM.
Parameters: string $name; Service $service
Returns: LLM
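An external MCP server is attached as a Dagger `Service`. The sketch below is an assumption-heavy illustration: the container image name and port are placeholders, not a real published MCP server, and it presumes `Container::withExposedPort()` and `Container::asService()` from the Dagger PHP SDK.

```php
<?php
// Hypothetical sketch: run an MCP server as a service and attach it.

use function Dagger\dag;

// Placeholder image and port for an MCP server packaged as a container.
$mcp = dag()->container()
    ->from('example/my-mcp-server:latest')
    ->withExposedPort(8080)
    ->asService();

// The model can now call the tools that server exposes during loop().
$llm = dag()->llm()
    ->withMCPServer('my-tools', $mcp)
    ->withPrompt('Use the my-tools server to list its available operations.')
    ->loop();
```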
withModel(string $model): LLM
Swap out the LLM model.
Parameters: string $model
Returns: LLM

withPrompt(string $prompt): LLM
Append a prompt to the LLM context.
Parameters: string $prompt
Returns: LLM

withPromptFile(File $file): LLM
Append the contents of a file to the LLM context.
Parameters: File $file
Returns: LLM

withStaticTools(): LLM
Use a static set of tools for method calls, e.g. for MCP clients that do not support dynamic tool registration.
Returns: LLM

withSystemPrompt(string $prompt): LLM
Add a system prompt to the LLM's environment.
Parameters: string $prompt
Returns: LLM

withoutDefaultSystemPrompt(): LLM
Disable the default system prompt.
Returns: LLM

withoutMessageHistory(): LLM
Clear the message history, leaving only the system prompts.
Returns: LLM

withoutSystemPrompts(): LLM
Clear the system prompts, leaving only the default system prompt.
Returns: LLM
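The prompt- and history-management methods compose naturally because every call returns a new `LLM`. A minimal sketch combining them, assuming the `Dagger\dag()` helper (prompt texts are illustrations):

```php
<?php
// Hypothetical sketch: control system prompts and reuse an LLM across turns.

use function Dagger\dag;

$base = dag()->llm()
    ->withoutDefaultSystemPrompt()                      // drop the built-in system prompt
    ->withSystemPrompt('Answer tersely, in plain text.');

$first = $base->withPrompt('Name one Dagger core type.')->loop();

// Start a fresh turn: keep the system prompts, discard the chat so far.
$fresh = $first->withoutMessageHistory()
    ->withPrompt('Now name a different core type.')
    ->loop();

echo $fresh->lastReply(), PHP_EOL;
```

Because `$base`, `$first`, and `$fresh` are distinct immutable values, earlier states remain usable for branching, much like `attempt()`.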
Generated by Doctum, an API documentation generator and a fork of Sami.