OpenClaw includes a bundled Amazon Bedrock Mantle provider that connects to
the Mantle OpenAI-compatible endpoint. Mantle hosts open-source and
third-party models (GPT-OSS, Qwen, Kimi, GLM, and similar) through a standard
`/v1/chat/completions` surface backed by Bedrock infrastructure.
| Property | Value |
|---|---|
| Provider ID | `amazon-bedrock-mantle` |
| API | `openai-completions` (OpenAI-compatible) or `anthropic-messages` (Anthropic Messages route) |
| Auth | Explicit `AWS_BEARER_TOKEN_BEDROCK` or IAM credential-chain bearer-token generation |
| Default region | `us-east-1` (override with `AWS_REGION` or `AWS_DEFAULT_REGION`) |
Choose your preferred auth method and follow the setup steps.
<Tabs>
<Tab title="Explicit bearer token">
**Best for:** environments where you already have a Mantle bearer token.
<Steps>
<Step title="Set the bearer token on the gateway host">
```bash
export AWS_BEARER_TOKEN_BEDROCK="..."
```
Optionally set a region (defaults to `us-east-1`):
```bash
export AWS_REGION="us-west-2"
```
</Step>
<Step title="Verify models are discovered">
```bash
openclaw models list
```
Discovered models appear under the `amazon-bedrock-mantle` provider. No
additional config is required unless you want to override defaults.
</Step>
</Steps>
</Tab>
<Tab title="IAM credential chain">
**Best for:** hosts that already have AWS credentials configured (profiles, SSO, web identity, or instance/task roles).
<Steps>
<Step title="Configure AWS credentials on the gateway host">
Any AWS SDK-compatible auth source works:
```bash
export AWS_PROFILE="default"
export AWS_REGION="us-west-2"
```
</Step>
<Step title="Verify models are discovered">
```bash
openclaw models list
```
OpenClaw generates a Mantle bearer token from the credential chain automatically.
</Step>
</Steps>
</Tab>
</Tabs>
<Tip>
When `AWS_BEARER_TOKEN_BEDROCK` is not set, OpenClaw mints the bearer token for you from the AWS default credential chain, including shared credentials/config profiles, SSO, web identity, and instance or task roles.
</Tip>
When `AWS_BEARER_TOKEN_BEDROCK` is set, OpenClaw uses it directly. Otherwise,
OpenClaw attempts to generate a Mantle bearer token from the AWS default
credential chain. It then discovers available Mantle models by querying the
region's `/v1/models` endpoint.
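The token precedence above can be sketched as a small helper. This is illustrative only: the function name and the `mint_from_credential_chain` callable are hypothetical stand-ins for OpenClaw's internal IAM-based minting, not its actual API.

```python
import os

def resolve_bearer_token(mint_from_credential_chain):
    """Return the Mantle bearer token.

    Prefers an explicit AWS_BEARER_TOKEN_BEDROCK from the environment;
    otherwise falls back to minting a token from the AWS default
    credential chain via the supplied callable (hypothetical stand-in
    for OpenClaw's internal minting logic).
    """
    explicit = os.environ.get("AWS_BEARER_TOKEN_BEDROCK")
    if explicit:
        return explicit
    return mint_from_credential_chain()
```

Either way, the resulting token is sent as the bearer credential on discovery and completion requests.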
| Behavior | Detail |
|---|---|
| Discovery cache | Results cached for 1 hour |
| IAM token refresh | Hourly |
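The one-hour discovery cache behaves like a simple TTL cache. A minimal sketch, assuming a single cached value and an injectable clock (the real cache is internal to OpenClaw and may differ):

```python
import time

DISCOVERY_TTL_SECONDS = 3600  # discovery results are cached for 1 hour

class TTLCache:
    """Cache a single fetched value, re-fetching once the TTL expires."""

    def __init__(self, fetch, ttl=DISCOVERY_TTL_SECONDS, clock=time.monotonic):
        self._fetch = fetch
        self._ttl = ttl
        self._clock = clock
        self._value = None
        self._fetched_at = None

    def get(self):
        now = self._clock()
        if self._fetched_at is None or now - self._fetched_at >= self._ttl:
            # Cache miss or expired entry: re-run discovery.
            self._value = self._fetch()
            self._fetched_at = now
        return self._value
```

Within the hour, repeated `get()` calls return the cached model list without hitting `/v1/models` again.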
Supported regions: `us-east-1`, `us-east-2`, `us-west-2`, `ap-northeast-1`,
`ap-south-1`, `ap-southeast-3`, `eu-central-1`, `eu-west-1`, `eu-west-2`,
`eu-south-1`, `eu-north-1`, `sa-east-1`.
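Combining the region list with the endpoint pattern from the example config on this page, the per-region discovery URL can be derived as follows (a sketch; the function name is hypothetical):

```python
# Regions where Mantle is available, per the list above.
MANTLE_REGIONS = {
    "us-east-1", "us-east-2", "us-west-2", "ap-northeast-1",
    "ap-south-1", "ap-southeast-3", "eu-central-1", "eu-west-1",
    "eu-west-2", "eu-south-1", "eu-north-1", "sa-east-1",
}

def mantle_models_url(region="us-east-1"):
    """Build the /v1/models discovery URL for a supported Mantle region."""
    if region not in MANTLE_REGIONS:
        raise ValueError(f"Mantle is not available in region {region!r}")
    return f"https://bedrock-mantle.{region}.api.aws/v1/models"
```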
If you prefer explicit config instead of auto-discovery:
```json5
{
  models: {
    providers: {
      "amazon-bedrock-mantle": {
        baseUrl: "https://bedrock-mantle.us-east-1.api.aws/v1",
        api: "openai-completions",
        auth: "api-key",
        apiKey: "env:AWS_BEARER_TOKEN_BEDROCK",
        models: [
          {
            id: "gpt-oss-120b",
            name: "GPT-OSS 120B",
            reasoning: true,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 32000,
            maxTokens: 4096,
          },
        ],
      },
    },
  },
}
```
When you pin an Anthropic Messages model on the Mantle provider, OpenClaw uses the `anthropic-messages` API surface instead of `openai-completions` for that model. Auth still comes from `AWS_BEARER_TOKEN_BEDROCK` (or the minted IAM bearer token).
```json5
{
models: {
providers: {
"amazon-bedrock-mantle": {
models: [
{
id: "claude-opus-4.7",
name: "Claude Opus 4.7",
api: "anthropic-messages",
reasoning: true,
input: ["text", "image"],
contextWindow: 1000000,
maxTokens: 32000,
},
],
},
},
},
}
```
Both API surfaces (`openai-completions` and `anthropic-messages`) share the
same `AWS_BEARER_TOKEN_BEDROCK` credential when present.