# Bedrock Provider Example

`examples/bedrock-provider-example/README.md`
This example demonstrates how to use the Bedrock LLM with different model providers, including support for Nova models and inference profiles.
```bash
# Using default Titan model
go run main.go

# Using Nova model
go run main.go -model "amazon.nova-lite-v1:0"

# Using inference profile
go run main.go -model "us.amazon.nova-lite-v1:0"

# Using Anthropic model with explicit provider (for edge cases)
go run main.go -model "us.anthropic.claude-3-7-sonnet-20250219-v1:0" -provider "anthropic"

# Custom prompt
go run main.go -prompt "What is the capital of France?"

# Verbose output
go run main.go -verbose
```
Set these environment variables before running:

```bash
export AWS_ACCESS_KEY_ID=your_key_id
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_REGION=us-east-1
```
The Bedrock integration automatically detects the provider from the model ID:
- Nova: model IDs containing `.nova-` (e.g., `amazon.nova-lite-v1:0`, `us.amazon.nova-pro-v1:0`)
- `anthropic` (e.g., `anthropic.claude-3-sonnet-20240229-v1:0`)
- `amazon` (excluding Nova)
- `meta` (e.g., `meta.llama3-1-405b-instruct-v1:0`)
- `cohere` (e.g., `cohere.command-r-plus-v1:0`)
- `ai21` (e.g., `ai21.jamba-1-5-large-v1:0`)

For cases where automatic detection doesn't work correctly (e.g., custom inference endpoints), you can explicitly specify the provider:
```go
llm, err := bedrock.New(
    bedrock.WithModel("custom.endpoint.model-id"),
    bedrock.WithModelProvider("anthropic"), // Explicitly set provider
)
if err != nil {
    // handle error
}
```
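For reference, the prefix-based detection described above can be sketched as a standalone function. This is an illustrative approximation, not the library's actual implementation; the region prefixes handled (`us.`, `eu.`, `apac.`) and the `detectProvider` name are assumptions for this sketch.

```go
package main

import (
	"fmt"
	"strings"
)

// detectProvider guesses the provider from a Bedrock model ID.
// Illustrative sketch only; the real integration may differ.
func detectProvider(modelID string) string {
	// Inference profiles prefix the model ID with a region code.
	id := strings.TrimPrefix(modelID, "us.")
	id = strings.TrimPrefix(id, "eu.")
	id = strings.TrimPrefix(id, "apac.")

	switch {
	case strings.Contains(id, ".nova-"):
		return "nova" // Nova models are detected before the generic "amazon" prefix
	case strings.HasPrefix(id, "anthropic."):
		return "anthropic"
	case strings.HasPrefix(id, "amazon."):
		return "amazon"
	case strings.HasPrefix(id, "meta."):
		return "meta"
	case strings.HasPrefix(id, "cohere."):
		return "cohere"
	case strings.HasPrefix(id, "ai21."):
		return "ai21"
	default:
		return "unknown"
	}
}

func main() {
	fmt.Println(detectProvider("amazon.nova-lite-v1:0"))                        // nova
	fmt.Println(detectProvider("us.anthropic.claude-3-7-sonnet-20250219-v1:0")) // anthropic
	fmt.Println(detectProvider("meta.llama3-1-405b-instruct-v1:0"))             // meta
}
```

Note that the Nova check runs before the `amazon.` prefix check, which is why `amazon` in the list above excludes Nova models.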