Prompt Storage allows enterprises to automatically back up Cline conversation history to cloud storage (AWS S3 or Cloudflare R2). This provides a centralized repository for compliance, audit trails, and usage analysis while maintaining local storage as the primary source of truth.
Every Cline task conversation is stored locally in `~/.cline/data/tasks/<taskId>/api_conversation_history.json`. When prompt storage is enabled, a background sync worker automatically uploads these conversation files to your configured S3 or R2 bucket.
```mermaid
graph LR
    A[User] --> B[Cline Extension]
    B --> C["Local Storage<br/>~/.cline/data/tasks/"]
    C --> D[Background Sync Worker]
    D --> E[S3/R2 Bucket]
    E --> F[Compliance/Analytics]
```
Prompt storage uploads the following files from each task:
| File | Content | Purpose |
|---|---|---|
| `api_conversation_history.json` | Full conversation in Anthropic MessageParam format | Core conversation data for analysis |
| Task metadata | Task ID, timestamps, model info | Correlation and indexing |
Prompt storage does not include:
Files are uploaded to your bucket following this structure:
```
s3://your-bucket/tasks/{taskId}/api_conversation_history.json
```
This mirrors the local storage structure, making it easy to correlate local and cloud data.
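Because the remote key mirrors the local layout, the mapping between a task ID and its two locations is mechanical. A short sketch for downstream tooling (the helper names here are illustrative, not part of Cline):

```python
from pathlib import Path

def remote_key(task_id: str) -> str:
    """Object key in the bucket, following the documented layout."""
    return f"tasks/{task_id}/api_conversation_history.json"

def local_file(task_id: str) -> Path:
    """The corresponding local file under ~/.cline/data/tasks/."""
    return Path.home() / ".cline" / "data" / "tasks" / task_id / "api_conversation_history.json"
```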
Prompt storage is configured through Remote Configuration in the `enterpriseTelemetry.promptUploading` section.
```json
{
  "enterpriseTelemetry": {
    "promptUploading": {
      "enabled": true,
      "type": "s3_access_keys",
      "s3AccessSettings": {
        "bucket": "your-cline-prompts",
        "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
        "secretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
        "region": "us-east-1",
        "intervalMs": 30000,
        "maxRetries": 5,
        "batchSize": 10,
        "maxQueueSize": 1000,
        "maxFailedAgeMs": 604800000,
        "backfillEnabled": false
      }
    }
  }
}
```
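An R2 configuration is analogous, swapping `region` for `endpoint` and `accountId`. The sketch below is an assumption based on the field tables that follow; the exact settings key (shown here as `r2AccessSettings`) may differ in your deployment:

```json
{
  "enterpriseTelemetry": {
    "promptUploading": {
      "enabled": true,
      "type": "r2_access_keys",
      "r2AccessSettings": {
        "bucket": "cline-prompts",
        "accessKeyId": "R2_ACCESS_KEY_ID",
        "secretAccessKey": "R2_SECRET_ACCESS_KEY",
        "accountId": "YOUR_CLOUDFLARE_ACCOUNT_ID",
        "endpoint": "https://YOUR_CLOUDFLARE_ACCOUNT_ID.r2.cloudflarestorage.com"
      }
    }
  }
}
```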
| Field | Type | Required | Description |
|---|---|---|---|
| `enabled` | boolean | Yes | Enable/disable prompt storage |
| `type` | string | Yes | Storage type: `"s3_access_keys"` or `"r2_access_keys"` |
| Field | Type | Required | Description | Default |
|---|---|---|---|---|
| `bucket` | string | Yes | S3/R2 bucket name | - |
| `accessKeyId` | string | Yes | AWS/Cloudflare access key ID | - |
| `secretAccessKey` | string | Yes | AWS/Cloudflare secret access key | - |
| `region` | string | S3 only | AWS region (e.g., `us-east-1`) | - |
| `endpoint` | string | R2 only | Cloudflare R2 endpoint URL | - |
| `accountId` | string | R2 only | Cloudflare account ID | - |
| Field | Type | Description | Default |
|---|---|---|---|
| `intervalMs` | number | Milliseconds between sync attempts | 30000 (30s) |
| `maxRetries` | number | Maximum retries before giving up | 5 |
| `batchSize` | number | Items to process per interval | 10 |
| `maxQueueSize` | number | Maximum queue size before eviction | 1000 |
| `maxFailedAgeMs` | number | Time before discarding failed items | 604800000 (7 days) |
| `backfillEnabled` | boolean | Sync existing tasks on startup | false |
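Together, `batchSize` and `intervalMs` cap sustained upload throughput. A back-of-envelope helper (illustrative, not Cline code) shows how long a backlog takes to drain under the defaults:

```python
def drain_minutes(queue_len: int, batch_size: int = 10, interval_ms: int = 30_000) -> float:
    """Estimate minutes to drain a backlog, assuming every batch upload succeeds."""
    intervals = -(-queue_len // batch_size)  # ceil(queue_len / batch_size)
    return intervals * interval_ms / 60_000

# With the defaults, a full queue of 1000 items drains in about 50 minutes.
```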
<Steps>
<Step title="Create S3 Bucket">
Create a dedicated S3 bucket for Cline conversation storage:
```bash
aws s3 mb s3://your-cline-prompts --region us-east-1
```
Enable versioning and encryption:
```bash
aws s3api put-bucket-versioning \
--bucket your-cline-prompts \
--versioning-configuration Status=Enabled
aws s3api put-bucket-encryption \
--bucket your-cline-prompts \
--server-side-encryption-configuration '{
"Rules": [{
"ApplyServerSideEncryptionByDefault": {
"SSEAlgorithm": "AES256"
}
}]
}'
```
</Step>
<Step title="Create IAM Policy">
Create an IAM policy with minimal required permissions:
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:PutObjectAcl",
"s3:GetObject",
"s3:DeleteObject"
],
"Resource": "arn:aws:s3:::your-cline-prompts/*"
},
{
"Effect": "Allow",
"Action": [
"s3:ListBucket"
],
"Resource": "arn:aws:s3:::your-cline-prompts"
}
]
}
```
Save this as `cline-prompt-storage-policy.json` and create the policy:
```bash
aws iam create-policy \
--policy-name ClinePromptStorage \
--policy-document file://cline-prompt-storage-policy.json
```
</Step>
<Step title="Create IAM User">
Create a dedicated IAM user and attach the policy:
```bash
aws iam create-user --user-name cline-prompt-uploader
aws iam attach-user-policy \
--user-name cline-prompt-uploader \
--policy-arn arn:aws:iam::YOUR_ACCOUNT_ID:policy/ClinePromptStorage
aws iam create-access-key --user-name cline-prompt-uploader
```
Save the `AccessKeyId` and `SecretAccessKey` from the output.
</Step>
<Step title="Configure in Cline Dashboard">
In the Cline admin console at [app.cline.bot](https://app.cline.bot):
1. Navigate to **Settings** → **Enterprise Telemetry**
2. Enable **Prompt Uploading**
3. Select **S3** as the storage type
4. Enter your bucket name, access key ID, secret key, and region
5. Configure sync worker settings (or use defaults)
6. Save configuration
</Step>
<Step title="Test Connection">
Use the "Test Connection" button in the admin console to verify:
- Bucket access
- Write permissions
- Credential validity
A test file will be uploaded and deleted from your bucket.
</Step>
</Steps>
### Optional: Lifecycle Policies
Configure retention policies for cost management:
```json
{
"Rules": [
{
"Id": "ArchiveOldPrompts",
"Status": "Enabled",
"Transitions": [
{
"Days": 90,
"StorageClass": "GLACIER"
}
]
},
{
"Id": "DeleteOldPrompts",
"Status": "Enabled",
"Expiration": {
"Days": 2555
}
}
]
}
```
<Steps>
<Step title="Create R2 Bucket">
1. Log in to the [Cloudflare Dashboard](https://dash.cloudflare.com)
2. Navigate to **R2** in the sidebar
3. Click **Create bucket**
4. Name your bucket (e.g., `cline-prompts`)
5. Select a location close to your users
6. Click **Create bucket**
</Step>
<Step title="Generate API Token">
1. In the R2 dashboard, click **Manage R2 API Tokens**
2. Click **Create API token**
3. Configure permissions:
- **Token name**: Cline Prompt Storage
- **Permissions**: Object Read & Write
- **Bucket**: Select your bucket or use All buckets
4. Click **Create API Token**
5. Save the **Access Key ID** and **Secret Access Key**
6. Note your **Account ID** (shown in the R2 overview)
</Step>
<Step title="Get R2 Endpoint">
Your R2 endpoint follows this format:
```
https://<ACCOUNT_ID>.r2.cloudflarestorage.com
```
Find your account ID in the Cloudflare dashboard under R2 overview.
</Step>
<Step title="Configure in Cline Dashboard">
In the Cline admin console at [app.cline.bot](https://app.cline.bot):
1. Navigate to **Settings** → **Enterprise Telemetry**
2. Enable **Prompt Uploading**
3. Select **R2** as the storage type
4. Enter:
- Bucket name
- Access key ID
- Secret access key
- Account ID
- Endpoint URL
5. Configure sync worker settings (or use defaults)
6. Save configuration
</Step>
<Step title="Test Connection">
Use the "Test Connection" button to verify:
- Bucket access with provided credentials
- Write permissions
- Endpoint connectivity
</Step>
</Steps>
### Cost Advantages
R2 offers significant cost advantages over S3:
- **No egress fees**: Download data at no cost
- **Lower storage costs**: ~$0.015/GB vs S3's ~$0.023/GB
- **Global edge access**: Fast access from anywhere
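Using the approximate per-GB prices above (assumptions that change over time, and ignoring request and retrieval fees), the difference is easy to estimate:

```python
def monthly_cost(gb: float, price_per_gb: float) -> float:
    """Approximate monthly storage cost for a bucket of the given size."""
    return round(gb * price_per_gb, 2)

# 500 GB of conversation history:
# R2: monthly_cost(500, 0.015) -> 7.5
# S3: monthly_cost(500, 0.023) -> 11.5
```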
The background sync worker manages the upload queue with these characteristics:
- Processes up to `batchSize` items per sync interval
- Evicts the oldest items when `maxQueueSize` is exceeded
- Retries failed uploads up to `maxRetries` times

When an upload fails:

- After `maxRetries` attempts, the item is marked as permanently failed
- Failed items older than `maxFailedAgeMs` are discarded
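The documented failure policy can be sketched as a single predicate (an illustration of the behavior described above, not Cline's actual implementation):

```python
def should_discard(retries: int, age_ms: int,
                   max_retries: int = 5, max_failed_age_ms: int = 604_800_000) -> bool:
    """Drop a queued item once it has exhausted its retries or has been
    failing for longer than maxFailedAgeMs (7 days by default)."""
    return retries >= max_retries or age_ms > max_failed_age_ms
```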
When `backfillEnabled` is set to `true`, existing tasks in `~/.cline/data/tasks/` are queued for upload on startup.

While prompt storage operates independently, it integrates with Cline's observability system:

- `task.created` and `task.completed` track when conversations are generated
- `task.conversation_turn` and `task.tokens` provide usage metrics

See OpenTelemetry for configuring metrics export.
Monitor S3 upload activity with CloudWatch:
```bash
# Track daily object-count growth (NumberOfObjects is a daily storage metric,
# reported with the StorageType dimension and the Average statistic)
aws cloudwatch get-metric-statistics \
  --namespace AWS/S3 \
  --metric-name NumberOfObjects \
  --dimensions Name=BucketName,Value=your-cline-prompts Name=StorageType,Value=AllStorageTypes \
  --start-time 2026-03-01T00:00:00Z \
  --end-time 2026-03-08T00:00:00Z \
  --period 86400 \
  --statistics Average
```
Cloudflare R2 provides built-in analytics in the dashboard.
**At Rest**: Use server-side encryption (AES256, as enabled during bucket setup, or SSE-KMS).

**In Transit**: All uploads use HTTPS. Enforce TLS-only access with a bucket policy that denies insecure transport:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::your-cline-prompts/*",
        "arn:aws:s3:::your-cline-prompts"
      ],
      "Condition": {
        "Bool": {
          "aws:SecureTransport": "false"
        }
      }
    }
  ]
}
```
S3 Server Access Logging:
```bash
aws s3api put-bucket-logging \
  --bucket your-cline-prompts \
  --bucket-logging-status '{
    "LoggingEnabled": {
      "TargetBucket": "your-log-bucket",
      "TargetPrefix": "cline-prompts-access/"
    }
  }'
```
CloudTrail for API Calls:
Enable CloudTrail to track all S3 API operations on your bucket.
Implement retention policies based on your compliance requirements (see the lifecycle policy example above).
#### Sync Queue Backing Up

**Causes**:
- Upload rate slower than conversation creation rate
- Network connectivity issues
- Insufficient batch size or interval
**Solutions**:
1. Increase `batchSize` to process more items per interval
2. Decrease `intervalMs` to sync more frequently
3. Check network connectivity and credentials
4. Temporarily increase `maxQueueSize` while investigating
#### S3 Upload Failures

**Causes**:
- Invalid or expired credentials
- Insufficient IAM permissions
- Bucket policy denying access
**Solutions**:
1. Verify credentials are correct in remote config
2. Check IAM policy includes `s3:PutObject` permission
3. Review bucket policies for deny rules
4. Test with AWS CLI: `aws s3 cp test.txt s3://your-bucket/`
#### R2 Connection Errors

**Causes**:
- Incorrect endpoint URL
- Firewall blocking Cloudflare IPs
- Invalid account ID
**Solutions**:
1. Verify endpoint format: `https://<ACCOUNT_ID>.r2.cloudflarestorage.com`
2. Check firewall rules allow HTTPS to Cloudflare IPs
3. Confirm account ID in Cloudflare dashboard
4. Test with curl: `curl -I https://<ACCOUNT_ID>.r2.cloudflarestorage.com`
#### Backfill Overloading the Queue

**Causes**:
- Large number of existing tasks
- Backfill queuing faster than upload processing
**Solutions**:
1. Disable backfill temporarily: `"backfillEnabled": false`
2. Let steady-state queue drain first
3. Increase `batchSize` and decrease `intervalMs`
4. Consider `maxQueueSize` increase during backfill period
5. Re-enable backfill once queue is stable
Enable debug logging to diagnose sync issues:
Look for `[ClineBlobStorage]` and `[SyncWorker]` log entries.

Use the built-in test connection feature:
```typescript
// Programmatic test (for custom integrations)
import { testPromptUploading } from '@/core/controller/state/testPromptUploading'

await testPromptUploading(controller)
// Returns: { success: boolean, message: string }
```
Uploaded `api_conversation_history.json` files contain an array of messages:

```json
[
  {
    "role": "user",
    "content": [
      {
        "type": "text",
        "text": "Create a React component for a todo list"
      }
    ]
  },
  {
    "role": "assistant",
    "content": [
      {
        "type": "text",
        "text": "I'll create a todo list component..."
      },
      {
        "type": "tool_use",
        "id": "toolu_123",
        "name": "write_to_file",
        "input": {
          "path": "TodoList.tsx",
          "content": "..."
        }
      }
    ]
  }
]
```
This follows the Anthropic Messages API format.
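Because the files follow a well-known shape, they are straightforward to post-process for analytics. A minimal sketch that tallies message roles and tool invocations in one conversation:

```python
import json

def summarize(messages: list[dict]) -> dict:
    """Count messages per role and tool_use content blocks in a conversation."""
    counts = {"user": 0, "assistant": 0, "tool_uses": 0}
    for msg in messages:
        counts[msg["role"]] += 1
        for block in msg.get("content", []):
            if isinstance(block, dict) and block.get("type") == "tool_use":
                counts["tool_uses"] += 1
    return counts

# e.g. summarize(json.load(open("api_conversation_history.json")))
```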
Task metadata includes:
```json
{
  "taskId": "1234567890",
  "createdAt": "2026-03-05T10:30:00Z",
  "lastModified": "2026-03-05T11:45:00Z",
  "modelInfo": {
    "id": "claude-sonnet-4",
    "provider": "anthropic"
  },
  "tokensUsed": {
    "input": 1250,
    "output": 3400
  }
}
```
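Once metadata files are downloaded from the bucket, usage rolls up easily. A sketch assuming the `tokensUsed` shape shown above:

```python
def total_tokens(records: list[dict]) -> dict:
    """Sum input/output tokens across downloaded task metadata records."""
    totals = {"input": 0, "output": 0}
    for rec in records:
        usage = rec.get("tokensUsed", {})
        totals["input"] += usage.get("input", 0)
        totals["output"] += usage.get("output", 0)
    return totals
```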