deploy/kv/backup.md
KV databases hosted on Deno Deploy can be continuously backed up to your own S3-compatible storage buckets. This is in addition to the replication and backups that we internally perform for all data stored in hosted Deno KV databases to ensure high availability and data durability.
This backup happens continuously with very little lag, which enables interesting use cases such as point-in-time recovery and live replication.
First, create a bucket on AWS using the AWS CLI:

```sh
aws s3api create-bucket --bucket <bucket-name> --region <region> --create-bucket-configuration LocationConstraint=<region>
```

Replace `<bucket-name>` and `<region>` with your own values.

Then, create an IAM policy with `s3:PutObject` access to the bucket, attach it to an IAM user, and create access keys for that user:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "KVBackup",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::<bucket-name>/*"
    }
  ]
}
```
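If you provision buckets from a script, the policy document can be generated instead of hand-edited, which avoids typos in the ARN. A small sketch; the function and bucket names are illustrative:

```python
import json

def kv_backup_policy(bucket_name: str) -> str:
    """Build the minimal IAM policy document granting KV backup
    write access (s3:PutObject) to a single bucket."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "KVBackup",
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(kv_backup_policy("my-kv-backup-bucket"))
```

The returned string can be written to a `policy.json` file or passed inline to `aws iam create-policy` as shown above.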
In this policy, replace `<bucket-name>` with the name of the bucket you created earlier. Then create the policy, again replacing `<bucket-name>`:

```sh
aws iam create-policy --policy-name <policy-name> --policy-document '{"Version":"2012-10-17","Statement":[{"Sid":"KVBackup","Effect":"Allow","Action":"s3:PutObject","Resource":"arn:aws:s3:::<bucket-name>/*"}]}'
```
Next, replace `<user-name>` with a name for the user you are creating, then run:

```sh
aws iam create-user --user-name <user-name>
```
Replace `<policy-arn>` with the ARN of the policy you created earlier, and `<user-name>` with the name of the user you created in the previous step, then run:

```sh
aws iam attach-user-policy --policy-arn <policy-arn> --user-name <user-name>
```
Finally, replace `<user-name>` with the name of the user you created earlier, then run:

```sh
aws iam create-access-key --user-name <user-name>
```

Note the access key ID and secret access key in the output; you will need them in the next step.
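If you are scripting this setup, the two values can be pulled straight out of the command's JSON output. A minimal sketch in Python; the captured output below is a placeholder using AWS's documented example credentials, but the `AccessKey` object shape matches what the AWS CLI prints:

```python
import json

# Placeholder for the JSON printed by `aws iam create-access-key`
# (real values redacted; the key names are the AWS CLI's).
raw = """
{
  "AccessKey": {
    "UserName": "kv-backup-user",
    "AccessKeyId": "AKIAIOSFODNN7EXAMPLE",
    "SecretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
    "Status": "Active"
  }
}
"""

# Extract the two values needed by the Deploy dashboard.
key = json.loads(raw)["AccessKey"]
access_key_id = key["AccessKeyId"]
secret_access_key = key["SecretAccessKey"]
print(access_key_id)
```

In a real script you would feed the command's stdout into `json.loads` instead of a literal string, and store the secret somewhere safe rather than printing it.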
Now visit the Deno Deploy dashboard, and click on the "KV" tab in your project. Scroll to the "Backup" section, and click on "AWS S3". Enter the bucket name, access key ID, and secret access key you created earlier, and the region the bucket is in. Then click "Save".
The backup will start immediately. Once the data has been backed up, and continuous backup is active, you will see the status change to "Active".
Google Cloud Storage (GCS) is compatible with the S3 protocol, and can also be used as a backup target.
First, create a bucket on GCP using the gcloud CLI:

```sh
gcloud storage buckets create <bucket-name> --location <location>
```

Replace `<bucket-name>` and `<location>` with your own values.

Then, create a service account with Storage Object Admin access to the bucket, and create an HMAC access key for the service account:
Replace `<service-account-name>` with a name for the service account you are creating, then run:

```sh
gcloud iam service-accounts create <service-account-name>
```
Replace `<bucket-name>` with the name of the bucket you created earlier, and `<service-account-email>` with the email of the service account you created in the previous step, then run:

```sh
gsutil iam ch serviceAccount:<service-account-email>:objectAdmin gs://<bucket-name>
```
Replace `<service-account-email>` with the email of the service account you created in the previous step, then run:

```sh
gcloud storage hmac create <service-account-email>
```
Note the `accessId` and `secret` in the output and save them somewhere safe. You will need them later, and you will not be able to retrieve them again.

Now visit the Deno Deploy dashboard, and click on the "KV" tab in your project. Scroll to the "Backup" section, and click on "Google Cloud Storage". Enter the bucket name, access key ID (the HMAC `accessId`), and secret access key (the HMAC `secret`) you created earlier, and the region the bucket is in. Then click "Save".
The backup will start immediately. Once the data has been backed up, and continuous backup is active, you will see the status change to "Active".
S3 backups can also be consumed with the `denokv` tool. Please refer to its documentation for more details.