cookbook/06_storage/gcs/README.md
Examples demonstrating Google Cloud Storage (GCS) integration with Agno agents using JSON blob storage.
```shell
uv pip install google-cloud-storage
```
```python
from agno.agent import Agent
from agno.storage.gcs_json import GCSJsonDb

db = GCSJsonDb(
    bucket_name="your-bucket-name",
)

agent = Agent(
    db=db,
    add_history_to_context=True,
)
```
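`GCSJsonDb`'s exact blob layout isn't documented here; as a rough sketch of the JSON-blob pattern it follows, here is a minimal in-memory stand-in (the class name, key scheme, and fields are illustrative, not Agno's actual schema):

```python
import json


class InMemoryJsonDb:
    """Toy stand-in for a JSON-blob store: one JSON document per key."""

    def __init__(self):
        self._blobs = {}  # blob name -> JSON text

    def upsert(self, key: str, record: dict) -> None:
        # Each record is serialized to a standalone JSON blob.
        self._blobs[f"{key}.json"] = json.dumps(record)

    def read(self, key: str):
        raw = self._blobs.get(f"{key}.json")
        return json.loads(raw) if raw is not None else None


db = InMemoryJsonDb()
db.upsert("session-1", {"messages": ["hello"]})
print(db.read("session-1"))  # {'messages': ['hello']}
```

With real GCS, the dict is replaced by bucket objects, but the read/serialize/write cycle is the same idea.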
Set up authentication using one of these methods:
```shell
# Using gcloud CLI
gcloud auth application-default login

# Using environment variable
export GOOGLE_APPLICATION_CREDENTIALS="path/to/service-account.json"
```
Ensure your account has Storage Admin permissions:
```shell
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:[email protected]" \
  --role="roles/storage.admin"
```
Install the required Python packages:
```shell
uv pip install google-auth google-cloud-storage openai ddgs
```
In the example script, a global variable `DEBUG_MODE` controls whether the bucket contents are printed at the end of execution. Set `DEBUG_MODE = True` in the script to inspect the bucket contents.
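The script's actual dump logic isn't reproduced here; a simplified sketch of the idea, using a plain dict in place of the bucket (the `dump_bucket` helper and blob names are illustrative only):

```python
import json

DEBUG_MODE = True  # mirrors the script's global flag

# Stand-in for bucket contents: blob name -> JSON text (illustrative)
fake_bucket = {
    "sessions/abc123.json": json.dumps({"session_id": "abc123", "runs": 2}),
}


def dump_bucket(blobs: dict) -> None:
    """Pretty-print every JSON blob, as a DEBUG_MODE dump might."""
    for name, payload in blobs.items():
        print(name)
        print(json.dumps(json.loads(payload), indent=2))


if DEBUG_MODE:
    dump_bucket(fake_bucket)
```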
Initialize gcloud, authenticate, and run the example:

```shell
gcloud init
gcloud auth application-default login
python gcs_json_storage_for_agent.py
```
If you want to test the storage functionality locally without using real GCS, you can use fake-gcs-server:
Make sure Docker is installed on your system.
Create a **docker-compose.yml** file in your project root with the following content:

```yaml
version: '3.8'
services:
  fake-gcs-server:
    image: fsouza/fake-gcs-server:latest
    ports:
      - "4443:4443"
    command: ["-scheme", "http", "-port", "4443", "-public-host", "localhost"]
    volumes:
      - ./fake-gcs-data:/data
```
Start the container:

```shell
docker-compose up -d
```

This starts the fake GCS server on http://localhost:4443.
Set the environment variable so the GCS client directs API calls to the emulator, then run the example:

```shell
export STORAGE_EMULATOR_HOST="http://localhost:4443"
python gcs_json_for_agent.py
```
When using Fake GCS, authentication isn’t enforced. The client will automatically detect the emulator endpoint.
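That detection boils down to an environment-variable check; a simplified sketch of the behavior (the `resolve_endpoint` helper is illustrative — the real client library handles this internally):

```python
import os

DEFAULT_ENDPOINT = "https://storage.googleapis.com"


def resolve_endpoint() -> str:
    """Prefer STORAGE_EMULATOR_HOST when set, as the GCS client does."""
    return os.environ.get("STORAGE_EMULATOR_HOST", DEFAULT_ENDPOINT)


# With the variable exported, calls are routed to the local emulator.
os.environ["STORAGE_EMULATOR_HOST"] = "http://localhost:4443"
print(resolve_endpoint())  # http://localhost:4443
```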