docker/README.md
This guide provides instructions for building and running the LiteLLM application using Docker and Docker Compose.
To build and run the application, use the `docker-compose.yml` file located in the root of the project. It is configured to use `Dockerfile.non_root` for a secure, non-root container environment.
The application requires a `LITELLM_MASTER_KEY` for signing and validating tokens. You must set this key as an environment variable before running the application.

Create a `.env` file in the root of the project and add the following line:

```
LITELLM_MASTER_KEY=your-secret-key
```

Replace `your-secret-key` with a strong, randomly generated secret.
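If you need a quick way to produce such a key, `openssl` can generate one (the `sk-` prefix is a common convention for LiteLLM keys, not a hard requirement):

```shell
# Generate a random 32-byte hex secret and write it to .env
KEY="sk-$(openssl rand -hex 32)"
printf 'LITELLM_MASTER_KEY=%s\n' "$KEY" > .env
```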
Once you have set the `LITELLM_MASTER_KEY`, you can build and run the containers with:

```shell
docker compose up -d --build
```
This command will:

- Build the image from `Dockerfile.non_root`.
- Start the `litellm`, `litellm_db`, and `prometheus` services in detached mode (`-d`).

The `--build` flag ensures that the image is rebuilt if there are any changes to the Dockerfile or the application code.

You can check the status of the running containers with:

```shell
docker compose ps
```
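For orientation, the relevant portion of the compose file looks roughly like the following. This is a simplified sketch: the service names come from this guide, but the images, ports, and other settings are illustrative — check the real `docker-compose.yml` for the authoritative values.

```yaml
services:
  litellm:
    build:
      context: .
      dockerfile: Dockerfile.non_root
    env_file: .env            # provides LITELLM_MASTER_KEY
    ports:
      - "4000:4000"           # illustrative; verify against the real file
    depends_on:
      - litellm_db
  litellm_db:
    image: postgres:16        # illustrative
  prometheus:
    image: prom/prometheus    # illustrative
```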
To view the logs of the `litellm` container, run:

```shell
docker compose logs -f litellm
```
To stop the running containers, use:

```shell
docker compose down
```
To ensure changes are safe for non-root, read-only root filesystems and restricted egress, always validate with the hardened compose file:
```shell
docker compose -f docker-compose.yml -f docker-compose.hardened.yml build --no-cache
docker compose -f docker-compose.yml -f docker-compose.hardened.yml up -d
```
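As a spot check — assuming the hardened overlay sets a read-only root filesystem on the service — you can confirm that writes outside the allowed paths are rejected:

```shell
# Probe the running litellm container: a write to / should fail
# when the root filesystem is read-only.
docker compose -f docker-compose.yml -f docker-compose.hardened.yml exec litellm \
  sh -c 'touch /probe 2>/dev/null && echo "WRITABLE (unexpected)" || echo "read-only OK"'
```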
This setup builds from `docker/Dockerfile.non_root`, with the Prisma engines and Node toolchain baked into the image, and keeps the following paths writable:

- `/app/cache` (Prisma/NPM cache; backing `PRISMA_BINARY_CACHE_DIR`, `NPM_CONFIG_CACHE`, and `XDG_CACHE_HOME`)
- `/app/migrations` (Prisma migration workspace; backing `LITELLM_MIGRATION_DIR`)
- `/var/lib/litellm/ui` (pre-restructured Next.js UI with the `.litellm_ui_ready` marker)
- `/var/lib/litellm/assets` (UI logos and assets)

You should also verify offline Prisma behaviour with:
```shell
docker run --rm --network none --entrypoint prisma ghcr.io/berriai/litellm:main-stable --version
```

This command should succeed (showing engine versions) even with `--network none`, confirming that the Prisma binaries are available without network access.
Common errors:

- `build_admin_ui.sh: not found`: This error can occur if the Docker build context is not set correctly. Ensure that you are running the `docker compose` command from the root of the project.
- `Master key is not initialized`: This error means the `LITELLM_MASTER_KEY` environment variable is not set. Make sure you have created a `.env` file in the project root with `LITELLM_MASTER_KEY` defined.