# Tutorials

## Introduction

This folder contains the Dockerfiles and docker-compose file needed to build the development environment without pain.

## Prerequisites

- OS: Linux or WSL2
- docker
- nvidia-container-toolkit
- (Optional but recommended) docker-compose

## Usage

1. (With docker-compose) Configure the `.env` file: change `DATA_DIR` to your mount point, such as your code or data folder, and comment out the volumes in the docker-compose file if they are not needed.

2. Build the image:

   ```bash
   docker compose -f docker-compose.yml build
   ```

3. Run a container in the background:

   ```bash
   docker compose -f docker-compose.yml up -d
   ```

4. Attach to this container with your IDE and have fun!
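As a sketch of step 1, a minimal `.env` might look like this (the path is a placeholder; check the repo's own `.env` template for the exact keys it expects):

```ini
# .env — read by docker-compose (sketch; adjust the path to your machine)
# Host directory to mount into the container (your code or data folder)
DATA_DIR=/home/me/workspace
```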

## HowTos

### How to build and run with docker?

```bash
docker build -f docker/x86_64.dockerfile -t tensorrtx .
docker run -it --gpus all --privileged --net=host --ipc=host -v <host_dir>:<container_dir> tensorrtx /bin/bash
```

### How to build image with other TensorRT version?

Change the `TAG` at the top of the `.dockerfile`. Note: all images are officially owned by NVIDIA NGC, which requires registration before pulling. For this repo, the most commonly used TAGs are:

| Container Image | Container OS | Driver | CUDA | TensorRT | Torch | Recommended |
|-----------------|--------------|--------|------|----------|-------|-------------|
| 20.12-py3       | Ubuntu 20.04 | 455    | 11.2 | 7.2.2    | 1.8.0 |             |
| 24.01-py3       | Ubuntu 22.04 | 545    | 12.3 | 8.6.1    | 2.2.0 |             |
| 24.04-py3       | Ubuntu 22.04 | 545    | 12.4 | 8.6.3    | 2.3.0 |             |
| 24.09-py3       | Ubuntu 22.04 | 560    | 12.6 | 10.4.0   | 2.5.0 |             |

For more details on the support matrix, please check HERE.
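For illustration, switching versions amounts to editing the `TAG` line at the top of the `.dockerfile`. A sketch, assuming the base image is the NGC PyTorch container — verify the actual `FROM` line in `docker/x86_64.dockerfile`, as the image name here is an assumption:

```dockerfile
# Sketch: change TAG to pick a different NGC release (image name assumed,
# not copied from the repo's dockerfile)
ARG TAG=24.04-py3
FROM nvcr.io/nvidia/pytorch:${TAG}
```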

### How to customize the opencv in the image?

If the prebuilt packages from apt cannot meet your requirements, please refer to the demo code in the `.dockerfile` to build OpenCV from source.
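A minimal sketch of such a from-source build, for orientation only — the version number and cmake flags below are illustrative assumptions, not taken from the repo's dockerfile:

```dockerfile
# Sketch: build OpenCV from source inside the image (version/flags are examples)
ARG OPENCV_VERSION=4.8.0
RUN apt-get update && apt-get install -y build-essential cmake git wget unzip && \
    wget -q https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip -O opencv.zip && \
    unzip -q opencv.zip && \
    cmake -S opencv-${OPENCV_VERSION} -B opencv-build \
          -DCMAKE_BUILD_TYPE=Release -DBUILD_TESTS=OFF -DBUILD_EXAMPLES=OFF && \
    cmake --build opencv-build -j"$(nproc)" && \
    cmake --install opencv-build && \
    rm -rf opencv.zip opencv-${OPENCV_VERSION} opencv-build
```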

### How to solve failures when building image?

For 443 timeouts or similar network issues, a proxy may be required. To make your host proxy work in docker's build environment, change the `build` node inside the docker-compose file like this:

```yaml
    build:
      dockerfile: x86_64.dockerfile
      args:
        HTTP_PROXY: ${PROXY}
        HTTPS_PROXY: ${PROXY}
        ALL_PROXY: ${PROXY}
        http_proxy: ${PROXY}
        https_proxy: ${PROXY}
        all_proxy: ${PROXY}
```

Then add `PROXY="http://xxx:xxx"` to the `.env` file.

## Note

Support for older versions (e.g. TensorRT < 8) may be deprecated in the future.