# Advanced Usage of Pipenv
This guide covers advanced features and techniques for using Pipenv effectively in complex scenarios. These topics build on the basic functionality covered in other documentation sections.
When you need more control over the underlying pip commands that Pipenv executes, you can pass additional arguments directly to pip.
The --extra-pip-args option allows you to supply additional arguments to pip during installation:
```console
$ pipenv install --extra-pip-args="--use-feature=truststore --proxy=127.0.0.1"
```
This is particularly useful for scenarios such as:

Trusting the system certificate store:

```console
$ pipenv install --extra-pip-args="--use-feature=truststore"
```

Resolving wheels for a different target platform:

```console
$ pipenv install --extra-pip-args="--platform=win_amd64 --only-binary=:all:"
```

Passing build options to a package that compiles from source:

```console
$ pipenv install pycurl --extra-pip-args="--global-option=--with-openssl-dir=/usr/local/opt/openssl"
```
Some packages require environment variables to be set during installation for proper compilation or configuration. You can set these variables before running pipenv install:
For packages that need CMake arguments, such as llama-cpp-python with CUDA support:
```console
# Unix/Linux/macOS
$ CMAKE_ARGS="-DGGML_CUDA=on" pipenv install llama-cpp-python

# Or export for multiple commands
$ export CMAKE_ARGS="-DGGML_CUDA=on"
$ pipenv install llama-cpp-python

# For packages that need specific compiler flags
$ CFLAGS="-O3" CXXFLAGS="-O3" pipenv install some-package

# For packages using the meson build system
$ MESON_ARGS="-Dfeature=enabled" pipenv install some-package

# For packages requiring specific library paths
$ LDFLAGS="-L/usr/local/lib" CPPFLAGS="-I/usr/local/include" pipenv install some-package
```
These environment variables are passed to pip and subsequently to the package's build system (setuptools, CMake, meson, etc.). The specific variables needed depend on the package being installed.
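Inline `VAR=value` prefixes work because a child process inherits a copy of its parent's environment. A small demonstration of that mechanism, using a Python subprocess as a stand-in for pip's build step:

```python
import os
import subprocess
import sys

# Simulate `CMAKE_ARGS=... pipenv install ...`: the child process sees the
# variable because it inherits (a copy of) the parent's environment.
env = dict(os.environ, CMAKE_ARGS="-DGGML_CUDA=on")
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['CMAKE_ARGS'])"],
    env=env, capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # -DGGML_CUDA=on
```

The same inheritance applies when pip launches a package's build backend, which is why the flags reach CMake, meson, or the compiler.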
Pipenv provides several approaches for deploying applications in production environments.
The --deploy flag ensures that your Pipfile.lock is up-to-date with your Pipfile before installation:
```console
$ pipenv install --deploy
```
This will fail if:

- Pipfile.lock is out of date
- Pipfile.lock is missing
- Pipfile.lock doesn't match the Pipfile

This is crucial for production deployments to ensure you're installing exactly what you expect.
For some deployment scenarios (like containerized applications), you may want to install packages directly to the system Python rather than in a virtual environment:
```console
$ pipenv install --system --deploy
```
This installs all packages specified in Pipfile.lock to the system Python. Use this approach with caution, as it can potentially conflict with system packages.
To verify that your Pipfile.lock is up-to-date without installing packages:
```console
$ pipenv verify
```
This is useful in CI/CD pipelines to ensure the lock file has been properly updated after changes to the Pipfile.
For Docker deployments, a multi-stage build approach is recommended:
```dockerfile
FROM python:3.10-slim AS builder

WORKDIR /app

# Copy dependency files
COPY Pipfile Pipfile.lock ./

# Install pipenv and dependencies
RUN pip install pipenv && \
    pipenv install --deploy --system

FROM python:3.10-slim

WORKDIR /app

# Copy installed packages from builder stage
COPY --from=builder /usr/local/lib/python3.10/site-packages /usr/local/lib/python3.10/site-packages

# Copy application code
COPY . .

# Run the application
CMD ["python", "app.py"]
```
This approach:

- keeps the final image small, since Pipenv and build tooling stay in the builder stage
- installs exactly the pinned versions from Pipfile.lock (`--deploy` fails if the lock file is stale)
- avoids the overhead of a virtual environment inside the container, where `--system` installs are appropriate
Pipenv can work with various Python distributions and installations.
To use a specific Python interpreter:
```console
$ pipenv --python /path/to/python
```
This is useful when you have multiple Python versions installed or need to use a specific distribution.
Pipenv can work with Anaconda/Conda Python installations, though there are some important considerations.
To use Pipenv with Anaconda, specify the full path to the Conda Python interpreter:
```console
$ pipenv --python /path/to/anaconda/bin/python
```
**Important:** You must provide the complete path to the Python executable. A partial version number like `--python 3.10` may not work if Pipenv cannot find the Conda installation in its search paths.
To find your Conda Python path:
```console
# With conda environment activated
$ which python
/home/user/anaconda3/envs/myenv/bin/python

# Or check conda info
$ conda info --envs
```
To allow the Pipenv virtualenv to access packages installed in Conda:
```console
$ pipenv --python /path/to/anaconda/bin/python --site-packages
```
The --site-packages flag allows the virtual environment to access packages installed in the parent Python installation.
### Virtualenv Creation Fails with Import Errors
If you encounter errors like `ImportError: cannot import name '_io'` or similar during virtualenv creation, this is typically caused by outdated packages in your Conda installation:
```console
# Update conda's virtualenv and Python packages
$ conda update python virtualenv
```
### Python Version Not Found
If Pipenv reports that a Python version was not found:

1. Verify the Python version is installed in Conda:

   ```console
   $ conda search python
   ```

2. Use the full path instead of just the version number:

   ```console
   # Instead of this:
   $ pipenv --python 3.10

   # Use the full path:
   $ pipenv --python ~/anaconda3/envs/py310/bin/python
   ```
### Mixing Conda and Pipenv
While Pipenv can work with Conda Python, be aware that:

- Pipenv and Conda resolve dependencies independently, and packages installed with `conda install` are not recorded in Pipfile.lock
- mixing conda-installed and pip-installed packages in one environment can lead to version conflicts
- it is generally simplest to pick one tool per project for installing packages
Pipenv automatically detects and works with pyenv:
```console
# Set local Python version with pyenv
$ pyenv local 3.10.4

# Pipenv will use this version automatically
$ pipenv install
```
If the specified Python version isn't installed, Pipenv will prompt you to install it with pyenv:
```console
$ pipenv --python 3.11
Warning: Python 3.11 was not found on your system...
Would you like us to install latest CPython 3.11 with pyenv? [Y/n]: y
Installing CPython 3.11.0 with pyenv...
```
Similar to pyenv, Pipenv also works with asdf:
```console
# Install Python with asdf
$ asdf install python 3.10.4

# Use this version with Pipenv
$ pipenv --python 3.10.4
```
While Pipenv uses Pipfile and Pipfile.lock, you may need to generate traditional requirements.txt files for compatibility with other tools.
```console
$ pipenv requirements > requirements.txt
```
This generates a requirements.txt file from your Pipfile.lock with exact versions.
```console
# Include both default and development dependencies
$ pipenv requirements --dev > requirements-dev.txt

# Include only development dependencies
$ pipenv requirements --dev-only > dev-requirements.txt
```
```console
$ pipenv requirements --hash > requirements.txt
```
This includes package hashes for additional security.
```console
$ pipenv requirements --exclude-markers > requirements.txt
```
This removes environment markers (like python_version >= '3.7').
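Concretely, a requirements line carries its marker after a semicolon, and excluding markers simply drops that suffix. A simplified illustration of the idea (not Pipenv's actual exporter code):

```python
def strip_marker(line: str) -> str:
    """Drop the environment marker (everything after the first ';')."""
    return line.split(";", 1)[0].strip()

# A pinned line with a marker, as it might appear in requirements.txt
print(strip_marker("importlib-metadata==6.8.0 ; python_version < '3.8'"))
# importlib-metadata==6.8.0
```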
```console
$ pipenv requirements --categories="docs,tests" > requirements-docs-tests.txt
```
This generates requirements for specific custom package categories.
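All of these commands derive their output from Pipfile.lock, which is plain JSON. A simplified sketch of the mapping from lock entries to pinned requirement lines (illustrative only; the hash value is fabricated and this is not Pipenv's real exporter):

```python
import json

# A minimal Pipfile.lock fragment (illustrative; the hash is fabricated)
lock_text = """
{
  "default": {
    "requests": {"version": "==2.31.0", "hashes": ["sha256:0000"]}
  }
}
"""

lock = json.loads(lock_text)
# Each entry in the "default" category becomes a pinned requirement line
lines = [name + meta["version"] for name, meta in lock["default"].items()]
print("\n".join(lines))  # requests==2.31.0
```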
Pipenv includes several advanced security features to help protect your projects.
Pipenv integrates with the safety package to scan for known vulnerabilities:
```console
$ pipenv scan
```
This checks your dependencies against the PyUp Safety database of known vulnerabilities.
```console
# Use a custom vulnerability database
$ pipenv scan --db /path/to/custom/db

# Ignore a specific vulnerability by ID
$ pipenv scan --ignore 12345

# Write results as JSON
$ pipenv scan --output json > vulnerabilities.json
```
Pipenv can automatically install the required Python version if you have pyenv or asdf installed:
```toml
# Pipfile
[requires]
python_version = "3.11"
```
When you run pipenv install, it will check if Python 3.11 is available and prompt to install it if needed.
Pipenv allows you to define custom scripts in your Pipfile for common tasks.
```toml
[scripts]
start = "python app.py"
test = "pytest"
lint = "flake8 ."
format = "black ."
```
```console
$ pipenv run start
$ pipenv run test

# Arguments after the script name are passed through to the command
$ pipenv run test tests/test_api.py -v
```
You can define more complex scripts using the extended syntax:
```toml
[scripts]
start = {cmd = "python app.py"}
complex = {call = "package.module:function('arg1', 'arg2')"}
```
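A call-style script is conceptually equivalent to invoking the target with `python -c`. The translation below is only a sketch of that idea, not Pipenv's actual implementation:

```python
def call_to_command(call_spec: str) -> str:
    """Sketch: translate 'pkg.mod:func(args)' into a `python -c` command."""
    target, has_args, args = call_spec.partition("(")
    module, _, func = target.partition(":")
    # Reattach the argument list, or call with no arguments
    call = f"{func}({args}" if has_args else f"{func}()"
    return f'python -c "from {module} import {func}; {call}"'

print(call_to_command("package.module:function('arg1', 'arg2')"))
# python -c "from package.module import function; function('arg1', 'arg2')"
```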
Pipenv automatically loads environment variables from .env files in your project directory:
```shell
# .env
DEBUG=True
DATABASE_URL=postgresql://user:password@localhost/dbname
```
These variables are available when you run pipenv shell or pipenv run.
To load a .env file from a different location, set PIPENV_DOTENV_LOCATION:

```console
$ PIPENV_DOTENV_LOCATION=/path/to/.env pipenv shell
```
You can use variable expansion in your .env files:
```shell
# .env
HOME_DIR=${HOME}
CONFIG_PATH=${HOME_DIR}/.config/app
```
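The expansion behavior can be pictured with a tiny dotenv-style parser. This is a simplified illustration only, not the parser Pipenv actually uses:

```python
import re

def load_dotenv_text(text: str, env: dict) -> None:
    """Parse KEY=VALUE lines, expanding ${VAR} from values seen so far."""
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        # Replace each ${VAR} with its current value (empty if unknown)
        value = re.sub(r"\$\{(\w+)\}", lambda m: env.get(m.group(1), ""), value)
        env[key.strip()] = value

env = {"HOME": "/home/user"}
load_dotenv_text("HOME_DIR=${HOME}\nCONFIG_PATH=${HOME_DIR}/.config/app", env)
print(env["CONFIG_PATH"])  # /home/user/.config/app
```

Note that each value is expanded against the variables defined so far, which is why `CONFIG_PATH` can reference `HOME_DIR` from the line above it.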
In environments without internet access, you can still use Pipenv by preparing packages in advance.
On a connected system:
```console
# Generate requirements with hashes
$ pipenv requirements --hash > requirements.txt

# Download packages
$ pip download -r requirements.txt -d ./packages
```
Transfer the packages directory to the air-gapped environment, then:
```console
$ pip install --no-index --find-links=./packages -r requirements.txt
```
Here's an example tox.ini for testing with multiple Python versions:
```ini
[tox]
envlist = py37, py38, py39, py310, py311

[testenv]
deps = pipenv
commands =
    pipenv install --dev
    pipenv run pytest {posargs:tests}
```
And here is an equivalent GitHub Actions workflow:

```yaml
name: Python Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.8', '3.9', '3.10', '3.11']
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install pipenv
          pipenv install --dev
      - name: Verify lock file
        run: pipenv verify
      - name: Run tests
        run: pipenv run pytest
      - name: Security scan
        run: pipenv scan
```
Pipenv maintains a cache to speed up installations. You can control this cache:
```console
# Clear the cache
$ pipenv lock --clear

# Set a custom cache location
$ export PIPENV_CACHE_DIR=/path/to/custom/cache
```
For faster installations during development:
```console
# Skip lock file generation
$ export PIPENV_SKIP_LOCK=1
$ pipenv install package-name
```
Only use PIPENV_SKIP_LOCK during development, not in production environments.
PIPENV_MAX_DEPTH controls how many parent directories Pipenv searches when looking for a Pipfile. For deeply nested project layouts, you can raise it:

```console
# Increase how many parent directories are searched for a Pipfile
$ export PIPENV_MAX_DEPTH=20
$ pipenv lock
```
Pipenv works well with various tools and services in the Python ecosystem:
Pipenv allows you to quickly open installed packages in your editor:
```console
$ pipenv open requests
```
This opens the source code of the requests package in your default editor (defined by the EDITOR environment variable).
You can specify a different editor for a one-time use:
```console
$ EDITOR=code pipenv open flask
```
This is useful for:

- exploring a dependency's source code to understand its behavior
- debugging issues that appear to originate inside a third-party package
- quickly checking how a library implements a particular function
These advanced features make Pipenv a powerful tool for Python dependency management in complex scenarios. By leveraging these capabilities, you can create more efficient, secure, and maintainable Python projects.
Remember that while these advanced features provide additional flexibility and power, they should be used judiciously. Always follow best practices for security and reproducibility, especially in production environments.