To install Ollama, run the following command:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
To install Ollama manually instead, download and extract the package:

```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tar.zst \
  | sudo tar x -C /usr
```
Start Ollama:
```shell
ollama serve
```
In another terminal, verify that Ollama is running:
```shell
ollama -v
```
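If `ollama -v` runs before the server has finished starting, it may report that it cannot connect. As an illustrative sketch (the `wait_for` helper and its attempt counts are made up here, not part of Ollama), you can retry until the server answers:

```shell
# wait_for ATTEMPTS DELAY CMD... — hypothetical helper: retry CMD until it
# succeeds, sleeping DELAY seconds between attempts; fail after ATTEMPTS tries.
wait_for() {
  attempts=$1; delay=$2; shift 2
  i=0
  while [ "$i" -lt "$attempts" ]; do
    "$@" >/dev/null 2>&1 && return 0
    i=$((i + 1))
    sleep "$delay"
  done
  return 1
}

# Example: wait for the server on its default address, 127.0.0.1:11434:
#   wait_for 10 1 curl -fsS http://127.0.0.1:11434/api/version
```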
If you have an AMD GPU, also download and extract the additional ROCm package:
```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64-rocm.tar.zst \
  | sudo tar x -C /usr
```
If you are installing on an ARM64 system, download and extract the ARM64-specific package instead:

```shell
curl -fsSL https://ollama.com/download/ollama-linux-arm64.tar.zst \
  | sudo tar x -C /usr
```
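The packages above differ only in CPU architecture. As a sketch (the `pkg_for_arch` helper is a name made up here), the right archive can be picked from the output of `uname -m`:

```shell
# pkg_for_arch ARCH — hypothetical helper mapping `uname -m` output to the
# matching Ollama package name (x86_64 -> amd64, aarch64 -> arm64).
pkg_for_arch() {
  case "$1" in
    x86_64)  echo ollama-linux-amd64.tar.zst ;;
    aarch64) echo ollama-linux-arm64.tar.zst ;;
    *)       echo "unsupported architecture: $1" >&2; return 1 ;;
  esac
}

# Example: fetch the package for this machine.
# curl -fsSL "https://ollama.com/download/$(pkg_for_arch "$(uname -m)")" \
#   | sudo tar x -C /usr
```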
Create a user and group for Ollama:
```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
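Note that the group added by `usermod` only takes effect in new login sessions. A quick way to check whether the current session already has it (a sketch; the `in_group` helper is a name made up here):

```shell
# in_group GROUP — hypothetical helper: succeed if the current session's
# group list (id -nG) includes GROUP.
in_group() {
  id -nG | tr ' ' '\n' | grep -qx "$1"
}

in_group ollama || echo "log out and back in for the ollama group to apply"
```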
Create a service file in /etc/systemd/system/ollama.service:
```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=multi-user.target
```
Then reload systemd and enable the service so it starts on boot:

```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
If you have an NVIDIA GPU, download and install CUDA. Verify that the drivers are installed by running the following command, which should print details about your GPU:

```shell
nvidia-smi
```
If you have an AMD GPU, download and install ROCm v7.
Start Ollama and verify it is running:
```shell
sudo systemctl start ollama
sudo systemctl status ollama
```
To customize the installation of Ollama, edit the systemd service file or set environment variables by running:

```shell
sudo systemctl edit ollama
```
Alternatively, create an override file manually in /etc/systemd/system/ollama.service.d/override.conf:
```ini
[Service]
Environment="OLLAMA_DEBUG=1"
```
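Other environment variables can be set the same way; for example, `OLLAMA_HOST` to change the address the server listens on and `OLLAMA_MODELS` to change where models are stored (the values below are illustrative):

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_MODELS=/data/ollama/models"
```

After editing, run `sudo systemctl daemon-reload` and `sudo systemctl restart ollama` for the change to take effect.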
Update Ollama by running the install script again:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
Or by re-downloading Ollama:
```shell
curl -fsSL https://ollama.com/download/ollama-linux-amd64.tar.zst \
  | sudo tar x -C /usr
```
Use the `OLLAMA_VERSION` environment variable with the install script to install a specific version of Ollama, including pre-releases. You can find the version numbers on the releases page.
For example:
```shell
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
```
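A typo in the version string only surfaces after the installer runs. As a sketch, you could sanity-check it first (the `is_version` helper is hypothetical, and the pattern assumes `MAJOR.MINOR.PATCH` with an optional `-rcN` pre-release suffix):

```shell
# is_version STRING — hypothetical helper: accept MAJOR.MINOR.PATCH with an
# optional -rcN pre-release suffix, to catch typos before running the installer.
is_version() {
  printf '%s\n' "$1" | grep -Eq '^[0-9]+\.[0-9]+\.[0-9]+(-rc[0-9]+)?$'
}

v=0.5.7
if is_version "$v"; then
  echo "would run: curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=$v sh"
fi
```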
To view logs of Ollama running as a startup service, run:
```shell
journalctl -e -u ollama
```
Remove the ollama service:
```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```
Remove ollama libraries from your lib directory (either /usr/local/lib, /usr/lib, or /lib):
```shell
sudo rm -r $(which ollama | tr 'bin' 'lib')
```
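The `tr 'bin' 'lib'` trick maps the characters `b`, `i`, and `n` individually, which works for the standard directories listed above but can mangle other paths. A sketch of a more literal substitution (the `lib_dir_for` name is made up here):

```shell
# lib_dir_for PATH — hypothetical helper: replace the trailing /bin/ollama
# component with /lib/ollama, leaving the rest of the path untouched.
lib_dir_for() {
  printf '%s\n' "$1" | sed 's|/bin/ollama$|/lib/ollama|'
}

lib_dir_for /usr/local/bin/ollama  # -> /usr/local/lib/ollama
```

For example: `sudo rm -r "$(lib_dir_for "$(which ollama)")"`.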
Remove the ollama binary from your bin directory (either /usr/local/bin, /usr/bin, or /bin):
```shell
sudo rm $(which ollama)
```
Remove the downloaded models and Ollama service user and group:
```shell
sudo userdel ollama
sudo groupdel ollama
sudo rm -r /usr/share/ollama
```