Learn how to self-host Khoj on your own machine.
Self-hosting keeps your data on your own infrastructure and gives you full control over your setup.
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
These are the general setup instructions for self-hosted Khoj. You can install the Khoj server using either Docker or Pip.
:::info[First Run]
Restart your Khoj server after the first run to ensure all settings are applied correctly.
:::
<Tabs groupId="server" queryString>
<TabItem value="docker" label="Docker">
<Tabs groupId="operating-systems" queryString="os">
<TabItem value="macos" label="MacOS">
<h3>Prerequisites</h3>
<h4>Docker</h4>
- *Option 1*: Install [Docker Desktop](https://docs.docker.com/desktop/install/mac-install/), which bundles [Docker Compose](https://docs.docker.com/compose/install/).
- *Option 2*: Use [Homebrew](https://brew.sh/) to install Docker and Docker Compose.
```shell
brew install --cask docker
brew install docker-compose
```
<h3>Setup</h3>
1. Download the Khoj `docker-compose.yml` file [from GitHub](https://github.com/khoj-ai/khoj/blob/master/docker-compose.yml).
```shell
mkdir ~/.khoj && cd ~/.khoj
wget https://raw.githubusercontent.com/khoj-ai/khoj/master/docker-compose.yml
```
2. Configure the environment variables in the `docker-compose.yml`
- Set `KHOJ_ADMIN_PASSWORD`, `KHOJ_DJANGO_SECRET_KEY` (and optionally the `KHOJ_ADMIN_EMAIL`) to something secure. This allows you to customize Khoj later via the admin panel.
- Set `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, or `GEMINI_API_KEY` to your API key if you want to use OpenAI, Anthropic or Gemini commercial chat models respectively.
- Uncomment `OPENAI_BASE_URL` to use [Ollama](/advanced/ollama?type=first-run&server=docker#setup) running on your host machine. Or set it to the URL of your OpenAI compatible API like vLLM or [LMStudio](/advanced/lmstudio).
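For illustration, the relevant `environment` entries in your `docker-compose.yml` could look like the sketch below. The service name and all values here are placeholders; keep the structure of the downloaded file and only change the values.

```yaml
services:
  server:  # service name as defined in the downloaded docker-compose.yml
    environment:
      # Required: secure the admin panel
      - KHOJ_ADMIN_EMAIL=username@example.com
      - KHOJ_ADMIN_PASSWORD=replace-with-a-strong-password
      - KHOJ_DJANGO_SECRET_KEY=replace-with-a-long-random-string
      # Optional: enable a commercial chat model provider
      - OPENAI_API_KEY=sk-replace-with-your-key
```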
3. Start Khoj by running the following command in the same directory as your docker-compose.yml file.
```shell
cd ~/.khoj
docker-compose up
```
</TabItem>
<TabItem value="windows" label="Windows">
<h3>Prerequisites</h3>
1. Install [WSL2](https://learn.microsoft.com/en-us/windows/wsl/install) and restart your machine
```shell
# Run in PowerShell
wsl --install
```
2. Install [Docker Desktop](https://docs.docker.com/desktop/install/windows-install/) with **[WSL2 backend](https://docs.docker.com/desktop/wsl/#turn-on-docker-desktop-wsl-2)** (default)
<h3>Setup</h3>
1. Download the Khoj `docker-compose.yml` file [from GitHub](https://github.com/khoj-ai/khoj/blob/master/docker-compose.yml).
```shell
# Windows users should use their WSL2 terminal to run these commands
mkdir ~/.khoj && cd ~/.khoj
wget https://raw.githubusercontent.com/khoj-ai/khoj/master/docker-compose.yml
```
2. Configure the environment variables in the `docker-compose.yml`
- Set `KHOJ_ADMIN_PASSWORD`, `KHOJ_DJANGO_SECRET_KEY` (and optionally the `KHOJ_ADMIN_EMAIL`) to something secure. This allows you to customize Khoj later via the admin panel.
- Set `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, or `GEMINI_API_KEY` to your API key if you want to use OpenAI, Anthropic or Gemini commercial chat models respectively.
- Uncomment `OPENAI_BASE_URL` to use [Ollama](/advanced/ollama) running on your host machine. Or set it to the URL of your OpenAI compatible API like vLLM or [LMStudio](/advanced/lmstudio).
3. Start Khoj by running the following command in the same directory as your docker-compose.yml file.
```shell
# Windows users should use their WSL2 terminal to run these commands
cd ~/.khoj
docker-compose up
```
</TabItem>
<TabItem value="linux" label="Linux">
<h3>Prerequisites</h3>
Install [Docker Desktop](https://docs.docker.com/desktop/install/linux/).
You can also use your package manager to install Docker Engine & Docker Compose.
<h3>Setup</h3>
1. Download the Khoj `docker-compose.yml` file [from GitHub](https://github.com/khoj-ai/khoj/blob/master/docker-compose.yml).
```shell
mkdir ~/.khoj && cd ~/.khoj
wget https://raw.githubusercontent.com/khoj-ai/khoj/master/docker-compose.yml
```
2. Configure the environment variables in the `docker-compose.yml`
- Set `KHOJ_ADMIN_PASSWORD`, `KHOJ_DJANGO_SECRET_KEY` (and optionally the `KHOJ_ADMIN_EMAIL`) to something secure. This allows you to customize Khoj later via the admin panel.
- Set `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, or `GEMINI_API_KEY` to your API key if you want to use OpenAI, Anthropic or Gemini commercial chat models respectively.
- Uncomment `OPENAI_BASE_URL` to use [Ollama](/advanced/ollama) running on your host machine. Or set it to the URL of your OpenAI compatible API like vLLM or [LMStudio](/advanced/lmstudio).
3. Start Khoj by running the following command in the same directory as your docker-compose.yml file.
```shell
cd ~/.khoj
docker-compose up
```
</TabItem>
</Tabs>
:::info[Remote Access]
By default, Khoj is only accessible on the machine it is running on. To access Khoj from a remote machine, see the Remote Access docs.
:::
Your setup is complete once you see `🌖 Khoj is ready to engage` in the server logs on your terminal.
</TabItem>
<TabItem value="pip" label="Pip">
<h3>1. Install Khoj Server</h3>
Run the following command in your terminal to install the Khoj server.
<Tabs groupId="operating-systems" queryString="os">
<TabItem value="macos" label="MacOS">
<Tabs groupId="gpu" queryString="gpu">
<TabItem value="arm" label="ARM/M1+">
```shell
CMAKE_ARGS="-DGGML_METAL=on" python -m pip install 'khoj[local]'
```
</TabItem>
<TabItem value="intel" label="Intel">
```shell
python -m pip install 'khoj[local]'
```
</TabItem>
</Tabs>
</TabItem>
<TabItem value="windows" label="Windows">
Run the following command in PowerShell on Windows.
<Tabs groupId="gpu" queryString="gpu">
<TabItem value="cpu" label="CPU">
```shell
# Install Khoj
py -m pip install 'khoj[local]'
```
</TabItem>
<TabItem value="nvidia" label="NVIDIA (CUDA) GPU">
```shell
# 1. To use NVIDIA (CUDA) GPU
$env:CMAKE_ARGS = "-DGGML_CUDA=on"
# 2. Install Khoj
py -m pip install 'khoj[local]'
```
</TabItem>
<TabItem value="amd" label="AMD (ROCm) GPU">
```shell
# 1. To use AMD (ROCm) GPU
$env:CMAKE_ARGS = "-DGGML_HIPBLAS=on"
# 2. Install Khoj
py -m pip install 'khoj[local]'
```
</TabItem>
<TabItem value="vulkan" label="Vulkan GPU">
```shell
# 1. To use Vulkan GPU
$env:CMAKE_ARGS = "-DGGML_VULKAN=on"
# 2. Install Khoj
py -m pip install 'khoj[local]'
```
</TabItem>
</Tabs>
</TabItem>
<TabItem value="linux" label="Linux">
<Tabs groupId="gpu" queryString="gpu">
<TabItem value="cpu" label="CPU">
```shell
python -m pip install 'khoj[local]'
```
</TabItem>
<TabItem value="nvidia" label="NVIDIA (CUDA) GPU">
```shell
CMAKE_ARGS="-DGGML_CUDA=on" FORCE_CMAKE=1 python -m pip install 'khoj[local]'
```
</TabItem>
<TabItem value="amd" label="AMD (ROCm) GPU">
```shell
CMAKE_ARGS="-DGGML_HIPBLAS=on" FORCE_CMAKE=1 python -m pip install 'khoj[local]'
```
</TabItem>
<TabItem value="vulkan" label="Vulkan GPU">
```shell
CMAKE_ARGS="-DGGML_VULKAN=on" FORCE_CMAKE=1 python -m pip install 'khoj[local]'
```
</TabItem>
</Tabs>
</TabItem>
</Tabs>
<h3>2. Start Khoj Server</h3>
Run the following command from your terminal to start the Khoj service.
```shell
USE_EMBEDDED_DB="true" khoj --anonymous-mode
```
`--anonymous-mode` allows access to Khoj without requiring login. This is usually fine for local-only, single user setups. If you need authentication, follow the authentication setup docs.
:::tip[Auto Start]
To start Khoj automatically in the background, use Task Scheduler on Windows or cron on macOS/Linux (e.g. with `@reboot khoj`).
:::
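For example, on macOS/Linux the cron entry could look like this sketch. It assumes `khoj` is on cron's PATH; otherwise use the full path to the binary.

```shell
# Run `crontab -e` and add this line to start Khoj at boot
@reboot USE_EMBEDDED_DB="true" khoj --anonymous-mode
```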
</TabItem>
</Tabs>
You can now open the web app at [http://localhost:42110](http://localhost:42110) and start interacting!
Nothing else is necessary, but you can customize your setup further by following the steps below.
:::info[CSRF Error]
Ensure you are using `localhost`, not `127.0.0.1`, to access the admin panel to avoid the CSRF error.
:::
:::info[CSRF Trusted Origin or Unset Cookie Error]
If you are using a load balancer or reverse proxy in front of your Khoj server, set the environment variable `KHOJ_ALLOWED_DOMAIN=your-internal-ip-or-domain` to avoid this error. If unset, it defaults to `KHOJ_DOMAIN`.
:::
:::info[DISALLOWED HOST or Bad Request (400) Error]
You may hit this error if you try to access Khoj on a custom domain (e.g. `192.168.12.3` or `example.com`) or over HTTP. Set the environment variables `KHOJ_DOMAIN=your-external-ip-or-domain` and, if required, `KHOJ_NO_HTTPS=True` to avoid this error.
:::
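For example, when running the pip-installed server on a custom IP, you could export these variables before starting Khoj. The IP below is just a placeholder; substitute your actual domain or IP.

```shell
# Example values - replace with your actual domain/IP
export KHOJ_DOMAIN="192.168.12.3"
export KHOJ_NO_HTTPS="True"
echo "KHOJ_DOMAIN=$KHOJ_DOMAIN, KHOJ_NO_HTTPS=$KHOJ_NO_HTTPS"
```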
:::tip[Note]
Using Safari on Mac? You might not be able to log in to the admin panel. Try using Chrome or Firefox instead.
:::
<h4>Configure Chat Model</h4>
Set up which chat model you'd like to use. Khoj supports local and online chat models.
<Tabs groupId="chatmodel" queryString>
<TabItem value="openai" label="OpenAI">
:::info[Ollama Integration]
Using Ollama? See the Ollama Integration section for more custom setup instructions.
:::
1. Create a new AI Model API in the admin panel with your OpenAI API key.
2. Create a new chat model
- Set the `chat-model` field to an OpenAI chat model. Example: `gpt-4o`.
- Make sure to set the `model-type` field to `OpenAI`.
- If your model supports vision, set the `vision enabled` field to `true`. This is currently only supported for OpenAI models with vision capabilities.
- The `tokenizer` and `max-prompt-size` fields are optional. Set them only if you're sure of the tokenizer or token limit for the model you're using. Contact us if you're unsure what to do here.
</TabItem>
<TabItem value="anthropic" label="Anthropic">
1. Create a new AI Model API with your Anthropic API key. Do not configure the API base url.
2. Create a new chat model
   - Set the `chat-model` field to an Anthropic chat model. Example: `claude-3-5-sonnet-20240620`.
   - Set the `model-type` field to `Anthropic`.
   - Set the `ai model api` field to the Anthropic AI Model API you created in step 1.
</TabItem>
<TabItem value="offline" label="Offline">
:::tip[System Requirements]
Local chat models run on your machine, so make sure it has enough RAM (and ideally a GPU) for the model you pick.
:::
- Set the `api url` field to the URL of your local AI model provider, like `http://localhost:11434/v1/` for Ollama.
</TabItem>
</Tabs>
:::tip[Multiple Chat Models]
Set your preferred default chat model in the `Default` and `Advanced` fields of your `ServerChatSettings`.
Khoj uses these chat models for all intermediate steps like intent detection, web search etc.
:::
:::info[Chat Model Fields]
- The `tokenizer` and `max-prompt-size` fields are optional. Set them only if you're sure of the tokenizer or token limit for the model you're using. This improves context stuffing. Contact us if you're unsure what to do here.
- Set the `vision enabled` field for OpenAI models with vision capabilities like `gpt-4o`. Vision capabilities in other chat models are not currently utilized.
:::
The Khoj web app is available by default to chat, search and configure Khoj.
You can also install a Khoj client to easily access it from Obsidian, Emacs, WhatsApp, or your OS, and keep your documents synced with Khoj.
:::info[Note]
Set the host URL in your client's settings page to your Khoj server URL. By default, use `http://127.0.0.1:42110` or `http://localhost:42110`. Note that `localhost` may not work in all cases.
:::
## Uninstall
<Tabs groupId="server" queryString>
<TabItem value="pip" label="Pip">
```shell
# uninstall the khoj server
pip uninstall khoj

# delete khoj postgres db
dropdb khoj -U postgres
```
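If you also want to remove local Khoj data, you can delete its data directory. This assumes the default `~/.khoj` location; skip this step if you keep data elsewhere.

```shell
# delete khoj's local data directory (default location assumed)
rm -rf ~/.khoj
```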
</TabItem>
<TabItem value="docker" label="Docker">
Run the command below from the same directory where you have your `docker-compose` file.
This will remove the server containers, networks, images and volumes.
```shell
docker-compose down --volumes
```
</TabItem>
</Tabs>
## Troubleshoot
#### Dependency conflict when installing the Khoj python package with pip
- Use [pipx](https://pipx.pypa.io/) to install Khoj to avoid dependency conflicts with other python packages.
  ```shell
  pipx install khoj
  ```
- Now start `khoj` using the standard steps described earlier.

#### Install fails while building the tokenizers dependency
- `pip install khoj` fails while building the `tokenizers` dependency and complains about Rust.
- Install Rust to build the `tokenizers` package, e.g. on macOS:
  ```shell
  brew install rustup
  rustup-init
  source ~/.cargo/env
  ```
- Now retry `pip install khoj`.