documentation/docs/advanced/litellm.md
:::info
This is only helpful for self-hosted users. If you're using Khoj Cloud, you're limited to our first-party models.
:::
:::info
Khoj natively supports local LLMs available on HuggingFace in GGUF format. Using an OpenAI API proxy with Khoj may be useful for ease of setup, trying new models, or using commercial LLMs via API.
:::
LiteLLM exposes an OpenAI-compatible API that proxies requests to other LLM API services. This provides a standardized API to interact with both open-source and commercial LLMs.
Using LiteLLM with Khoj makes it possible to turn any LLM behind an API into your personal AI agent.
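For example, once the LiteLLM proxy from the setup below is running, any OpenAI-style client can talk to it directly. A minimal sketch, assuming LiteLLM's default listen address of `http://localhost:4000` and the `mistral/mistral-tiny` model from the setup steps; the bearer token is a placeholder, since the proxy holds the real provider key:

```shell
# Send an OpenAI-style chat completion request to the local LiteLLM proxy.
# Adjust the host/port if you started litellm with a custom --port.
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any-string" \
  -d '{
        "model": "mistral/mistral-tiny",
        "messages": [{"role": "user", "content": "Hello from Khoj"}]
      }'
```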
## Setup
1. Install LiteLLM
   ```shell
   pip install litellm[proxy]
   ```
2. Start LiteLLM and use Mistral tiny via the Mistral API
   ```shell
   export MISTRAL_API_KEY=<MISTRAL_API_KEY>
   litellm --model mistral/mistral-tiny --drop_params
   ```
3. Create a new config for your OpenAI proxy API on your Khoj admin panel
   - Name: `litellm`
   - Api Key: `any string`
   - Api Base Url: `<URL of your Openai Proxy API>`
4. Create a new chat model on your Khoj admin panel
   - Name: `llama3.1` (replace with the name of your local model)
   - Model Type: `Openai`
   - Max prompt size: `20000` (replace with the max prompt size of your model)
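Before wiring up Khoj, it can help to confirm the proxy base URL actually serves your model. A quick check, assuming the default LiteLLM address of `http://localhost:4000` (the same value you would enter as the Api Base Url above); the model name it returns is what you enter as the chat model Name:

```shell
# List the models exposed by the LiteLLM proxy.
curl http://localhost:4000/v1/models \
  -H "Authorization: Bearer any-string"
```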