docs/content/features/backends.md
LocalAI supports a variety of backends that can be used to run different types of AI models. Some core backends are included with LocalAI; others are containerized applications that provide the runtime environment for specific model types, such as LLMs, diffusion models, or text-to-speech models.
The LocalAI web interface provides an intuitive way to manage your backends:
Each backend card displays the backend's details.
Backend galleries are repositories that contain backend definitions. They work similarly to model galleries but are specifically for backends.
You can add backend galleries by setting the `LOCALAI_BACKEND_GALLERIES` environment variable:

```bash
export LOCALAI_BACKEND_GALLERIES='[{"name":"my-gallery","url":"https://raw.githubusercontent.com/username/repo/main/backends"}]'
```
The URL needs to point to a valid YAML file, for example:

```yaml
- name: "test-backend"
  uri: "quay.io/image/tests:localai-backend-test"
  alias: "foo-backend"
```
Here, `uri` is the reference to an OCI container image.
A backend gallery is a collection of YAML files, each defining a backend. Here's an example structure:
```yaml
name: "llm-backend"
description: "A backend for running LLM models"
uri: "quay.io/username/llm-backend:latest"
alias: "llm"
tags:
  - "llm"
  - "text-generation"
```
You can pre-install backends when starting LocalAI by setting the `LOCALAI_EXTERNAL_BACKENDS` environment variable:
```bash
export LOCALAI_EXTERNAL_BACKENDS="llm-backend,diffusion-backend"
local-ai run
```
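If you run LocalAI as a container, the same variable can be set on the service. A minimal Docker Compose sketch, where the image tag, port mapping, and backend names are illustrative:

```yaml
# Illustrative Compose service; adjust the image tag and backend
# names to match your setup.
services:
  local-ai:
    image: localai/localai:latest
    ports:
      - "8080:8080"   # LocalAI's default API port
    environment:
      # Pre-install these backends at startup (names are examples)
      LOCALAI_EXTERNAL_BACKENDS: "llm-backend,diffusion-backend"
```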
To create a new backend, you need to create a container image that runs your backend, push it to a registry, and add it to a gallery. Your backend container should include a `run.sh` file that will be used to run the backend. For getting started, see the available backends in LocalAI here: https://github.com/mudler/LocalAI/tree/master/backend
- piper backend as an example: https://github.com/mudler/LocalAI/tree/master/backend/go/piper
- llama-cpp backend as an example: https://github.com/mudler/LocalAI/tree/master/backend/cpp/llama-cpp

Build your container image:
```bash
docker build -t quay.io/username/my-backend:latest .
```
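The build step above assumes a Dockerfile in the current directory. A minimal sketch, assuming your backend ships its dependencies and a `run.sh` entrypoint (the base image and paths are illustrative):

```dockerfile
# Illustrative layout only: the base image, paths, and the contents
# of run.sh are placeholders for your own backend.
FROM ubuntu:22.04
WORKDIR /backend
COPY run.sh /backend/run.sh
RUN chmod +x /backend/run.sh
ENTRYPOINT ["/backend/run.sh"]
```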
Push to a container registry:
```bash
docker push quay.io/username/my-backend:latest
```
Add your backend to a gallery:
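For instance, a gallery entry for the image pushed above could look like this, following the same format as the backend definition shown earlier (the name, description, alias, and tags are illustrative):

```yaml
# Illustrative gallery entry for the image built above
name: "my-backend"
description: "A custom backend for LocalAI"
uri: "quay.io/username/my-backend:latest"
alias: "my-backend"
tags:
  - "custom"
```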
LocalAI supports various types of backends, including backends for LLMs, diffusion models, and text-to-speech models.