+++
title = "Overview"
weight = 1
toc = true
description = "What is LocalAI?"
tags = ["Beginners"]
categories = [""]
url = "/docs/overview"
author = "Ettore Di Giacinto"
icon = "info"
+++
LocalAI is your complete AI stack for running AI models locally. It's designed to be simple, efficient, and accessible, providing a drop-in replacement for OpenAI's API while keeping your data private and secure.
In today's AI landscape, privacy, control, and flexibility are paramount. LocalAI addresses these needs by running inference entirely on your own hardware, so your prompts and data never leave your machine.
LocalAI is a single binary (or container) that gives you everything you need: an OpenAI-compatible REST API, text generation with local LLMs, image generation, audio (text-to-speech and transcription), and embeddings, with no GPU required.
LocalAI integrates LocalAGI (agent platform) and LocalRecall (semantic memory) as built-in libraries — no separate installation needed.
LocalAI can be installed in several ways. Docker is the recommended installation method for most users as it provides the easiest setup and works across all platforms.
The quickest way to get started with LocalAI is using Docker:
```bash
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-cpu
```
Then open http://localhost:8080 to access the web interface, install models, and start chatting.
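Because LocalAI exposes an OpenAI-compatible API, you can also talk to it directly over HTTP once a model is installed. A minimal sketch using `curl` (the model name `my-model` is a placeholder, not a real model; substitute the name of a model you installed from the web interface):

```bash
# Query LocalAI's OpenAI-compatible chat completions endpoint.
# "my-model" is a placeholder: replace it with the name of a model
# you have installed via the web interface or model gallery.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-model",
    "messages": [{"role": "user", "content": "Hello, how are you?"}]
  }'
```

Existing OpenAI client libraries should also work by pointing their base URL at `http://localhost:8080/v1`.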
For GPU support, see the [Container images reference]({{% relref "getting-started/container-images" %}}) or the [Quickstart guide]({{% relref "getting-started/quickstart" %}}).
For complete installation instructions including Docker, macOS, Linux, Kubernetes, and building from source, see the Installation guide.
LocalAI is a community-driven project. You can contribute code, report issues, improve the documentation, or join the discussion on GitHub.
Ready to dive in? A good next step is the [Quickstart guide]({{% relref "getting-started/quickstart" %}}), followed by installing a model from the web interface and exploring the API.
LocalAI is MIT licensed, created and maintained by Ettore Di Giacinto.