docs/api/introduction.mdx
Ollama's API allows you to run and interact with models programmatically.
If you're just getting started, follow the quickstart documentation to get up and running with Ollama's API.
After installation, Ollama's API is served by default at:

```
http://localhost:11434/api
```
For running cloud models on ollama.com, the same API is available at the following base URL:

```
https://ollama.com/api
```
Once Ollama is running, its API is automatically available and can be accessed via curl:

```shell
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3",
  "prompt": "Why is the sky blue?"
}'
```
Ollama has official libraries for Python and JavaScript.
Several community-maintained libraries are available for Ollama. For a full list, see the Ollama GitHub repository.
Ollama's API isn't strictly versioned, but it is expected to remain stable and backwards compatible. Deprecations are rare and will be announced in the release notes.