📚 Docs • 💬 Slack • 🗺️ Roadmap
English | 简体中文 | 日本語
Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

- Self-contained, with no need for a DBMS or cloud service.
- OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
- Supports consumer-grade GPUs.
You can find our documentation here.
The easiest way to start a Tabby server is by using the following Docker command:

```bash
docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model StarCoder-1B --device cuda --chat-model Qwen2-1.5B-Instruct
```
For additional options (e.g., inference type, parallelism), please refer to the documentation page.
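As a sketch of what the server exposes once it is running, the snippet below asks the server for a code completion over HTTP. The `/v1/health` and `/v1/completions` paths and the payload shape are assumptions based on Tabby's OpenAPI interface; consult the documentation page for the authoritative request schema.

```shell
# Hypothetical example: query a Tabby server started with the Docker
# command above (assumed to be listening on localhost:8080).
TABBY_URL="${TABBY_URL:-http://localhost:8080}"

# A completion request: the language plus the code around the cursor.
PAYLOAD='{
  "language": "python",
  "segments": { "prefix": "def fib(n):\n    " }
}'

# Only send the request if a server is actually reachable.
if curl -sf --max-time 2 "$TABBY_URL/v1/health" > /dev/null 2>&1; then
  curl -s -X POST "$TABBY_URL/v1/completions" \
    -H 'Content-Type: application/json' \
    -d "$PAYLOAD"
else
  echo "Tabby server not reachable at $TABBY_URL"
fi
```

The same endpoint is what IDE extensions talk to, so a quick `curl` like this is a convenient smoke test after changing models or devices.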
The full guide is at CONTRIBUTING.md. Start by cloning the repository together with its submodules:

```bash
git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby
```
If you have already cloned the repository, run `git submodule update --recursive --init` to fetch all submodules.
Set up the Rust environment by following this tutorial.
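For reference, one common way to set up Rust is via rustup; the linked tutorial is the authoritative guide, and the check below simply confirms a toolchain is on your `PATH` afterwards.

```shell
# One common way to install Rust (the linked tutorial is authoritative):
#   curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# After installation, verify the toolchain is available:
rustc --version 2>/dev/null || echo "rustc not found"
cargo --version 2>/dev/null || echo "cargo not found"
```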
Install the required dependencies:

```bash
# For macOS
brew install protobuf

# For Ubuntu / Debian
apt install protobuf-compiler libopenblas-dev

# For Ubuntu
apt install make sqlite3 graphviz
```
Now you can build Tabby:

```bash
cargo build
```

... and don't forget to submit a Pull Request!