</a>
Documents | Contact Us | Community | Paper
</div>

An open-source AI data assistant that connects to your data, writes SQL and code, runs skills in sandboxed environments, and turns analysis into reports, insights, and action.
DB-GPT is an open-source agentic AI data assistant for the next generation of AI + Data products.

It helps users and teams:

- Plan tasks, break work into steps, call tools, and complete analysis workflows end to end.
- Generate SQL and code to query data, clean datasets, compute metrics, and produce outputs.
- Work across structured and unstructured sources, including databases, spreadsheets, documents, and knowledge bases.
- Package domain knowledge, analysis methods, and execution workflows into reusable skills.
- Run code and tools in isolated environments for safer, more reliable analysis.

DB-GPT is also a platform for building AI-native data agents, workflows, and applications with agents, AWEL, RAG, and multi-model support.

A typical workflow:

1. Connect files, databases, and knowledge bases in one workspace.
2. Let AI reason through the task, write SQL and code, and execute step by step.
3. Load reusable skills for repeatable business analysis workflows.
4. Produce charts, dashboards, HTML reports, and decision-ready outputs.
Get DB-GPT running in minutes with the one-line installer (macOS & Linux):

```bash
curl -fsSL https://raw.githubusercontent.com/eosphoros-ai/DB-GPT/main/scripts/install/install.sh | bash
```

Or specify a profile and API key directly:

```bash
curl -fsSL https://raw.githubusercontent.com/eosphoros-ai/DB-GPT/main/scripts/install/install.sh \
  | OPENAI_API_KEY=sk-xxx bash -s -- --profile openai
```

For Kimi 2.5 via the Moonshot API:

```bash
curl -fsSL https://raw.githubusercontent.com/eosphoros-ai/DB-GPT/main/scripts/install/install.sh \
  | MOONSHOT_API_KEY=sk-xxx bash -s -- --profile kimi
```

For MiniMax via the OpenAI-compatible API:

```bash
curl -fsSL https://raw.githubusercontent.com/eosphoros-ai/DB-GPT/main/scripts/install/install.sh \
  | MINIMAX_API_KEY=sk-xxx bash -s -- --profile minimax
```
Already have a local DB-GPT checkout? Reuse it instead of cloning into ~/.dbgpt/DB-GPT:

```bash
OPENAI_API_KEY=sk-xxx \
  bash scripts/install/install.sh --profile openai --repo-dir "$(pwd)" --yes
```

Or reuse your local repo with Kimi 2.5:

```bash
MOONSHOT_API_KEY=sk-xxx \
  bash scripts/install/install.sh --profile kimi --repo-dir "$(pwd)" --yes
```

Or reuse your local repo with MiniMax:

```bash
MINIMAX_API_KEY=sk-xxx \
  bash scripts/install/install.sh --profile minimax --repo-dir "$(pwd)" --yes
```

After installation, start the server with the generated profile config:

```bash
cd ~/.dbgpt/DB-GPT && uv run dbgpt start webserver --profile <profile>
```

Then open http://localhost:5670.
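The installer writes a profile config for the provider you chose, and `--profile <profile>` points the server at it. As an illustrative sketch only (the field names and layout here are assumptions; the generated file under your DB-GPT directory is authoritative), an OpenAI-style profile might look roughly like:

```toml
# Hypothetical sketch of a generated profile config.
# Check the actual file produced by the installer for the real schema.
[service.web]
host = "0.0.0.0"
port = 5670

[[models.llms]]
name = "gpt-4o"
provider = "proxy/openai"
api_key = "${env:OPENAI_API_KEY}"
```

The key idea is that the profile binds a model provider and credentials to the web server, so switching providers means switching profiles rather than editing code.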
Prefer to review the script first?

```bash
curl -fsSL https://raw.githubusercontent.com/eosphoros-ai/DB-GPT/main/scripts/install/install.sh -o install.sh
less install.sh
bash install.sh --profile openai
```
Install DB-GPT from PyPI and start it with a single command — no source checkout required.
Prerequisites: Python 3.10+ and uv (recommended) or pip.
1. Install
```bash
# Recommended: use uv
uv pip install dbgpt-app

# Or with pip
pip install dbgpt-app
```
The default installation includes the core framework (CLI, FastAPI, Agent), OpenAI-compatible LLM support, DashScope / Tongyi support, RAG document parsing, and ChromaDB vector store.
2. Start
```bash
dbgpt start
```
On first run, an interactive setup wizard will guide you through choosing an LLM provider and entering your API key. Once complete, the web server starts automatically.
3. Open the Web UI
Visit http://localhost:5670 — you're all set! 🎉
For Docker, local GPU models (vLLM, llama.cpp), or manual source-code setup, see the full documentation.
| LLM | Supported |
|---|---|
| LLaMA | ✅ |
| LLaMA-2 | ✅ |
| BLOOM | ✅ |
| BLOOMZ | ✅ |
| Falcon | ✅ |
| Baichuan | ✅ |
| Baichuan2 | ✅ |
| InternLM | ✅ |
| Qwen | ✅ |
| XVERSE | ✅ |
| ChatGLM2 | ✅ |
More information about Text2SQL fine-tuning
🔥🔥🔥 <a href="https://huggingface.co/deepseek-ai/DeepSeek-V3-0324">DeepSeek-V3-0324</a>
🔥🔥🔥 <a href="https://huggingface.co/deepseek-ai/DeepSeek-R1">DeepSeek-R1</a>
🔥🔥🔥 <a href="https://huggingface.co/deepseek-ai/DeepSeek-V3">DeepSeek-V3</a>
🔥🔥🔥 <a href="https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-70B">DeepSeek-R1-Distill-Llama-70B</a>
🔥🔥🔥 <a href="https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B">DeepSeek-R1-Distill-Qwen-32B</a>
🔥🔥🔥 <a href="https://huggingface.co/deepseek-ai/DeepSeek-Coder-V2-Instruct">DeepSeek-Coder-V2-Instruct</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">Qwen</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/Qwen3-235B-A22B">Qwen3-235B-A22B</a>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/Qwen3-30B-A3B">Qwen3-30B-A3B</a>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/Qwen3-32B">Qwen3-32B</a>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/QwQ-32B">QwQ-32B</a>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/Qwen2.5-Coder-32B-Instruct">Qwen2.5-Coder-32B-Instruct</a>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct">Qwen2.5-Coder-14B-Instruct</a>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/Qwen2.5-72B-Instruct">Qwen2.5-72B-Instruct</a>
🔥🔥🔥 <a href="https://huggingface.co/Qwen/Qwen2.5-32B-Instruct">Qwen2.5-32B-Instruct</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">GLM</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/THUDM/GLM-Z1-32B-0414">GLM-Z1-32B-0414</a>
🔥🔥🔥 <a href="https://huggingface.co/THUDM/GLM-4-32B-0414">GLM-4-32B-0414</a>
🔥🔥🔥 <a href="https://huggingface.co/THUDM/glm-4-9b-chat">glm-4-9b-chat</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">Llama</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct">Meta-Llama-3.1-405B-Instruct</a>
🔥🔥🔥 <a href="https://huggingface.co/meta-llama/Meta-Llama-3.1-70B-Instruct">Meta-Llama-3.1-70B-Instruct</a>
🔥🔥🔥 <a href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct">Meta-Llama-3.1-8B-Instruct</a>
🔥🔥🔥 <a href="https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct">Meta-Llama-3-70B-Instruct</a>
🔥🔥🔥 <a href="https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct">Meta-Llama-3-8B-Instruct</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">Gemma</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/google/gemma-2-27b-it">gemma-2-27b-it</a>
🔥🔥🔥 <a href="https://huggingface.co/google/gemma-2-9b-it">gemma-2-9b-it</a>
🔥🔥🔥 <a href="https://huggingface.co/google/gemma-7b-it">gemma-7b-it</a>
🔥🔥🔥 <a href="https://huggingface.co/google/gemma-2b-it">gemma-2b-it</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">Yi</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/01-ai/Yi-1.5-34B-Chat">Yi-1.5-34B-Chat</a>
🔥🔥🔥 <a href="https://huggingface.co/01-ai/Yi-1.5-9B-Chat">Yi-1.5-9B-Chat</a>
🔥🔥🔥 <a href="https://huggingface.co/01-ai/Yi-1.5-6B-Chat">Yi-1.5-6B-Chat</a>
🔥🔥🔥 <a href="https://huggingface.co/01-ai/Yi-34B-Chat">Yi-34B-Chat</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">Starling</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/Nexusflow/Starling-LM-7B-beta">Starling-LM-7B-beta</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">SOLAR</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0">SOLAR-10.7B</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">Mixtral</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1">Mixtral-8x7B</a>
</td>
</tr>
<tr>
<td align="center" valign="middle">Phi</td>
<td align="center" valign="middle">✅</td>
<td>
🔥🔥🔥 <a href="https://huggingface.co/collections/microsoft/phi-3-6626e15e9585a200d2d761e3">Phi-3</a>
</td>
</tr>
</tbody>
</table>
We protect data privacy and execution safety through private model deployment, proxy desensitization, and sandboxed execution mechanisms.
We believe the future of data products goes beyond dashboards, and DB-GPT aims to help developers and enterprises build that next generation of AI + Data products.
The MIT License (MIT)
If you want to understand the overall architecture of DB-GPT, please cite <a href="https://arxiv.org/abs/2312.17449" target="_blank">this paper</a> and <a href="https://arxiv.org/abs/2404.10209" target="_blank">this paper</a>.
If you want to learn about using DB-GPT for agent development, please cite <a href="https://arxiv.org/abs/2412.13520" target="_blank">this paper</a>.
```bibtex
@article{xue2023dbgpt,
  title={DB-GPT: Empowering Database Interactions with Private Large Language Models},
  author={Siqiao Xue and Caigao Jiang and Wenhui Shi and Fangyin Cheng and Keting Chen and Hongjun Yang and Zhiping Zhang and Jianshan He and Hongyang Zhang and Ganglin Wei and Wang Zhao and Fan Zhou and Danrui Qi and Hong Yi and Shaodong Liu and Faqiang Chen},
  year={2023},
  journal={arXiv preprint arXiv:2312.17449},
  url={https://arxiv.org/abs/2312.17449}
}

@misc{huang2024romasrolebasedmultiagentdatabase,
  title={ROMAS: A Role-Based Multi-Agent System for Database monitoring and Planning},
  author={Yi Huang and Fangyin Cheng and Fan Zhou and Jiahui Li and Jian Gong and Hongjun Yang and Zhidong Fan and Caigao Jiang and Siqiao Xue and Faqiang Chen},
  year={2024},
  eprint={2412.13520},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2412.13520}
}

@inproceedings{xue2024demonstration,
  title={Demonstration of DB-GPT: Next Generation Data Interaction System Empowered by Large Language Models},
  author={Siqiao Xue and Danrui Qi and Caigao Jiang and Wenhui Shi and Fangyin Cheng and Keting Chen and Hongjun Yang and Zhiping Zhang and Jianshan He and Hongyang Zhang and Ganglin Wei and Wang Zhao and Fan Zhou and Hong Yi and Shaodong Liu and Hongjun Yang and Faqiang Chen},
  year={2024},
  booktitle={Proceedings of the VLDB Endowment},
  url={https://arxiv.org/abs/2404.10209}
}
```
Thanks to everyone who has contributed to DB-GPT! Your ideas, code, and comments, and even sharing the project at events and on social platforms, all make DB-GPT better. We are working on building a community; if you have ideas for how to grow it, feel free to contact us.