The system requirements on this page apply to the machine running the TDgpt anode. These are guidelines only: actual requirements scale with model size, request concurrency (QPS), context/window length, and the footprint of any local cache or feature store.
System requirements also differ depending on whether the anode performs inference only or also handles training and fine-tuning.
Typical inference use cases: forecasting, anomaly detection, conversational analytics, and automated insight/report generation.
:::note
TDgpt can operate without a GPU. However, adding a GPU reduces latency and increases throughput.
:::
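Whether a GPU is actually usable on the anode host can be verified at deploy time. Below is a minimal sketch using only the Python standard library; it assumes an NVIDIA GPU and relies on the standard `nvidia-smi` CLI shipped with the NVIDIA driver (this is a generic host check, not a TDgpt-specific tool):

```python
import shutil
import subprocess

def gpu_available() -> bool:
    """Return True if an NVIDIA GPU appears usable on this host.

    Checks that the nvidia-smi CLI is on PATH and that it can
    enumerate at least one device (zero exit code, non-empty list).
    """
    if shutil.which("nvidia-smi") is None:
        return False
    try:
        result = subprocess.run(
            ["nvidia-smi", "-L"], capture_output=True, timeout=10
        )
    except (OSError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0 and bool(result.stdout.strip())

if __name__ == "__main__":
    print("GPU detected:", gpu_available())
```

If this returns `False`, TDgpt still runs on CPU; the check is only useful for deciding whether GPU-accelerated inference is an option on that machine.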
Typical training use cases: model fine-tuning, retraining, and local validation of model updates.
If the anode pulls large volumes of time-series data or features from TDengine TSDB or TDengine IDMP, a 1 Gbps network is the minimum; for high-throughput environments, 10 Gbps is recommended.
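To judge whether 1 Gbps is sufficient for a given workload, a back-of-the-envelope estimate of sustained bandwidth helps. The sketch below is illustrative only: the row size, row rate, and 30% protocol overhead are assumptions, not TDgpt defaults:

```python
def required_mbps(rows_per_second: float,
                  bytes_per_row: float,
                  overhead: float = 1.3) -> float:
    """Estimate sustained network bandwidth in Mbit/s.

    overhead accounts for protocol framing and retransmits
    (assumed 30% here).
    """
    bits_per_second = rows_per_second * bytes_per_row * 8 * overhead
    return bits_per_second / 1e6

# Example: pulling 500k rows/s of 64-byte rows from TDengine TSDB
mbps = required_mbps(500_000, 64)
print(f"{mbps:.0f} Mbit/s")  # ~333 Mbit/s, comfortably within 1 Gbps
```

When the estimate approaches a few hundred Mbit/s of sustained transfer, the 1 Gbps floor leaves little headroom for bursts, which is when a 10 Gbps link becomes worthwhile.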