# dev-tools/prometheus-local
This setup runs Prometheus, Kibana, and Grafana in Docker and starts Elasticsearch from source. Prometheus scrapes itself and forwards the samples to Elasticsearch via the Prometheus remote write protocol.
`prometheus.yml` configures `scrape_configs` and `remote_write`. `docker-compose.yml` runs Prometheus, Kibana, and Grafana behind a Traefik reverse proxy for convenient local URLs.

Start Elasticsearch from source, making sure it listens on a non-loopback interface so that Prometheus running in Docker can reach it:
```shell
./gradlew run --configuration-cache \
  -Dtests.es.http.host=0.0.0.0 \
  -Dtests.es.xpack.ml.enabled=false \
  -Drun.license_type=trial \
  -Dtests.heap.size=4G \
  -Dtests.jvm.argline="-da -dsa -Dio.netty.leakDetection.level=simple"
```
Then start the Docker stack:

```shell
cd dev-tools/prometheus-local
docker compose up -d
```
Once everything is up, the following URLs are available:

- http://prometheus.localhost
- http://prometheus.localhost/targets (the `prometheus` job should be UP)
- http://kibana.localhost
- http://grafana.localhost (admin/password)
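The `*.localhost` URLs come from Traefik router rules attached to the services in `docker-compose.yml`. A minimal sketch of that idea (image tags, service names, and label values here are illustrative assumptions, not the actual file):

```yaml
services:
  traefik:
    image: traefik:v3.0
    command:
      - --providers.docker=true
      - --entrypoints.web.address=:80
    ports:
      - "80:80"
    volumes:
      # Traefik discovers containers and their labels via the Docker socket
      - /var/run/docker.sock:/var/run/docker.sock:ro

  prometheus:
    image: prom/prometheus
    labels:
      # Requests with this Host header are routed to this container
      - traefik.http.routers.prometheus.rule=Host(`prometheus.localhost`)
```

Each additional service (Kibana, Grafana) gets its own `Host(...)` router label in the same way.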
- http://traefik.localhost (Traefik dashboard)

To check that samples are being forwarded, run `rate(prometheus_remote_storage_samples_total[1m])` in Prometheus, or the equivalent query `PROMQL step=1m rate(prometheus_remote_storage_samples_total[1m])`.

Note: `.localhost` hostnames are resolved by most browsers to `127.0.0.1` without any `/etc/hosts` changes.
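For reference, the scrape-and-forward wiring described above can be sketched as a minimal `prometheus.yml` (the Elasticsearch host, port, and endpoint path are placeholders and assumptions; the file in this directory is authoritative):

```yaml
global:
  scrape_interval: 15s

scrape_configs:
  # Prometheus scrapes its own /metrics endpoint
  - job_name: prometheus
    static_configs:
      - targets: ["localhost:9090"]

remote_write:
  # Forward scraped samples to the Elasticsearch node started from source.
  # host.docker.internal reaches the host machine from inside the container
  # on Docker Desktop; the endpoint path below is a placeholder only.
  - url: http://host.docker.internal:9200/<remote-write-endpoint>
```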