# Engine benchmarks
This directory contains a Python script, `bench_download.py`, for downloading
Engine and stdlib benchmark results from GitHub, and the `Engine_Benchs` Enso
project for analyzing the downloaded data.

Note that, for convenience, the `bench_tool` directory is a Python package
used by the `bench_download.py` script. To run all the Python tests for that
package, run `python -m unittest` in this directory.
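For reference, the same kind of test run can be driven programmatically with the standard `unittest` discovery API. The snippet below is a self-contained sketch: it generates a throwaway test module in a temporary directory rather than assuming the `bench_tool` tests are present.

```python
import pathlib
import tempfile
import unittest

# Create a throwaway directory with one test module, then discover and run it,
# mirroring what `python -m unittest` does for a package's test*.py files.
with tempfile.TemporaryDirectory() as tmp:
    (pathlib.Path(tmp) / "test_example.py").write_text(
        "import unittest\n"
        "class ExampleTest(unittest.TestCase):\n"
        "    def test_passes(self):\n"
        "        self.assertEqual(1 + 1, 2)\n"
    )
    # Discovery picks up every module matching the pattern under start_dir.
    suite = unittest.TestLoader().discover(start_dir=tmp, pattern="test*.py")
    result = unittest.TextTestRunner(verbosity=2).run(suite)
    print("tests run:", result.testsRun)
```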
Dependencies for `bench_download.py`:

- pandas and jinja2:

  ```bash
  pip install pandas jinja2
  ```

- The GitHub CLI (`gh`):

  ```bash
  sudo apt-get install gh
  ```

Check `bench_download.py -h` for documentation and usage. Ensure that your
`/usr/bin/env python` links to a Python version of at least 3.7.
`bench_download.py` creates a `generated_site` directory with HTML files for
visualizing the benchmark results.
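As a rough illustration of the kind of jinja2 templating behind such generated HTML, consider the minimal sketch below. The template and field names (`title`, `benchmarks`, `label`, `score`) are hypothetical, not the script's actual ones.

```python
from jinja2 import Template

# Hypothetical minimal template; the real generated_site pages are richer
# (charts, navigation, etc.), but the rendering mechanism is the same.
template = Template(
    "<html><body><h1>{{ title }}</h1>"
    "<ul>{% for b in benchmarks %}"
    "<li>{{ b.label }}: {{ b.score }} ms</li>"
    "{% endfor %}</ul></body></html>"
)

html = template.render(
    title="Engine benchmarks",
    benchmarks=[{"label": "org.enso.example.SumBench", "score": 12.3}],
)
print(html)
```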
One can also download only a CSV file representing all the selected benchmark
results with `bench_download.py --create-csv`.
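Such a CSV is convenient for ad-hoc analysis with pandas. The columns in this sketch (`label`, `branch`, `commit_timestamp`, `score`) are illustrative, not the script's actual schema:

```python
import io

import pandas as pd

# Illustrative data shaped like a benchmark-results export; the actual
# columns produced by --create-csv may differ.
csv_text = """label,branch,commit_timestamp,score
org.enso.example.SumBench,develop,2023-05-01T10:00:00,12.3
org.enso.example.SumBench,develop,2023-05-02T10:00:00,11.8
"""

df = pd.read_csv(io.StringIO(csv_text), parse_dates=["commit_timestamp"])
# Average score per benchmark label across the selected runs.
print(df.groupby("label")["score"].mean())
```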
Run local tests with:

```bash
python -m unittest --verbose bench_tool/test*.py
```

Run a single test with:

```bash
python -m unittest --verbose bench_tool/test*.py -k <test_name>
```
The `bench_download.py` script is used in the Benchmarks Upload GH Action to
download the benchmarks generated by the Benchmark Engine and Benchmark
Standard Libraries GH Actions. The Benchmarks Upload action is triggered by
the `engine-benchmark.yml` and `std-libs-benchmark.yml` workflows.

The results from the benchmarks are gathered from the GH artifacts associated
with the corresponding workflow runs and saved as JSON files in the
https://github.com/enso-org/engine-benchmark-results repository, inside its
`cache` directory.
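A sketch of consuming one such cached JSON file is shown below. The field names (`bench_run`, `head_commit`, `label_score_dict`) are assumptions for illustration; the real schema is defined by the engine-benchmark-results repository.

```python
import json

# Hypothetical shape of a cached result file; consult the
# enso-org/engine-benchmark-results repo for the actual schema.
raw = """
{
  "bench_run": {"id": 123456, "head_commit": {"id": "abcdef0"}},
  "label_score_dict": {"org.enso.example.SumBench": 12.3}
}
"""

data = json.loads(raw)
commit = data["bench_run"]["head_commit"]["id"]
for label, score in data["label_score_dict"].items():
    print(f"{commit}: {label} = {score}")
```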