# nodejs-storage benchmarking


This is not a supported Google product

This benchmarking script is intended for use by Storage client library maintainers to benchmark various workloads and collect metrics in order to improve the performance of the library. Currently the benchmark runs a Write-1-Read-3 workload and measures throughput.

## Run example

This runs 10K iterations of Write-1-Read-3 on 5 KiB to 2 GiB files, and generates output to a CSV file:

```bash
$ cd nodejs-storage
$ npm install
$ cd build/internal-tooling
$ node performanceTest.js --iterations 10000
```

## CLI parameters

| Parameter | Description | Possible values | Default |
| --- | --- | --- | --- |
| `--iterations` | number of iterations to run | any positive integer | 100 |
| `--numthreads` | number of threads to run | any positive integer | 1 |
| `--bucket` | bucket to upload/download to/from | any string bucket name | nodejs-perf-metrics |
| `--small` | number of bytes for lower bound file size | any positive integer | 5120 |
| `--large` | number of bytes for upper bound file size | any positive integer | 2.147e9 |
| `--projectid` | project ID to use | any string project ID | undefined |

## Workload definition and CSV headers

For each invocation of the benchmark, write a new object of a random size between `small` and `large`. After the successful write, download the object in full three times. For each of the 4 operations, record the following fields:

| Field | Description |
| --- | --- |
| Op | the name of the operation (WRITE, READ[{0,1,2}]) |
| ObjectSize | the number of bytes of the object |
| LibBufferSize | configured to use the library default of 100 MiB |
| Crc32cEnabled | whether CRC32C was computed for the operation |
| MD5Enabled | whether MD5 was computed for the operation |
| ApiName | defaults to JSON |
| ElapsedTimeUs | the elapsed time in microseconds the operation took |
| Status | completion state of the operation [OK, FAIL] |
| AppBufferSize | N/A |
| CpuTimeUs | N/A |
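Assembling a CSV row from one operation record could look like the sketch below. The field order and the constant columns (the 100 MiB library buffer size and the JSON API name) follow the table above; the record shape and function names are illustrative assumptions, not the script's actual code.

```javascript
// CSV header matching the field table above.
const CSV_HEADER = [
  'Op', 'ObjectSize', 'LibBufferSize', 'Crc32cEnabled', 'MD5Enabled',
  'ApiName', 'ElapsedTimeUs', 'Status', 'AppBufferSize', 'CpuTimeUs',
].join(',');

// Format one operation record as a CSV row. LibBufferSize is the
// library default of 100 MiB; AppBufferSize and CpuTimeUs are N/A.
function toCsvRow(record) {
  return [
    record.op,
    record.objectSize,
    100 * 1024 * 1024,
    record.crc32cEnabled,
    record.md5Enabled,
    'JSON',
    record.elapsedTimeUs,
    record.status,
    'N/A',
    'N/A',
  ].join(',');
}
```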