docs/plans/benchmark-improvements/006-update-benchmarking-documentation.md
Update docs/benchmarking.md and related documentation to reflect the current state of the benchmark suite, including any changes from tinybench migration and newly added benchmarks.
The docs/benchmarking.md file was created as part of the initial benchmark implementation. It needs to be reviewed and updated to ensure accuracy after the migration from Benchmark.js to tinybench:
| Section | Current | Update To |
|---|---|---|
| Overview | References Benchmark.js | Reference tinybench |
| Installation | N/A | Verify dependencies |
| Code examples | Benchmark.js API | tinybench API |
| Troubleshooting | Benchmark.js specific | tinybench specific |
All code examples need updating for tinybench API:
Creating a benchmark:
```ts
// Update from Benchmark.Suite to Bench
import { withCodSpeed } from '@codspeed/tinybench-plugin'
import { Bench } from 'tinybench'

const bench = withCodSpeed(new Bench({ name: 'my-benchmarks' }))

bench.add('benchmark name', async () => {
  await myAsyncOperation()
})

await bench.run()
console.table(bench.table())
```
Event handling:
```ts
// Update event listener pattern
bench.addEventListener('cycle', (evt) => {
  console.log(evt.task?.name, evt.task?.result)
})
```
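This listener pattern works because tinybench's `Bench` extends the standard `EventTarget`, so DOM-style events apply. A self-contained sketch of the mechanism (`FakeBench` and its payload shape are stand-ins for illustration, not tinybench's real types):

```ts
// Stand-in class showing the EventTarget pattern tinybench builds on
class FakeBench extends EventTarget {
  runTask(name: string) {
    const evt: any = new Event('cycle')
    // tinybench attaches the finished task to the event; mimic that loosely
    evt.task = { name, result: { ok: true } }
    this.dispatchEvent(evt)
  }
}

const fake = new FakeBench()
const seen: string[] = []
fake.addEventListener('cycle', (evt: any) => {
  seen.push(evt.task?.name)
})
fake.runTask('benchmark1') // seen is now ['benchmark1']
```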
| Section | Review Items |
|---|---|
| Overview | Update library name, verify descriptions |
| Benchmark Categories | Ensure all categories are documented |
| Running Benchmarks | Verify commands still work |
| Benchmark Files | Update file list if changed |
| Benchmark Scenarios | Add any new benchmarks |
| Adding New Benchmarks | Update code templates |
| Profiling | Verify profiling commands |
| CI Integration | Review workflow accuracy |
| Data Configuration | Verify seed configs |
| Troubleshooting | Update for tinybench issues |
| Interpreting Results | Update output format |
| Performance Targets | Add baseline measurements (task 005) |
Consider adding:
All code examples should use:
```ts
import { withCodSpeed } from '@codspeed/tinybench-plugin'
import { Bench } from 'tinybench'
```
Replace all instances of:
```ts
const suite = withCodSpeed(new Benchmark.Suite('name'))
```
With:
```ts
const bench = withCodSpeed(new Bench({ name: 'name' }))
```
Replace deferred pattern:
```ts
suite.add('name', {
  defer: true,
  fn: (deferred) => {
    asyncOp().then(() => deferred.resolve())
  },
})
```
With native async:
```ts
bench.add('name', async () => {
  await asyncOp()
})
```
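The equivalence between the two patterns can be made explicit. As an illustration (this helper is hypothetical, not part of tinybench or Benchmark.js), a deferred-style body can be wrapped into a promise-returning function:

```ts
// Hypothetical shim showing why the deferred pattern maps onto native async:
// deferred.resolve() becomes the promise's resolve
type Deferred = { resolve: () => void }

function fromDeferred(fn: (deferred: Deferred) => void): () => Promise<void> {
  return () => new Promise<void>((resolve) => fn({ resolve }))
}

// The old deferred-style body runs unchanged, but callers can now await it
const asyncOp = () => Promise.resolve()
const migrated = fromDeferred((deferred) => {
  asyncOp().then(() => deferred.resolve())
})
```

In practice the body should simply be rewritten as a native async function; the shim only illustrates why the migration is mechanical.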
tinybench uses `console.table(bench.table())`, which outputs a different format:
```
┌─────────┬───────────────┬─────────────┬───────────────────┬──────────┬─────────┐
│ (index) │ Task Name     │ ops/sec     │ Average Time (ns) │ Margin   │ Samples │
├─────────┼───────────────┼─────────────┼───────────────────┼──────────┼─────────┤
│ 0       │ 'benchmark1'  │ '1,234,567' │ 810.005           │ '±0.50%' │ 617284  │
└─────────┴───────────────┴─────────────┴───────────────────┴──────────┴─────────┘
```
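When interpreting the new columns, ops/sec is the reciprocal of the average per-iteration latency (modulo display rounding in the sample row), e.g.:

```ts
// ops/sec is the reciprocal of the average per-iteration latency
const NS_PER_SEC = 1e9

function opsPerSec(averageNs: number): number {
  return NS_PER_SEC / averageNs
}

console.log(Math.round(opsPerSec(810.005))) // ≈ 1,234,560 for the row above
```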
Document required dependencies:
```json
{
  "devDependencies": {
    "@codspeed/tinybench-plugin": "5.0.1",
    "tinybench": "^4.0.1"
  }
}
```
The benchmarking section in AGENTS.md should also be reviewed.
After updates:
```sh
# Verify all documented commands work
pnpm bench
pnpm bench query-performance
pnpm bench compilation
pnpm bench interpreter

# Verify profiling commands work
node --cpu-prof -r esbuild-register packages/client/src/__tests__/benchmarks/query-performance/query-performance.bench.ts
```
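The `--cpu-prof` flag writes a V8 profile (a `CPU.*.cpuprofile` file) into the working directory, or into `--cpu-prof-dir` if given; the file can be opened in Chrome DevTools. A quick sanity check that does not depend on the benchmark files (the inline loop is a stand-in workload):

```sh
# --cpu-prof writes a CPU.*.cpuprofile file into the working directory
node --cpu-prof -e "for (let i = 0; i < 1e6; i++);"
ls CPU.*.cpuprofile
```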