# Unit test reports
Unit test reports display test results directly in merge requests and pipeline details, so you can identify failures without searching through job logs.
Unit test reports require the JUnit XML format and do not affect job status. To make a job fail when tests fail, your job's script must exit with a non-zero status.
GitLab Runner uploads your test results as artifacts in JUnit XML format. In a merge request, the test results from the source branch (head) are compared with those from the target branch (base) to show what changed.
Unit test reports must use JUnit XML format with specific requirements to ensure proper parsing and display.
Your test report files must:

- Use the JUnit XML format.
- Have a `.xml` file extension.

If you have duplicate test names, only the first test is used; others with the same name are ignored.
For test case limits, see Maximum test cases per unit test report.
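The file rules above can be checked locally before a pipeline runs. The following is a minimal sketch using Python's standard library; the `check_report` helper and the sample report are illustrative, not part of GitLab:

```python
# Illustrative pre-upload checks for a JUnit report file.
# check_report is a hypothetical helper, not a GitLab API.
import xml.etree.ElementTree as ET


def check_report(path, xml_text):
    """Return a list of problems found in a candidate test report."""
    problems = []
    if not path.endswith(".xml"):
        problems.append(f"{path}: must have a .xml file extension")
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return problems + [f"{path}: not well-formed XML ({exc})"]
    # GitLab keeps only the first test case per name, so flag duplicates.
    seen = set()
    for case in root.iter("testcase"):
        key = (case.get("classname"), case.get("name"))
        if key in seen:
            problems.append(f"{path}: duplicate test name {key}")
        seen.add(key)
    return problems


report = """<testsuites><testsuite name="suite">
<testcase classname="LoginTest" name="test_a"/>
<testcase classname="LoginTest" name="test_a"/>
</testsuite></testsuites>"""
issues = check_report("rspec.xml", report)
print(issues)
```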
GitLab parses a subset of JUnit XML elements and attributes to display test results in the UI.
| XML Element | XML Attribute | Description |
|---|---|---|
| `testsuites` | `time` | Total execution time for all test suites. Used for test execution time calculations. |
| `testsuite` | `name` | Test suite name. Parsed for internal grouping. |
| `testsuite` | `time` | Execution time for an individual test suite. Used for test execution time calculations. |
| `testcase` | `classname` | Test class or category name. Displayed as the suite name in the UI. |
| `testcase` | `name` | Individual test name. |
| `testcase` | `file` | File path where the test is defined. |
| `testcase` | `time` | Test execution time in seconds. |
| `failure` | Element content | Failure message and stack trace. |
| `error` | Element content | Error message and stack trace. |
| `skipped` | Element content | Reason for skipping the test. |
| `system-out` | Element content | System output and attachment tags. Only parsed from `testcase` elements. |
| `system-err` | Element content | System error output. Only parsed from `testcase` elements. |
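As a sketch of what the parser keeps, the script below reads a report with Python's standard library and extracts only the attributes listed in the table. The sample XML is illustrative:

```python
# Extract only the fields GitLab displays from a JUnit report.
import xml.etree.ElementTree as ET

SAMPLE = """
<testsuites time="0.23">
  <testsuite name="Authentication Tests" time="0.23">
    <testcase classname="LoginTest" name="test_invalid_password"
              file="spec/auth_spec.rb" time="0.23">
      <failure>Expected authentication to fail</failure>
    </testcase>
  </testsuite>
</testsuites>
"""

root = ET.fromstring(SAMPLE)
for case in root.iter("testcase"):
    failure = case.find("failure")
    print({
        "suite": case.get("classname"),  # shown as the suite name in the UI
        "name": case.get("name"),
        "file": case.get("file"),
        "time_s": float(case.get("time")),
        "failure": failure.text if failure is not None else None,
    })
```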
The following elements and attributes are not parsed:
- `testsuite` attributes (`tests`, `failures`, `errors`, `timestamp`)
- `testcase` attributes (`assertions`, `line`, `status`)
- `properties` elements
- `system-out` and `system-err` at the `testsuite` level

The following example shows a JUnit XML file with these elements:

```xml
<testsuites>
  <testsuite name="Authentication Tests" tests="1" failures="1">
    <testcase classname="LoginTest" name="test_invalid_password" file="spec/auth_spec.rb" time="0.23">
      <failure>Expected authentication to fail</failure>
      <system-out>[[ATTACHMENT|screenshots/failure.png]]</system-out>
    </testcase>
  </testsuite>
</testsuites>
```
This XML displays in GitLab as:
- `LoginTest` (from the `testcase` `classname`)
- `test_invalid_password` (from the `testcase` `name`)
- `spec/auth_spec.rb` (from the `testcase` `file`)
- 0.23s (from the `testcase` `time`)
- A screenshot attachment (from the attachment tag in the `testcase` `system-out`)
- Internal grouping under `Authentication Tests` (from the `testsuite` `name`)

Test results are compared between the merge request's source and target branches to show what changed.
If branches cannot be compared, for example when there is no target branch data yet, only the failed tests from your branch are shown.
For tests that failed in the default branch in the last 14 days, you see a message like `Failed {n} time(s) in {default_branch} in the last 14 days`.
This count includes failed tests from completed pipelines, but not blocked pipelines.
Support for blocked pipelines is proposed in issue 431265.
Configure unit test reports to display test results in merge requests and pipelines.
To configure unit test reports:
1. In your `.gitlab-ci.yml` file, add `artifacts:reports:junit` to your test job.
1. Optional. To make the report file browsable and downloadable, also add its path to `artifacts:paths`.
1. Optional. To upload the report even when tests fail, set `artifacts:when: always`.

Example configuration for Ruby with RSpec:

```yaml
ruby:
  stage: test
  script:
    - bundle install
    - bundle exec rspec --format progress --format RspecJunitFormatter --out rspec.xml
  artifacts:
    when: always
    paths:
      - rspec.xml
    reports:
      junit: rspec.xml
```
You can view test results:

- In merge requests, in the Test summary panel.
- In pipeline details, including results from child pipelines.
- Through the Pipelines API.
View detailed information about test failures in merge requests.
The Test summary panel shows an overview of your test results, including how many tests failed and passed.
To view test failure details, in the Test summary panel, select **View details** next to a failed test.
The dialog displays the test name, file path, execution time, screenshot attachment (if configured), and error output.
To view all test results, select **Full report** in the Test summary panel.
Copy test names to rerun them locally for debugging.
Prerequisites:

- Your JUnit XML file must include `file` attributes for failed tests.

To copy all failed test names, in the Test summary panel, select **Copy failed tests**.
The failed tests are copied as a space-separated string.
To copy a single failed test name, open the failed test's details dialog and copy the test name from there.
The test name is copied to your clipboard.
View all test suites and cases in pipeline details, including results from child pipelines.
To view pipeline test results, go to the pipeline details page and select the **Tests** tab.
You can also retrieve test reports with the Pipelines API.
Test results display different timing metrics:
- Pipeline duration: Elapsed time from when the pipeline starts until it completes.
- Test execution time: Total time spent running all tests across all jobs, added together.
- Queue time: Time jobs spent waiting for available runners.
When jobs run in parallel, cumulative test execution time can exceed pipeline duration.
Pipeline duration shows how long you wait for results, while test execution time shows compute resources used.
For example, a pipeline that completes in 81 minutes might show 9 hours 10 minutes of test execution time if many test jobs run in parallel across multiple runners.
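The cumulative figure is essentially the sum of the `time` attributes across every uploaded report. The following sketch illustrates the arithmetic with made-up numbers for three parallel jobs:

```python
import xml.etree.ElementTree as ET

# Hypothetical reports from three parallel jobs (times in seconds).
reports = [
    '<testsuites><testsuite name="a" time="1200"/></testsuites>',
    '<testsuites><testsuite name="b" time="1500"/></testsuites>',
    '<testsuites><testsuite name="c" time="1350"/></testsuites>',
]

# Sum suite times across every job's report.
total = sum(
    float(suite.get("time", 0))
    for xml in reports
    for suite in ET.fromstring(xml).iter("testsuite")
)
print(f"cumulative test execution time: {total / 60:.1f} minutes")
# → cumulative test execution time: 67.5 minutes
```

With parallel jobs, wall-clock pipeline duration tracks the longest job rather than the sum of all jobs, so it can be far lower than this total.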
Add screenshots to test reports to help debug test failures.
To add screenshots to test reports:
In your JUnit XML file, add attachment tags with screenshot paths relative to $CI_PROJECT_DIR:
```xml
<testcase time="1.00" name="Test">
  <system-out>[[ATTACHMENT|/path/to/some/file]]</system-out>
</testcase>
```
In your `.gitlab-ci.yml` file, configure your job to upload screenshots as artifacts. Add the screenshot directory to `artifacts:paths`, and set `artifacts:when: always` to upload screenshots when tests fail. For example:

```yaml
ruby:
  stage: test
  script:
    - bundle install
    - bundle exec rspec --format progress --format RspecJunitFormatter --out rspec.xml
    # Your test framework should save screenshots to the screenshots/ directory
  artifacts:
    when: always
    paths:
      - rspec.xml
      - screenshots/
    reports:
      junit: rspec.xml
```
Run your pipeline.
You can access the screenshot link in the test details dialog when you select View details for a failed test in the Test summary panel.
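If your test framework can't write the attachment tags itself, one option is to post-process the report before it's uploaded. The following is an illustrative sketch; the `screenshots/<test name>.png` naming scheme is an assumption for this example, not a GitLab convention:

```python
# Add [[ATTACHMENT|...]] tags to failed test cases in a JUnit report.
# The screenshots/<test name>.png naming scheme is made up for this example.
import xml.etree.ElementTree as ET

report = """<testsuites><testsuite name="ui">
<testcase classname="LoginTest" name="test_invalid_password">
  <failure>Expected authentication to fail</failure>
</testcase>
</testsuite></testsuites>"""

root = ET.fromstring(report)
for case in root.iter("testcase"):
    if case.find("failure") is not None:
        # GitLab reads attachment tags from system-out inside testcase.
        out = ET.SubElement(case, "system-out")
        out.text = f"[[ATTACHMENT|screenshots/{case.get('name')}.png]]"

print(ET.tostring(root, encoding="unicode"))
```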
You might see an empty Test summary panel in merge requests. This issue can occur when the artifacts that contain the test reports have expired.

To resolve this issue, set a longer `expire_in` value for the report artifact, or run a new pipeline to generate a new report.
JUnit files that exceed the maximum size limits are not processed, so ensure your generated report files stay within those limits.
Support for custom limits is proposed in epic 16374.
You might see fewer test results than expected in your reports.
This can happen when you have duplicate test names in your JUnit XML file. Only the first test for each name is used and duplicates are ignored.
To resolve this issue, ensure all test names and classes are unique.
You might not see the Test summary panel at all in merge requests.
This issue can happen when the target branch has no test data for comparison.
To resolve this issue, run a pipeline on your target branch to generate baseline test data.
You might see parsing error indicators next to job names in your pipeline.
This can happen when JUnit XML files contain formatting errors or invalid elements.
To resolve this issue, check your JUnit XML files for formatting errors and regenerate the report.
For grouped jobs, only the first parsing error from the group is displayed.
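A quick way to reproduce a parsing error locally is to run the report through an XML parser yourself; a minimal sketch using Python's standard library:

```python
import xml.etree.ElementTree as ET

# A report with a formatting error: the testcase element is never closed.
bad = "<testsuites><testsuite name='x'><testcase name='t'></testsuite></testsuites>"

try:
    ET.fromstring(bad)
    print("report is well-formed XML")
except ET.ParseError as exc:
    print(f"parse error: {exc}")
```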