doc/ci/jobs/job_execution.md
The job execution flow describes how GitLab Runner processes CI/CD jobs from start to finish.
GitLab Runner executes CI/CD jobs after it receives a job, retrieves secrets from a vault (if configured), and prepares the executor. Every CI/CD job executes as a series of sequential steps, with each step running in a separate shell context. The runner:
1. Prepares the source code for the job:
   - `pre_get_sources_script`, if it's defined in the configuration
   - `git fetch` and other source handling commands, unless the `none` strategy is configured
   - `post_get_sources_script`, if it's defined in the configuration
1. Downloads cached files, if a cache is configured and the previous step succeeded.
1. Downloads artifacts from previous jobs, if artifact downloading is configured and the previous step succeeded.
1. Executes the main job scripts, if the previous step succeeded:
   - `pre_build_script`, if it's defined in the configuration
   - `before_script` commands, if they're defined
   - `script` commands
   - `post_build_script`, if it's defined in the configuration
1. Executes `after_script` commands, if they're defined, regardless of whether previous steps failed.
1. Uploads files to the cache, if cache uploading is configured, regardless of whether previous steps failed.
1. Uploads artifacts, if artifact uploading is configured, regardless of whether previous steps failed.
1. Uploads referee data, if referee uploading is configured, regardless of whether previous steps failed.
1. Performs cleanup operations, if they're configured, regardless of whether previous steps failed.
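The gating behavior above can be sketched in plain shell. This is an illustrative model, not GitLab Runner's actual implementation: the `run_step` and `always_step` helpers and the step commands are hypothetical, but they mirror the rule that steps 2-4 run only if the previous step succeeded, while `after_script`, uploads, and cleanup run regardless.

```shell
#!/bin/sh
# Sketch of the runner's step gating (illustrative, not runner internals).

job_ok=true

run_step() {  # $1 = step name, $2 = command; skipped after a failure
  if [ "$job_ok" = true ]; then
    echo "running: $1"
    if ! sh -c "$2"; then
      job_ok=false
    fi
  else
    echo "skipping: $1"
  fi
}

always_step() {  # $1 = step name, $2 = command; runs even after a failure
  echo "running (always): $1"
  sh -c "$2" || true
}

run_step    "get sources"        "echo fetched"
run_step    "download cache"     "echo cache restored"
run_step    "download artifacts" "echo artifacts restored"
run_step    "main script"        "echo building; false"   # simulated failure
always_step "after_script"       "echo after_script ran"
always_step "upload cache"       "echo cache uploaded"
always_step "upload artifacts"   "echo artifacts uploaded"
always_step "cleanup"            "echo cleaned up"

if [ "$job_ok" = true ]; then echo "Job succeeded"; else echo "Job failed"; fi
```

With the simulated `script` failure, the upload and cleanup steps still run, and the job finishes with a failed status.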
```mermaid
%%{init: { "fontFamily": "GitLab Sans" }}%%
flowchart TD
    accTitle: GitLab CI/CD Job Execution Flow
    accDescr: Shows the complete 9-step job execution sequence from source preparation through cleanup operations.

    Start([Job Starts]) --> Source[1. Source preparation
      <small>Export variables, run <code>pre_get_sources_script</code>,</small>
      <small><code>git fetch</code>, submodules, <code>post_get_sources_script</code>.</small>]
    Source --> Cache[2. Download cache
      <small>If configured and previous step succeeds.</small>]
    Cache --> Artifacts[3. Download artifacts
      <small>If configured and previous step succeeds.</small>]
    Artifacts --> MainExec[4. Main execution
      <small>Export variables, <code>pre_build_script</code>,</small>
      <small><code>before_script</code>, <code>script</code>, <code>post_build_script</code>.</small>]
    MainExec --> AfterScript[5. <code>after_script</code>
      <small>Always runs if defined.</small>
      <small>Files created here are included.</small>]
    AfterScript --> Critical[⚠️ CRITICAL: <code>after_script</code> runs BEFORE upload stages.]
    Critical --> UploadCache[6. Upload cache
      <small>Always runs if configured.</small>
      <small>Failure may affect job status.</small>]
    Critical --> UploadArtifacts[7. Upload artifacts
      <small>Always runs if configured.</small>
      <small>Failure may affect job status.</small>]
    UploadCache --> UploadReferees[8. Upload referees
      <small>Always runs if configured.</small>
      <small>Failure doesn't affect job status.</small>]
    UploadArtifacts --> UploadReferees
    UploadReferees --> Cleanup[9. Cleanup operations
      <small>Always runs if configured.</small>
      <small>Delete file-based variables.</small>]
    Cleanup --> End([Job Complete])
```
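Because `after_script` runs before the upload stages, files it creates are still present when artifacts are collected. The following sketch simulates that ordering; the file names and the simulated upload step are hypothetical, chosen only to make the timing visible.

```shell
#!/bin/sh
# Sketch: a file written in after_script exists by upload time (illustrative).

workdir=$(mktemp -d)
cd "$workdir" || exit 1

# Step 4: main execution, simulated failure after producing output.
sh -c 'echo "build output" > build.log; exit 1' || true

# Step 5: after_script still runs, and runs BEFORE any upload stage.
sh -c 'echo "collected diagnostics" > debug.log'

# Steps 6-7: at upload time, both files are visible in the working directory.
echo "files present at upload time:"
ls
```

A debug log written in `after_script` can therefore be captured by an `artifacts` definition, even when the main script failed.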
Each shell context is isolated by design. The only connection between contexts is the shared working directory file system. As a result:

- Variables exported in one context (for example, `export my_variable=$(date)`) are not available in other contexts.
- Each script should set its own shell options, for example `set -eo pipefail` (for Unix shells), to fail early on the first error.
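The isolation can be reproduced with plain subshells. This is a local approximation of the runner's behavior, not the runner itself: each parenthesized group stands in for one job step's shell context, and the `timestamp.txt` file name is hypothetical.

```shell
#!/bin/sh
# Sketch: exported variables do not cross shell contexts, but files in the
# shared working directory do (subshells approximate separate contexts).

workdir=$(mktemp -d)

# "Step 1": export a variable and write a file to the working directory.
( cd "$workdir" || exit 1
  my_variable="$(date)"
  export my_variable
  printf '%s\n' "$my_variable" > timestamp.txt )

# "Step 2": a separate context.
( cd "$workdir" || exit 1
  # The exported variable is gone here...
  echo "my_variable is: ${my_variable:-unset}"
  # ...but the file on the shared file system survives.
  cat timestamp.txt )
```

To pass a value between steps, write it to a file (or use `artifacts` / `dotenv` reports); relying on exported variables does not work across contexts.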