# Import and export settings
Settings for import- and export-related features.
Before you can import projects from other systems, you must enable the import source for that system.
Only import projects from sources you trust. If you import a project from an untrusted source,
an attacker could steal your sensitive data. For example, an imported project
with a malicious `.gitlab-ci.yml` file could allow an attacker to exfiltrate group CI/CD variables.
GitLab Self-Managed administrators can reduce their attack surface by disabling import sources they don't need:
To enable the export of projects and their data:
{{< alert type="warning" >}}

In GitLab 16.1 and earlier, you should not use direct transfer with scheduled scan execution policies. If you use direct transfer, first upgrade to GitLab 16.2 and ensure security policy bots are enabled in the projects you are enforcing.

{{< /alert >}}
Migration of groups and projects by direct transfer is disabled by default. To enable migration of groups and projects by direct transfer:
The same setting is available in the API as the `bulk_import_enabled` attribute.
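For example, assuming an administrator access token and `gitlab.example.com` as a placeholder instance host, you can toggle the attribute with a single API request:

```shell
# Enable migration of groups and projects by direct transfer for the instance.
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?bulk_import_enabled=true"
```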
{{< history >}}

- Introduced with a flag named `export_audit_events`. Disabled by default.
- Feature flag `export_audit_events` removed.

{{< /history >}}
Enable silent admin exports to prevent audit events when instance administrators trigger a project or group file export or download the export file. Exports from non-administrators still generate audit events.
To enable silent admin project and group file exports:
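As a sketch, the same toggle may also be settable through the application settings API. The `silent_admin_exports_enabled` attribute name, token, and host below are assumptions to verify against your GitLab version:

```shell
# Assumed attribute name: silent_admin_exports_enabled -- verify it in the
# application settings API reference for your GitLab version.
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?silent_admin_exports_enabled=true"
```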
{{< history >}}

- Introduced with a flag named `importer_user_mapping`. Disabled by default.
- Feature flag `importer_user_mapping` removed.

{{< /history >}}
To allow mapping of imported user contributions to administrators:
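As a sketch, this setting may also be exposed through the application settings API. The `allow_contribution_mapping_to_admins` attribute name and the host below are assumptions to check against your GitLab version:

```shell
# Assumed attribute name: allow_contribution_mapping_to_admins -- verify it in
# the application settings API reference for your GitLab version.
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?allow_contribution_mapping_to_admins=true"
```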
{{< history >}}

- Introduced with a flag named `importer_user_mapping_allow_bypass_of_confirmation`. Disabled by default.
- Feature flag `importer_user_mapping_allow_bypass_of_confirmation` removed.

{{< /history >}}
Prerequisites:
To skip confirmation when administrators reassign placeholder users:
When this setting is enabled, administrators can reassign contributions and memberships to non-bot users with any of the following states:
- `active`
- `banned`
- `blocked`
- `blocked_pending_approval`
- `deactivated`
- `ldap_blocked`
To modify the maximum file size for exports in GitLab:
To modify the maximum file size for imports in GitLab:
This setting applies only to repositories imported from a GitLab export file.
If you choose a size larger than the configured value for the web server, you may receive errors. See the troubleshooting section for more details.
For GitLab.com repository size limits, see the account and limit settings.
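Both limits can also be set through the application settings API as `max_export_size` and `max_import_size`. The values are in MB, with `0` meaning unlimited; the host and token below are placeholders:

```shell
# Cap project exports at 250 MB and file imports at 100 MB (0 would mean no limit).
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?max_export_size=250&max_import_size=100"
```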
By default, the maximum remote file size for imports from external object storages (for example, AWS) is 10 GiB.
To modify this setting:
Set the value to `0` for no file size limit.
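This limit is also exposed in the application settings API as `max_import_remote_file_size`, in MiB; for example, with placeholder credentials:

```shell
# Lower the remote import file size limit from the 10 GiB default to 5 GiB (5120 MiB).
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?max_import_remote_file_size=5120"
```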
By default, the maximum download file size for imports by direct transfer is 5 GiB.
To modify this setting:
Set the value to `0` for no file size limit.
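This limit is also exposed in the application settings API as `bulk_import_max_download_file_size`, in MiB; for example, with placeholder credentials:

```shell
# Raise the direct transfer download limit from the 5 GiB default to 10 GiB (10240 MiB).
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?bulk_import_max_download_file_size=10240"
```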
When you import a project using file exports or direct transfer, you can specify the maximum decompressed file size for imported archives. The default value is 25 GiB.
When you import a compressed file, the decompressed size cannot exceed the maximum decompressed file size limit. If the decompressed size exceeds the configured limit, the following error is returned:
`Decompressed archive size validation failed.`
To modify this setting:
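The limit is also exposed in the application settings API as `max_decompressed_archive_size`. The value below assumes the attribute takes MiB; confirm the unit in the API reference for your GitLab version:

```shell
# Assumed unit: MiB (10240 MiB = 10 GiB, below the documented 25 GiB default).
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?max_decompressed_archive_size=10240"
```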
When you import a project, you can specify the maximum timeout for decompressing imported archives. The default value is 210 seconds.
To modify this timeout:
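The timeout is also exposed in the application settings API as `decompress_archive_file_timeout`, in seconds; for example, with placeholder credentials:

```shell
# Allow up to 600 seconds (instead of the 210-second default) for decompression.
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?decompress_archive_file_timeout=600"
```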
You can specify the maximum number of import jobs that are executed simultaneously for:
The job limit is not applied when importing merge requests because there is a hard-coded limit for merge requests to avoid overloading servers.
The default job limit is:
To modify this setting:
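As a sketch, per-importer limits such as `concurrent_github_import_jobs_limit` are exposed in the application settings API; the attribute name, value, and host below are assumptions to check against your GitLab version:

```shell
# Assumed attribute: concurrent_github_import_jobs_limit -- verify it in the
# application settings API reference for your GitLab version.
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?concurrent_github_import_jobs_limit=500"
```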
Direct transfer exports can consume a significant amount of resources.
To prevent exhausting the database or Sidekiq processes,
administrators can configure the `concurrent_relation_batch_export_limit` setting.
The default value is 8 jobs, which corresponds to a
reference architecture for up to 40 RPS or 2,000 users.
If you encounter `PG::QueryCanceled: ERROR: canceling statement due to statement timeout` errors,
or jobs are interrupted because they exceed the Sidekiq memory limits, reduce this number.
If you have enough resources, you can increase this number to process more concurrent export jobs.
To modify this setting, send an API request to `/api/v4/application/settings`
with `concurrent_relation_batch_export_limit`.
For more information, see the application settings API.
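For example, with a placeholder host and administrator token, the request described above looks like:

```shell
# Halve the concurrent relation batch export limit from 8 to 4 jobs.
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?concurrent_relation_batch_export_limit=4"
```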
To further manage memory usage and database load, use the `relation_export_batch_size` setting to control the number of records processed in each batch during export operations.
The default value is 50 records per batch. To modify this setting, send an API request to `/api/v4/application/settings` with `relation_export_batch_size`.
For more information, see the application settings API.
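For example, with a placeholder host and administrator token:

```shell
# Process 25 records per batch instead of the 50-record default.
curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/application/settings?relation_export_batch_size=25"
```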
While enabling application settings like import sources, you might get a
`Help page documentation base url is blocked: execution expired`
error. To work around this error, add `docs.gitlab.com`, or the URL the help documentation pages redirect to, to the
allowlist.