.. Licensed to the Apache Software Foundation (ASF) under one or more contributor license agreements. See the NOTICE file distributed with this work for additional information regarding copyright ownership. The ASF licenses this file to you under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
.. http://www.apache.org/licenses/LICENSE-2.0
.. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
.. contents:: Apache Airflow Releases
   :local:
   :depth: 1
.. note:: Release notes for older versions can be found in the versioned documentation.
.. towncrier release notes start
Significant Changes
^^^^^^^^^^^^^^^^^^^
The ``/dags`` endpoint now requires additional permissions (``DagAccessEntity.RUN``, ``DagAccessEntity.HITL_DETAIL``, and ``DagAccessEntity.TASK_INSTANCE``), because the endpoint returns aggregated data from these multiple entities. Please update your custom user roles to include read access for DAG Runs, Task Instances, and HITL Details if those users should still have access to the ``/dags`` endpoint. (#64822)

Improvements
^^^^^^^^^^^^
Set the theme ``tokens`` field to ``{}`` to restore OSS defaults. The ``tokens`` field is now optional in the theme configuration. (#64552)

Bug Fixes
^^^^^^^^^
- ``DEFAULT_LOGGING_CONFIG`` to use right kwargs (#65412) (#65424)
- ``dispose_orm()`` not disposing async engine on shutdown (#65274) (#65284)
- ``get_team_name_dep`` creating wasted async sessions when ``multi_team=False`` (#65275) (#65282)
- ``disable_sqlite_fkeys`` to migration 0108 (#65288) (#65290)
- ``UPDATE`` to avoid row lock in the common case (#65029) (#65137)
- dropdowns in connection forms (#65007) (#65085) (#65138)
- ``SearchBar`` value not syncing with ``defaultValue`` changes (#65054) (#65140)
- ``$AIRFLOW_CONFIG`` env (#64936) (#65200)
- ``Session`` staying opened between yields (#65179) (#65195)
- ``Session`` leak from ``StreamingResponse`` API endpoints (#65162) (#65193)
- ``_token`` cookie exists from older Airflow instance (#64955) (#65177)
- ``@task`` decorator to validate operator arg types at decoration time (#65041) (#65050)
- ``is_alive`` default to ``None`` in jobs list CLI (#65065) (#65091)
- ``dag_id`` in ``get_task_instance`` (#64957) (#64968) (#65067)
- debounce on clear to prevent stale search value (#64893) (#64907)
- ``CommsDecoder`` (#64894) (#64946)
- ``UPDATE``\s inside ``disable_sqlite_fkeys`` in migration 0097 (#64876) (#64940)
- TI exists in TIH (#61631) (#64693)
- ``SerializedDagModel`` (#64322) (#64738)
- ``TypeError`` in ``GET /dags/{dag_id}/tasks`` when ``order_by`` field has ``None`` values (#64384) (#64587)
- ``DagRun`` (#64752) (#64853)
- ``connections import`` returning non-zero exit code on failure (#64416) (#64449)
- ``target`` and add ``rel`` attributes (#64542) (#64772)
- ``DagVersionSelect`` options not filtered by selected ``DagRun`` (#64736) (#64771)
- ``start_date`` in example DAGs to avoid timezone conversion overflow (#63882) (#64758)
- ``AirflowPlugin`` not re-exported, causing mypy errors in plugins (#65132) (#65163)
- ``apache-airflow-providers-fab`` minimum version to prevent connexion import error on Python 3.13 (#65523) (#65524)

Miscellaneous
^^^^^^^^^^^^^
- ``TriggerCommsDecoder`` sync req-res cycle (#64882) (#65285)
- ``write_to_os`` support for writing task logs to OpenSearch (#64364) (#65201)
- ``airflow_local_settings.py`` (#64764) (#65003)

Doc-only Changes
^^^^^^^^^^^^^^^^
Significant Changes
^^^^^^^^^^^^^^^^^^^
Asset Partitioning
""""""""""""""""""
The headline feature of Airflow 3.2.0 is asset partitioning — a major evolution of data-aware scheduling. Instead of triggering Dags based on an entire asset, you can now schedule downstream processing based on specific partitions of data. Only the relevant slice of data triggers downstream work, making pipeline orchestration far more efficient and precise.
This matters when working with partitioned data lakes — date-partitioned S3 paths, Hive table partitions, BigQuery table partitions, or any other partitioned data store. Previously, any update to an asset triggered all downstream Dags regardless of which partition changed. Now only the right work gets triggered at the right time.
For detailed usage instructions, see :doc:`/authoring-and-scheduling/assets`.
Multi-Team Deployments
""""""""""""""""""""""
Airflow 3.2 introduces multi-team support, allowing organizations to run multiple isolated teams within a single Airflow deployment. Each team can have its own Dags, connections, variables, pools, and executors, enabling true resource and permission isolation without requiring separate Airflow instances per team.
This is particularly valuable for platform teams that serve multiple data engineering or data science teams from shared infrastructure, while maintaining strong boundaries between teams' resources and access.
For detailed usage instructions, see :doc:`/core-concepts/multi-team`.
.. warning::

    Multi-Team Deployments are experimental in 3.2.0 and may change in future versions based on user feedback.
Synchronous callback support for Deadline Alerts
""""""""""""""""""""""""""""""""""""""""""""""""
Deadline Alerts now support synchronous callbacks via ``SyncCallback`` in addition to the existing
asynchronous ``AsyncCallback``. Synchronous callbacks are executed by the executor (rather than
the triggerer), and can optionally target a specific executor via the ``executor`` parameter.
A Dag can also define multiple Deadline Alerts by passing a list to the ``deadline`` parameter,
and each alert can use either callback type.
.. warning::

    Deadline Alerts are experimental in 3.2.0 and may change in future versions based on
    user feedback. Synchronous deadline callbacks (``SyncCallback``) do not currently
    support Connections stored in the Airflow metadata database.
For detailed usage instructions, see :doc:`/howto/deadline-alerts`.
UI Enhancements & Performance
"""""""""""""""""""""""""""""
Grid View Virtualization: The Grid view now uses virtualization -- only visible rows are rendered to the DOM. This dramatically improves performance when viewing Dags with large numbers of task runs, reducing render time and memory usage for complex Dags. (#60241)
XCom Management in the UI: You can now add, edit, and delete XCom values directly from the Airflow UI. This makes it much easier to debug and manage XCom state during development and day-to-day operations without needing CLI commands. (#58921)
HITL Detail History: The Human-in-the-Loop approval interface now includes a full history view, letting operators and reviewers see the complete audit trail of approvals and rejections for any task. (#56760, #55952)
Gantt Chart Improvements:
New --only-idle flag for the scheduler CLI
"""""""""""""""""""""""""""""""""""""""""""""""
The ``airflow scheduler`` command has a new ``--only-idle`` flag that counts a run toward
``--num-runs`` only when the scheduler is idle. This helps users run the scheduler once and
process all triggered Dags and queued tasks. The flag requires and complements ``--num-runs``,
so you can set a small value instead of guessing how many iterations the scheduler needs.
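For example, to process everything that is currently pending and then exit (the run count of 50 here is illustrative):

.. code-block:: bash

    # Count a loop toward --num-runs only when the scheduler was idle,
    # i.e. there was nothing left to schedule in that loop
    airflow scheduler --num-runs 50 --only-idle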
Replace per-run TI summary requests with a single NDJSON stream
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The grid, graph, gantt, and task-detail views now fetch task-instance
summaries through a single streaming HTTP request
(``GET /ui/grid/ti_summaries/{dag_id}?run_ids=...``) instead of one request
per run. The server emits one JSON line per run as soon as that run's task
instances are ready, so columns appear progressively rather than all at once.
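Consuming such a stream amounts to reading the response line by line and decoding each line independently. A minimal sketch (the field names below are illustrative, not the exact ``GridTISummaries`` schema):

```python
import io
import json


def iter_ti_summaries(lines):
    """Yield one parsed summary per NDJSON line as it arrives.

    ``lines`` is any iterable of text lines, e.g. an HTTP response
    body iterated line by line.
    """
    for line in lines:
        line = line.strip()
        if line:  # skip blank keep-alive lines
            yield json.loads(line)


# Simulated server response: one JSON object per line, one line per run.
body = io.StringIO(
    '{"run_id": "manual__2025-01-01", "task_count": 3}\n'
    '{"run_id": "manual__2025-01-02", "task_count": 5}\n'
)
summaries = list(iter_ti_summaries(body))
print([s["run_id"] for s in summaries])
```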
What changed:

- ``GET /ui/grid/ti_summaries/{dag_id}?run_ids=...`` is now the sole endpoint
  for TI summaries, returning an ``application/x-ndjson`` stream where each
  line is a serialized ``GridTISummaries`` object for one run.
- ``GET /ui/grid/ti_summaries/{dag_id}/{run_id}`` has been removed.
- ``dag_version_id``, avoiding redundant deserialization.
- ``run_ids``.

Structured JSON logging for all API server output
"""""""""""""""""""""""""""""""""""""""""""""""""
The new ``json_logs`` option under the ``[logging]`` section makes Airflow
produce all of its output as newline-delimited JSON (structured logs) instead of
human-readable formatted logs. This covers the API server (gunicorn/uvicorn),
including access logs, warnings, and unhandled exceptions.
Not all components support this yet (notably ``airflow celery worker``), but
any non-JSON output when ``json_logs`` is enabled will be treated as a bug. (#63365)
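A minimal configuration sketch (option name as described above):

.. code-block:: ini

    [logging]
    # Emit all API-server output as newline-delimited JSON
    json_logs = True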
Remove legacy OTel Trace metaclass and shared tracer wrappers
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The interfaces and functions located in ``airflow.traces`` were
internal code that provided a standard way to manage spans inside
Airflow itself. They were never intended as user-facing code
and were never documented. As they are no longer needed, they have
been removed in 3.2. (#63452)
Move task-level exception imports into the Task SDK
"""""""""""""""""""""""""""""""""""""""""""""""""""
Airflow now sources task-facing exceptions (``AirflowSkipException``, ``TaskDeferred``, etc.) from
``airflow.sdk.exceptions``. ``airflow.exceptions`` still exposes the same exceptions, but they are
proxies that emit ``DeprecatedImportWarning`` so Dag authors can migrate before the shim is removed.
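A typical Dag-author migration is a one-line import change (the helper function is a hypothetical illustration):

.. code-block:: python

    # Deprecated: still works, but emits DeprecatedImportWarning
    # from airflow.exceptions import AirflowSkipException

    # Preferred in Airflow 3.2+:
    from airflow.sdk.exceptions import AirflowSkipException


    def maybe_skip(record):  # hypothetical example function
        if record is None:
            raise AirflowSkipException("Nothing to process; skipping task")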
What changed:

- ``airflow-core`` at runtime.
- ``airflow.providers.common.compat.sdk`` centralizes compatibility imports for providers.

Behaviour changes:

- ``ValueError`` (instead of ``AirflowException``) is raised when ``poke_interval``/``timeout`` arguments are invalid.
- Importing from ``airflow.exceptions`` logs a warning directing users to the SDK import path.

Exceptions now provided by ``airflow.sdk.exceptions``:

- ``AirflowException`` and ``AirflowNotFoundException``
- ``AirflowRescheduleException`` and ``AirflowSensorTimeout``
- ``AirflowSkipException``, ``AirflowFailException``, ``AirflowTaskTimeout``, ``AirflowTaskTerminated``
- ``TaskDeferred``, ``TaskDeferralTimeout``, ``TaskDeferralError``
- ``DagRunTriggerException`` and ``DownstreamTasksSkipped``
- ``AirflowDagCycleException`` and ``AirflowInactiveAssetInInletOrOutletException``
- ``ParamValidationError``, ``DuplicateTaskIdFound``, ``TaskAlreadyInTaskGroup``, ``TaskNotFound``, ``XComNotFound``
- ``AirflowOptionalProviderFeatureException``

Backward compatibility:

- Imports from ``airflow.exceptions`` continue to work, though they log warnings.
- Providers can use ``airflow.providers.common.compat.sdk`` to keep one import path that works across supported Airflow versions.

Migration:

- Import task-facing exceptions from ``airflow.sdk.exceptions`` (or from the provider compat shim).
- Catch ``ValueError`` for invalid sensor arguments if your code previously caught ``AirflowException``.

Support numeric multiplier values for retry_exponential_backoff parameter
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The ``retry_exponential_backoff`` parameter now accepts numeric values to specify custom exponential backoff multipliers for task retries. Previously, this parameter only accepted boolean values (``True`` or ``False``), with ``True`` using a hardcoded multiplier of 2.0.

New behavior:

- Numeric values (e.g. ``2.0``, ``3.5``) directly specify the exponential backoff multiplier.
- ``retry_exponential_backoff=2.0`` doubles the delay between each retry attempt.
- ``retry_exponential_backoff=0`` or ``False`` disables exponential backoff (uses fixed ``retry_delay``).

Backwards compatibility:
Existing Dags using boolean values continue to work:
- ``retry_exponential_backoff=True`` → converted to ``2.0`` (maintains original behavior)
- ``retry_exponential_backoff=False`` → converted to ``0.0`` (no exponential backoff)

API changes:
The REST API schema for ``retry_exponential_backoff`` has changed from ``type: boolean`` to ``type: number``. API clients must use numeric values (boolean values will be rejected).
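The effect of a numeric multiplier can be sketched with plain arithmetic. This helper is illustrative only: it ignores the jitter and maximum-delay capping Airflow applies in practice.

```python
def retry_delays(retry_delay, multiplier, retries):
    """Approximate wait (seconds) before each retry attempt.

    Illustrative formula: the delay for attempt n grows as
    retry_delay * multiplier ** n. A falsy multiplier (0 or False)
    means exponential backoff is disabled and the delay stays fixed.
    """
    if not multiplier:
        return [retry_delay] * retries
    return [retry_delay * multiplier**n for n in range(retries)]


# A multiplier of 2.0 doubles the delay between attempts (300 s base delay)
print(retry_delays(300, 2.0, 4))  # [300.0, 600.0, 1200.0, 2400.0]
# 0 (or False) disables exponential backoff: fixed retry_delay
print(retry_delays(300, 0, 3))    # [300, 300, 300]
```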
Migration:
While boolean values in Python Dags are automatically converted for backwards compatibility, we recommend updating to explicit numeric values for clarity:
- ``retry_exponential_backoff=True`` → ``retry_exponential_backoff=2.0``
- ``retry_exponential_backoff=False`` → ``retry_exponential_backoff=0``

Move serialization/deserialization (serde) logic into Task SDK
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Airflow now sources serde logic from ``airflow.sdk.serde`` instead of
``airflow.serialization.serde``. Serializer modules have moved from ``airflow.serialization.serializers.*``
to ``airflow.sdk.serde.serializers.*``. The old import paths still work but emit ``DeprecatedImportWarning``
to guide migration. The backward compatibility layer will be removed in Airflow 4.
What changed:

- Serde logic moved from the ``airflow-core`` to the ``task-sdk`` package.
- Serializer modules moved from ``airflow.serialization.serializers.*`` to ``airflow.sdk.serde.serializers.*``.
- New serializers live under the ``airflow.sdk.serde.serializers.*`` namespace.

Code interface changes:

- Import serializers from ``airflow.sdk.serde.serializers.*`` instead of ``airflow.serialization.serializers.*``.
- Import serde functions from ``airflow.sdk.serde`` instead of ``airflow.serialization.serde``.

Backward compatibility:

- Imports from ``airflow.serialization.serializers.*`` continue to work with deprecation warnings.

Migration:

- Update imports to ``airflow.sdk.serde.serializers.*``.
- Add custom serializers under the ``airflow.sdk.serde.serializers.*`` namespace (e.g., create ``task-sdk/src/airflow/sdk/serde/serializers/your_serializer.py``).

Methods removed from PriorityWeightStrategy
""""""""""""""""""""""""""""""""""""""""""""
On the (experimental) class ``PriorityWeightStrategy``, the methods ``serialize()``
and ``deserialize()`` were never used anywhere and have been removed. They
should not be relied on in user code. (#59780)
Methods removed from TaskInstance
"""""""""""""""""""""""""""""""""
On class ``TaskInstance``, the methods ``run()``, ``render_templates()``,
``get_template_context()``, and private members related to them have been
removed. The class has been considered internal since 3.0, and should not be
relied on in user code. (#59780, #59835)
Modify the information returned by DagBag
"""""""""""""""""""""""""""""""""""""""""""""
New behavior:

- ``DagBag`` now uses ``Path.relative_to`` for consistent cross-platform behavior.
- ``FileLoadStat`` now has two additional nullable fields: ``bundle_path`` and ``bundle_name``.

Backward compatibility:
``FileLoadStat`` will no longer produce paths beginning with ``/`` that were meant to be read as relative to the dags folder.
This is a breaking change for any custom code that performs string-based path manipulation relying on that behavior.
Users are advised to update such code to use ``pathlib.Path``. (#59785)
Remove --conn-id option from airflow connections list
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The redundant ``--conn-id`` option has been removed from the ``airflow connections list`` CLI command.
Use ``airflow connections get`` instead. (#59855)
Add operator-level render_template_as_native_obj override
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Operators can now override the Dag-level ``render_template_as_native_obj`` setting,
enabling fine-grained control over whether templates are rendered as native Python
types or strings on a per-task basis. Set ``render_template_as_native_obj=True`` or
``False`` on any operator to override the Dag setting, or leave it as ``None`` (the
default) to inherit from the Dag.
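A sketch of the override (``SomeOperator`` and its ``my_param`` templated field are hypothetical stand-ins for any operator with a templated field):

.. code-block:: python

    # The Dag renders templates as strings (the default), but this one
    # task asks Jinja for native Python objects instead.
    with DAG("example_native_override", render_template_as_native_obj=False):
        SomeOperator(
            task_id="native_task",
            my_param="{{ [1, 2, 3] }}",          # rendered as a real list here
            render_template_as_native_obj=True,  # overrides the Dag setting
        )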
Add gunicorn support for API server with zero-downtime worker recycling
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The API server now supports gunicorn as an alternative server with rolling worker restarts to prevent memory accumulation in long-running processes.
Key benefits:

- **Rolling worker restarts**: New workers spawn and pass health checks before old workers are killed, ensuring zero downtime during worker recycling.
- **Memory sharing**: Gunicorn uses preload + fork, so workers share memory via copy-on-write. This significantly reduces total memory usage compared to uvicorn's multiprocess mode, where each worker loads everything independently.
- **Correct FIFO signal handling**: Gunicorn's ``SIGTTOU`` kills the oldest worker (FIFO), not the newest (LIFO), which is correct for rolling restarts.
Configuration:
.. code-block:: ini

    [api]
    # Use gunicorn instead of uvicorn
    server_type = gunicorn

    # Enable rolling worker restarts every 12 hours
    worker_refresh_interval = 43200

    # Restart workers one at a time
    worker_refresh_batch_size = 1
Or via environment variables:
.. code-block:: bash

    export AIRFLOW__API__SERVER_TYPE=gunicorn
    export AIRFLOW__API__WORKER_REFRESH_INTERVAL=43200
Requirements:
Install the gunicorn extra: ``pip install 'apache-airflow-core[gunicorn]'``
Note on uvicorn (default):
The default uvicorn mode does not support rolling worker restarts. If you need worker recycling or memory-efficient multi-worker deployment, use gunicorn. (#60921)
Improved performance of rendered task instance fields cleanup for Dags with many mapped tasks (~42x faster)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The config ``max_num_rendered_ti_fields_per_task`` is renamed to ``num_dag_runs_to_retain_rendered_fields``
(the old name still works with a deprecation warning).
Retention is now based on the N most recent Dag runs rather than the N most recent task executions, which may result in fewer records retained for conditional/sparse tasks. (#60951)
AuthManager Backfill permissions are now handled by the requires_access_dag on the DagAccessEntity.Run
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
``is_authorized_backfill`` of the ``BaseAuthManager`` interface has been removed. Core will no longer call this method, and
its provider counterpart implementations will be marked as deprecated.
Permissions for backfill operations are now checked against the ``DagAccessEntity.Run`` permission using the existing
``requires_access_dag`` decorator. In other words, if a user has permission to run a Dag, they can perform backfill operations on it.
Please update your security policies to ensure that users who need to perform backfill operations have the appropriate
``DagAccessEntity.Run`` permissions. (Users with Backfill permissions but without DagRun permissions will no longer be able
to perform backfill operations until their roles are updated.)
Python 3.14 support added
"""""""""""""""""""""""""
Airflow 3.2.0 adds support for Python 3.14. (#63787)
Reduce API server memory by eliminating SerializedDAG loads on task start
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The API server no longer loads the full SerializedDAG when starting tasks,
significantly reducing memory usage. (#60803)
Remove MySQL client from container images
"""""""""""""""""""""""""""""""""""""""""
MySQL client support has been removed from official Airflow container images. MySQL users building on official images must install the client themselves. (#57146)
Add support for async callables in PythonOperator
"""""""""""""""""""""""""""""""""""""""""""""""""
The ``PythonOperator`` parameter ``python_callable`` now also supports async callables in Airflow 3.2,
allowing users to run ``async def`` functions without manually managing an event loop. (#60268)
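A minimal sketch (the import path shown is the standard provider's; adjust it to your installation):

.. code-block:: python

    import asyncio

    from airflow.providers.standard.operators.python import PythonOperator


    async def fetch_data():
        # Airflow runs this coroutine for you; no manual event loop needed
        await asyncio.sleep(1)
        return "done"


    fetch = PythonOperator(task_id="fetch_data", python_callable=fetch_data)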
Make start_date optional for @continuous schedule
"""""""""""""""""""""""""""""""""""""""""""""""""
The ``schedule="@continuous"`` parameter now works without requiring a ``start_date``, and any Dags with this schedule will begin running immediately when unpaused. (#61405)
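For example (illustrative sketch; ``@continuous`` schedules require ``max_active_runs=1``):

.. code-block:: python

    from airflow.sdk import dag


    @dag(schedule="@continuous", max_active_runs=1)  # no start_date needed in 3.2+
    def always_on():
        ...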
Upgrade SQLAlchemy (SQLA) to 2.0
"""""""""""""""""""""""""""""""""

Airflow now supports only SQLAlchemy version 2.
New Features
^^^^^^^^^^^^
- ``PYTHON_LTO`` build argument (#58337)
- ``--queues`` CLI option for the trigger command (#59239)
- ``--show-values`` and ``--hide-sensitive`` flags to CLI ``connections list`` and ``variables list`` to hide sensitive values by default (#62344)
- ``AIRFLOW__SECRETS__BACKEND_KWARG__<KEY>`` environment variables (#63312)
- ``only_new`` parameter to Dag clear to only clear newly added task instances (#59764)
- ``log_timestamp_format`` config option for customizing component log timestamps (#63321)
- ``--action-on-existing-key`` option to ``pools import`` and ``connections import`` CLI commands (#62702)
- ``--use-migration-files`` flag for ``airflow db init`` (#62234)
- ``AllowedKeyMapper`` for partition key validation in asset partitioning (#61931)
- ``ChainMapper`` for chaining multiple partition mappers (#64094)
- ``AgenticOperator`` (#63081)
- ``@task.stub`` decorator to allow tasks in other languages to be defined in Dags (#56055)
- ``TriggerDagRunOperator`` (#60810)
- ``allowed_run_types`` to whitelist specific Dag run types (#61833)
- ``dag_id`` and ``dag_run_id`` in bulk task instance endpoint (#57441)
- ``operator_name_pattern``, ``pool_pattern``, ``queue_pattern`` as task instance search filters (#57571)
- ``update_mask`` support for bulk PATCH APIs (#54597)
- ``source`` parameter to ``Param`` (#58615)
- ``TaskInstance`` on ``RuntimeTaskInstance`` (#59712)
- ``max_trigger_to_select_per_loop`` config for Triggerer HA setup (#58803)
- ``uvicorn_logging_level`` config option to control API server access logs (#56062)
- ``executor.running_dags`` gauge metric to expose count of running Dags (#52815)
- ``GitDagBundle`` (#59911)
- ``GitHook`` for Dag bundles (#58194)
- ``RemoteIO`` for ``ObjectStorage`` (#54813)
- ``--dev`` flag (#57741)
- ``auth list-envs`` command to list CLI environments and auth status (#61426)
- ``airflow info`` command output (#59124)
- ``db_clean`` to explicitly include or exclude Dags (#56663)
- ``TaskStreamFilter`` (#60549)
- ``globalCss`` in custom themes (#61161)
- ``run_after`` date filter on Dag runs page (#62797)
- ``AIRFLOW__API__THEME`` in addition to brand (#64232)

Bug Fixes
^^^^^^^^^
- non-sensitive-only value as ``True`` (#59880)
- ``InvalidStatsNameException`` for pool names with invalid characters by auto-normalizing them when emitting metrics (#59938)
- ``AIRFLOW__API__BASE_URL`` basename is configured (#63141)
- ``/mapped`` to group URLs (#63205)
- ``ti_skip_downstream`` overwriting RUNNING tasks to SKIPPED in HA deployments (#63266)
- ``@task`` decorator failing for tasks that return falsy values like ``0`` or empty string (#63788)
- ``LatestOnlyOperator`` not working when direct upstream of a dynamically mapped task (#62287)
- XCom return type in mapped task groups with dynamic mapping (#59104)
- ``DagRun`` span emission crash when ``context_carrier`` is ``None`` (#64087)
- ``next_dagrun`` fields are ``None`` (#63962)
- ``relativedelta`` (#61671)
- ``task_instance_mutation_hook`` receiving ``run_id=None`` during ``TaskInstance`` creation (#63049)
- ``None`` ``dag_version`` access (#62225)
- ``MetastoreBackend.expunge_all()`` corrupting shared session state (#63080)
- airflowignore negation pattern handling for directory-only patterns (#62860)
- ``TYPE_CHECKING``-only forward references in TaskFlow decorators (#63053)
- structlog JSON serialization crash on non-serializable objects (#62656)
- ``queued_tasks`` type mismatch in hybrid executors (``CeleryKubernetesExecutor``, ``LocalKubernetesExecutor``) (#63744)
- ``pathlib.Path`` objects incorrectly resolved by Jinja templater in Task SDK (#63306)
- ``make_partial_model`` for API Pydantic models (#63716)
- ``_execution_api_server_url()`` ignoring configured value and falling back to edge config (#63192)
- ``DetachedInstanceError`` for ``airflow tasks render`` command (#63916)
- ``@task`` definition causing Dag parsing errors (#62174)
- ``limit`` parameter not sent in ``execute_list`` server requests (#63048)
- ``airflow.configuration`` causing ``ImportError`` on Python 3.14 (#63787)
- ``map_index`` range validation in CLI commands (#62626)
- ``FabAuthManager`` race condition on startup with multiple workers (#62737)
- ``FabAuthManager`` race condition when workers concurrently create permissions, roles, and resources (#63842)
- ``JWTValidator`` not handling GUESS algorithm with JWKS (#63115)
- ``FabAuthManager`` first idle MySQL disconnect in token auth (#62919)
- ``JWTBearerTIPathDep`` import errors in Human-In-The-Loop routes (#63277)
- ``log_pos`` (#63531)
- null ``dag_run_conf`` causing serialization error in ``BackfillResponse`` (#63259)
- savepoints with per-Dag transactions (#63591)
- ``deadline.callback_id`` (#63612)
- ``interval`` causing query failures in ``deadline_alert`` (#63494)
- ``serialize_dag`` query failure during deadline migration (#63804)
- ``visibility_timeout`` that kills long-running Celery tasks (#62869)
- ``CronPartitionedTimetable`` (#62441)
- ``AssetModel`` when updating asset partition ``DagRun`` (adds mutex lock) (#59183)
- ``auth_manager`` ``load_user`` causing ``PendingRollbackError`` (#61943)
- ``joinedload`` for asset in ``dags_needing_dagruns()`` (#60957)
- ``NotMapped`` exception when clearing task instances with downstream/upstream (#58922)
- ``partition_key`` filter in PALK when creating ``DagRun`` (#61831)
- ``ObjectStoragePath`` to exclude ``conn_id`` from storage options passed to fsspec (#62701)
- ``XComObjectStorageBackend`` (#55805)
- ``default_email_on_failure``/``default_email_on_retry`` from config (#59912)
- ``TaskInstance.get_dagrun`` returning ``None`` in ``task_instance_mutation_hook`` (#60726)
- ``dag_display_name`` property bypass for DagStats query (#64256)
- ``TaskAlreadyRunningError`` not raised when starting an already-running task instance (#60855)
- ``enable_swagger_ui`` config not respected in API server (#64376)
- ``conf.has_option`` not respecting default provider metadata (#64209)
- ``TaskInstance`` crash when refreshing task weight for non-serialized operators (#64557)
- XCom edit modal value not repopulating on reopen (#62798)
- ``RenderedJsonField`` collapse behavior (#63831)
- ``RenderedJsonField`` not displaying in table cells (#63245)
- ``-1``) slots not rendering correctly (#62831)
- ``DurationChart`` labels and disable animation flicker during auto-refresh (#62835)
- ``/dagruns`` page not working (#62848)
- ``total_received`` count in partitioned Dag runs view (#62786)
- ``RenderedJsonField`` flickering when collapsed (#64261)
- ``TISummaries`` not refreshing when ``gridRuns`` are invalidated (#64113)
- ``DagRun`` window (#64179)

Miscellaneous
^^^^^^^^^^^^^
- ``sqlalchemy[asyncio]>=2.0.48``
- ``api.page_size`` config in favor of ``api.fallback_page_limit`` (#61067)
- ``get_dag_runs`` API endpoint performance (#63940)
- ``filter_authorized_dag_ids`` (#63184)
- ``gc.freeze`` (#62212)
- ``get_task_instances`` endpoint (#62910)
- ``IN`` clause in asset queries with CTE and JOIN for better SQL performance (#62114)
- ``ConnectionResponse`` serializer safeguard to prevent accidental sensitive field exposure (#63883)
- ``dag_id`` filter on ``DagRun`` task instances API query (#62750)
- ``expose_stacktrace`` is disabled (#63028)
- ``update_mask`` fields in PATCH API endpoints against Pydantic models (#62657)
- ``order_by`` parameter to ``GET /permissions`` endpoint for pagination consistency (#63418)
- ``BaseXcom`` to ``airflow.sdk`` public exports (#63116)
- TaskSDK ``conf`` respect default config from provider metadata (#62696)
- ``airflow.sdk.observability.trace`` (#63554)
- ``external_executor_id`` on PostgreSQL (#63625)
- ``DagRun.created_at`` during migration for faster upgrades (#63825)
- ``ExecuteCallback`` by including ``dag_id`` and ``run_id`` (#62616)
- ``get_connection_form_widgets`` and ``get_ui_field_behaviour`` hook methods (#63711)
- ``[workers]`` config section (#63659)
- ``TaskInstance`` API for external task management (#61568)
- ``airflow.datasets``, ``airflow.timetables.datasets``, and ``airflow.utils.dag_parsing_context`` modules (#62927)
- ``PyOpenSSL`` from core dependencies (#63869)
- ``SerializedDAG`` (#56694)
- ``pop(0)`` to ``popleft()`` (#61376)
- ``.git`` folder from versions in ``GitDagBundle`` to reduce storage size (#57069)
- ``airflow.utils.process_utils`` (#57193)
- ``KubernetesPodOperator`` handling of deleted pods between polls (#56976)
- ``ToXXXMapper`` to ``StartOfXXXMapper`` in partition-mapper for clarity (#64160)
- ``partition_date`` (#62866)
- ``AIRFLOW__API__THEME`` config (#64232)
- ``DeadlineReferences`` (#57222)
- ``FilterBar`` with ``DateRangeFilter`` for compact UI (#56173)

Doc Only Changes
^^^^^^^^^^^^^^^^
- ``RedisTaskHandler`` configuration example (#63898)
- ``_shared`` folders (#63468)
- ``modules_management`` docs (#63634)
- ``max_active_tasks`` Dag parameter documentation (#63217)
- ``GitHook`` parameters (#63265)
- ``example_bash_decorator`` (#62948)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Backfill permissions are now handled via DagAccessEntity.Run (#61456)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
``is_authorized_backfill`` of the ``BaseAuthManager`` interface has been removed. Core will no longer call this method, and
its provider counterpart implementations will be marked as deprecated.
Permissions for backfill operations are now checked against the ``DagAccessEntity.Run`` permission using the existing
``requires_access_dag`` decorator. In other words, if a user has permission to run a DAG, they can perform backfill operations on it.
Please update your security policies to ensure that users who need to perform backfill operations have the appropriate
``DagAccessEntity.Run`` permissions. (Users with Backfill permissions but without DagRun permissions will no longer be able
to perform backfill operations until their roles are updated.)
Elasticsearch is now fully compatible with remote logging (#62940)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Elasticsearch is now fully compatible with remote logging alongside ``apache-airflow-providers-elasticsearch>=6.5.0``. Please review the elasticsearch provider release notes for more information: https://airflow.apache.org/docs/apache-airflow-providers-elasticsearch/6.5.0/changelog.html (#62121) (#62940)
Bug Fixes
^^^^^^^^^
- ``disable_sqlite_fkeys`` in revision ``509b94a1042d`` (#63256) (#63272)
- ``useAssetServiceGetDagAssetQueuedEvents`` to get the correct number of ADRQs (#62868) (#62902)
- ``dag_processing.total_parse_time`` metric (#62128) (#62764)
- ``timer.duration`` unit labels in logs (#61824) (#62757)
- ``dag_bundle.signed_url_template`` from ``varchar(200)`` to ``text`` (#61041) (#62568)
- ``PYTHONASYNCIODEBUG=1`` is set (#61281) (#61933)
- ``pendulum.date.Date`` values (#61176) (#61717)
- ``access_key`` and ``connection_string`` not being masked in logs (#61580) (#61582)
- minimatch ReDoS vulnerabilities via pnpm overrides (#62805)
- ``elk.portConstraints`` for LR orientation in graph view (#62144) (#62187)

Miscellaneous
^^^^^^^^^^^^^
- ``run_after`` alias to ``XComResponse`` for backward compatibility (#61443) (#61672)

Doc Only Changes
^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
^^^^^^^^^
- ``TriggerDagRunOperator`` deferring when ``wait_for_completion=False`` (#60052)
- ``gc.freeze`` (#60505) (#60845)
- ``externalLogUrl`` (#60412) (#60479)
- buttongroups (#60298) (#60337)
- viewport height (#59660) (#60286)

Miscellaneous
^^^^^^^^^^^^^
- ``is_default_pool`` in Pool model (#61084) (#61128)
- Translation updates: Taiwanese Mandarin (#61126), Catalan (#61093), German (#61097), Polish (#61099), Arabic (#60635, #60782), Spanish (#60775, #60785), Hebrew (#60633, #60686)

Doc Only Changes
^^^^^^^^^^^^^^^^
Significant Changes
^^^^^^^^^^^^^^^^^^^
is_authorized_hitl_task() method now available in auth managers (#59399)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

This method is now available in auth managers to check whether a user is authorized to approve a HITL task.
proxy and proxies added to DEFAULT_SENSITIVE_FIELDS (#59688)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

``proxy`` and ``proxies`` have been added to ``DEFAULT_SENSITIVE_FIELDS`` in ``secrets_masker`` to treat proxy configurations as sensitive by default.
Bug Fixes
^^^^^^^^^
- ``deprecated_options`` entry for ``dag_file_processor_timeout`` (#59181) (#60162)
- ``ApprovalOperator`` with ``SimpleAuthManager`` when ``all_admins=True`` (#59399) (#60116)
- ``ti_failure`` metrics for tasks (#59731) (#59964)
- ``TaskInstanceHistory`` on scheduler TI resets (#59639) (#59752)
- ``proxy`` and ``proxies`` as sensitive fields in ``DEFAULT_SENSITIVE_FIELDS`` (#59688) (#59792)
- ``[webserver] base_url`` (#59659) (#59781)
- ``DagRunContext`` (#59714) (#59732)
- ``Content-Type`` to request headers in Task SDK calls when missing (#59676) (#59687)
- ``_read_from_logs_server`` when ``status_code`` is 403 (#59489) (#59504)
- ``run_on_latest_version`` defaulting to ``False`` instead of ``True`` (#59304) (#59328)
- ``.airflowignore`` negation not working in subfolders (#58740) (#59305)
- ``DagRun.queued_at`` not updating when clearing (#59066) (#59177)

Miscellaneous
^^^^^^^^^^^^^
Doc Only Changes
^^^^^^^^^^^^^^^^
- 0.3.0 (#59538)
- permalink icon (#58763)
- ``get_template_context`` (#59023) (#59036)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
^^^^^^^^^
- ``gc.freeze`` (#58934)
- pre-AIP-39 DAG runs (#58773)
- ``dag.test()`` (#58266)
- ``dayjs`` correctly (#57880)
- ``endDate`` is not null (#58435)
- ``parseStreamingLogContent`` for non-string data (#58399)

Miscellaneous
^^^^^^^^^^^^^
- ``.pyc`` and ``.pyo`` files after building Python (#58947)

Doc Only Changes
^^^^^^^^^^^^^^^^
Significant Changes
^^^^^^^^^^^^^^^^^^^
Fix Connection & Variable access in API server contexts (plugins, log handlers) (#56583)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Previously, hooks used in API server contexts (plugins, middlewares, log handlers) would fail with an ``ImportError``
for ``SUPERVISOR_COMMS``, because ``SUPERVISOR_COMMS`` only exists in task runner child processes.
This has been fixed by implementing automatic context detection with three separate secrets backend chains:
Context Detection:
- ``SUPERVISOR_COMMS`` presence
- ``_AIRFLOW_PROCESS_CONTEXT=server`` environment variable

Backend Chains:

- ``EnvironmentVariablesBackend`` → ``ExecutionAPISecretsBackend`` (routes to the Execution API via ``SUPERVISOR_COMMS``)
- ``EnvironmentVariablesBackend`` → ``MetastoreBackend`` (direct database access)
- ``EnvironmentVariablesBackend`` only (plus external backends from config, such as AWS Secrets Manager or Vault)

The fallback chain is crucial for supervisor processes (worker-side, before the task runner starts), which need to access
external secrets for remote logging setup but should not use MetastoreBackend (to maintain worker isolation).
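The context detection described above can be pictured with a small plain-Python sketch; the chain constants and the ``pick_backend_chain`` helper are illustrative names for this example, not Airflow's actual internals:

```python
import os

# Illustrative chain definitions; the real chains are assembled inside Airflow.
EXECUTION_API_CHAIN = ["EnvironmentVariablesBackend", "ExecutionAPISecretsBackend"]
SERVER_CHAIN = ["EnvironmentVariablesBackend", "MetastoreBackend"]
SUPERVISOR_CHAIN = ["EnvironmentVariablesBackend"]  # plus configured external backends


def pick_backend_chain(has_supervisor_comms: bool) -> list:
    """Choose a secrets backend chain based on the process context."""
    if has_supervisor_comms:
        # Task runner child process: route lookups through the Execution API.
        return EXECUTION_API_CHAIN
    if os.environ.get("_AIRFLOW_PROCESS_CONTEXT") == "server":
        # API server context (plugins, log handlers): direct DB access is allowed.
        return SERVER_CHAIN
    # Supervisor fallback (worker-side): never touch the metastore.
    return SUPERVISOR_CHAIN
```

The key property is that the supervisor branch never returns a chain containing ``MetastoreBackend``, preserving worker isolation.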
Architecture Benefits:
- ``MetastoreBackend``, maintaining strict isolation

Impact:

- ``GCSHook`` and ``S3Hook`` now work correctly in log handlers and plugins

See: `#56120 <https://github.com/apache/airflow/issues/56120>`__, `#56583 <https://github.com/apache/airflow/issues/56583>`__, `#51816 <https://github.com/apache/airflow/issues/51816>`__
Remove insecure dag reports API endpoint that executed user code in API server (#56609)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The ``/api/v2/dagReports`` endpoint has been removed because it loaded user DAG files directly in the API server process,
violating Airflow's security architecture. This endpoint was not used in the UI and had no known consumers.
Use the ``airflow dags report`` CLI command instead for DAG loading reports.
Bug Fixes
^^^^^^^^^

- ``healthcheck`` timeout not respecting ``worker-timeout`` CLI option (#57731) (#57854)

Miscellaneous
^^^^^^^^^^^^^

Doc Only Changes
^^^^^^^^^^^^^^^^

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes ^^^^^^^^^
- ``DagProcessorManager`` for bundle initialization (#57459)
- ``triggering_user_name`` context variable (#56193)
- ``ObjectStoragePath`` (#57156)
- ``default_args`` (#57397)

Miscellaneous
^^^^^^^^^^^^^

- XCom viewer and standardize task instance columns (#57447)
- ``retryhttp`` to ``tenacity`` library (#56762)
- ``Content-Type`` header to Task SDK API requests (#57386)
- ``task_display_name`` alias in event log API responses (#57609)

Doc Only Changes
^^^^^^^^^^^^^^^^

- ``instance_name`` in UI docs (#57523)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
^^^^^^^^^

- ``dag_run.conf`` during upgrades from earlier versions (#56729)
- ``retry_delay`` is ``None`` (#56236)
- ``generate_run_id`` not called for manual triggers (#56699)
- ``KeyError`` when accessing ``retry_delay`` on ``MappedOperator`` without explicit value (#56605)
- task-sdk connection error handling to match airflow-core behavior (#56653)
- ``get_ti_count`` and ``get_task_states`` access in callback requests (#56860)
- Connection or Variable access in Server context (#56602)
- ``.airflowignore`` order precedence (#56832)
- ``--dag_run_conf`` in ``airflow dags backfill`` CLI (#56599)
- ``'root'`` causes blue screen on hover (#56926)
- Day-of-Month and Day-of-Week conflicts (#56255)
- ``SerializedDagModel`` query optimization (#56938)
- ``url_prefix`` (#55262)
- ``max_retry_delay`` to ``MappedOperator`` model (#56951)
- ``@asset`` decorator when fetching the asset (#56611)

Miscellaneous
^^^^^^^^^^^^^

- ``DISTINCT`` for ``dag_version_id`` lookup (#56565)
- ``PodGenerator`` for deserialization (#56733)
- ``action_on_existence`` (#56672)
- ``CreateAssetEventsBody`` to Pydantic v2 ``ConfigDict`` (#56772)
- ``active_runs_limit`` check (#56922)
- ``is_favorite`` to UI dags list (#56341)
- ``executor``, ``hostname``, and ``queue`` columns to TaskInstances page (#55922)
- XComs page (#56285)
- ``ndjson`` (#56480)
- ``sla_miss_callback`` (#56127)
- ``natsort`` dependency to airflow-core (#56582)
- ``babel`` dependency in Task SDK (#56592)
- ``dagReports`` API endpoint (#56621)

Doc Only Changes
^^^^^^^^^^^^^^^^

- ``triggering_asset_event`` retrieval documentation in DAGs (#56957)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Human in the Loop (HITL)
""""""""""""""""""""""""
Airflow 3.1 introduces :doc:`Human-in-the-Loop (HITL) </tutorial/hitl>` functionality that enables
workflows to pause and wait for human decision-making. This powerful feature is particularly valuable for
AI/ML workflows, content moderation, and approval processes where human judgment is essential.
HITL tasks pause execution in a deferred state while waiting for human input via the Airflow UI. Users
with appropriate roles can see pending tasks, review context (including XCom data and DAG parameters), and
complete actions through intuitive web forms. The feature also supports API-driven interactions for custom
UIs and notification integration.
For detailed usage instructions, see :doc:`/tutorial/hitl`.
Note: HITL operators require the ``apache-airflow-providers-standard`` package and Airflow 3.1+.
Task SDK Decoupling for Independent Upgrades
""""""""""""""""""""""""""""""""""""""""""""
Airflow 3.1 advances the decoupling of the Task SDK from Airflow Core through improved DAG serialization with versioned contracts. While complete code separation is planned for Airflow 3.2.0, the serialization foundation enables independent upgrades when components are deployed separately.
For DAG Authors: Import constructs from the ``airflow.sdk`` namespace:

- ``from airflow.sdk import DAG, task, asset``

For Platform Teams: Foundation for independent upgrades:
For technical details on the serialization contract, see :doc:`/administration-and-deployment/dag-serialization`.
Deadline Alerts
"""""""""""""""
Deadline Alerts provide proactive monitoring for DAG execution by automatically triggering notifications when time thresholds are exceeded. This helps ensure SLA compliance and timely completion of critical workflows.
Configure deadline monitoring by specifying:
Example use cases:
Current Limitations: Deadline Alerts currently support only asynchronous callbacks (``AsyncCallback``).
Support for synchronous callbacks (``SyncCallback``) is planned for a future release.
For configuration details and examples, see :doc:`/howto/deadline-alerts`.
.. warning::
Deadline Alerts are experimental in 3.1 and may change in future versions based on user feedback.
UI Internationalization
"""""""""""""""""""""""
Airflow 3.1 delivers comprehensive internationalization (i18n) support, making the web interface
accessible to users worldwide. The React-based UI now supports 17 languages with robust translation infrastructure.
Supported Languages:
The translation system includes automated completeness checking and clear contribution guidelines for community translators.
React Plugin System (AIP-68)
""""""""""""""""""""""""""""
Airflow 3.1 introduces a modern plugin architecture enabling rich integrations through React components and external views. This extensibility framework allows organizations to embed custom dashboards, monitoring tools, and domain-specific interfaces directly within the Airflow UI.
New Plugin Capabilities:
Developer Experience:
- ``airflow-react-plugin`` dev tools

This plugin system replaces legacy Flask-based approaches with modern web standards, improving performance, maintainability, and user experience.
For more details and examples, see :doc:`/howto/custom-view-plugin`.
Enhanced UI Views and Filtering
"""""""""""""""""""""""""""""""
Airflow 3.1 brings significant UI improvements including rebuilt Calendar and Gantt chart views for the modern React UI, comprehensive filtering capabilities, and a refreshed visual design system.
Visual Design Improvements
The UI now features an updated color palette leveraging Chakra UI semantic tokens, providing better consistency, accessibility, and theme support across the interface. This modernization improves readability and creates a more cohesive visual experience throughout Airflow.
Rebuilt Views and Enhanced Filtering
The Calendar and Gantt views from Airflow 2.x have been rebuilt for the modern React UI, along with enhanced filtering capabilities across all views. These improvements provide better performance and a more consistent user experience with the rest of the modern Airflow interface.
DAG Dashboard Organization
Users can now pin and favorite DAGs for better dashboard organization, making it easier to find and prioritize frequently used workflows. This feature is particularly valuable for teams managing large numbers of DAGs, providing quick access to critical workflows without searching through extensive DAG lists.
Inference Execution (Synchronous DAGs)
""""""""""""""""""""""""""""""""""""""
Airflow 3.1 introduces a new streaming API endpoint that allows applications to watch DAG runs until completion, enabling more responsive integration patterns for real-time and inference workflows.
New Streaming Endpoint:
The ``/dags/{dag_id}/dagRuns/{dag_run_id}/wait`` endpoint repeatedly emits JSON updates at the specified interval until the DAG run reaches a finished state.
.. code-block:: bash
# Watch a DAG run with 2-second polling interval, including XCom results
curl -X GET "http://localhost:8080/api/v2/dags/ml_pipeline/dagRuns/manual_2024_01_15/wait?interval=2&result=inference_task" \
-H "Accept: application/x-ndjson"
This enables use cases such as synchronous inference execution, where a client triggers a DAG run and blocks until its result is available.
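A client can consume the NDJSON stream with nothing more than the standard library; the payload below is simulated, and its single ``state`` field is illustrative rather than the endpoint's exact response schema:

```python
import json

# Simulated NDJSON lines, as the /wait endpoint would stream them.
stream = [
    '{"state": "running"}',
    '{"state": "running"}',
    '{"state": "success"}',
]


def watch(lines):
    """Yield parsed updates until a terminal state is seen."""
    for line in lines:
        update = json.loads(line)
        yield update
        if update["state"] in ("success", "failed"):
            break


updates = list(watch(stream))
print(updates[-1]["state"])  # -> success
```

In a real client, `lines` would be the response body of the HTTP request, read line by line.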
New Trigger Rule: ALL_DONE_MIN_ONE_SUCCESS
""""""""""""""""""""""""""""""""""""""""""""""
``ALL_DONE_MIN_ONE_SUCCESS``: This rule triggers when all upstream tasks are done (success or failed) and
at least one has succeeded, filling a gap between existing trigger rules for complex workflow patterns.
Skipped upstream tasks behave as usual: they cause the downstream task to be skipped.
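The rule can be sketched as a simple predicate over upstream terminal states (a simplification of the scheduler's real trigger-rule evaluation, which also handles skip propagation and mapped tasks):

```python
def all_done_min_one_success(upstream_states):
    """True when every upstream task is done (success or failed)
    and at least one of them succeeded."""
    all_done = all(state in ("success", "failed") for state in upstream_states)
    return all_done and "success" in upstream_states


print(all_done_min_one_success(["success", "failed"]))   # -> True
print(all_done_min_one_success(["failed", "failed"]))    # -> False
print(all_done_min_one_success(["success", "running"]))  # -> False
```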
Enhanced DAG Processing Visibility
""""""""""""""""""""""""""""""""""
DAG parsing duration is now exposed in the UI, providing better visibility into DAG processing performance and helping identify parsing bottlenecks. This information is displayed alongside other DAG metadata to assist with performance optimization.
Python 3.13 support added & 3.9 support removed
"""""""""""""""""""""""""""""""""""""""""""""""
Support for Python 3.9 has been removed, as it has reached end-of-life. Airflow 3.1.0 requires Python 3.10, 3.11, 3.12 or 3.13.
Configuration Changes and Cleanup
"""""""""""""""""""""""""""""""""
Webserver Configuration Reorganization
Several webserver configuration options have been moved to the ``api`` section for better organization:

- ``[webserver] log_fetch_timeout_sec`` → ``[api] log_fetch_timeout_sec``
- ``[webserver] hide_paused_dags_by_default`` → ``[api] hide_paused_dags_by_default``
- ``[webserver] page_size`` → ``[api] page_size``
- ``[webserver] default_wrap`` → ``[api] default_wrap``
- ``[webserver] require_confirmation_dag_change`` → ``[api] require_confirmation_dag_change``
- ``[webserver] auto_refresh_interval`` → ``[api] auto_refresh_interval``

Unused configuration options have been removed:

- ``[webserver] instance_name_has_markup``
- ``[webserver] warn_deployment_exposure``

API Server Logging Configuration
The API server configuration option ``[api] access_logfile`` has been replaced with ``[api] log_config`` to align with uvicorn's logging configuration. The new option accepts a path to a logging configuration file compatible with ``logging.config.fileConfig``, providing more flexible logging configuration.
Security Improvement: XCom Deserialization
The ``enable_xcom_deserialize_support`` configuration option has been removed as a security improvement. This option previously allowed deserializing unknown objects in the API, which posed a security risk due to potential remote code execution vulnerabilities when deserializing arbitrary Python objects.
The XCom display improvements now handle showing non-native XComs (like custom objects, Assets, datetime objects) in a human-readable way through safer methods that don't require deserializing unknown objects in the API server. This provides better user experience when viewing XCom data in the Airflow UI while eliminating the security risk.
API Changes
"""""""""""
Asset API Key Rename
The ``consuming_dags`` key in asset API responses has been renamed to ``scheduled_dags`` to better reflect its purpose. This key contains only DAGs that use the asset in their ``schedule`` argument, not all DAGs that technically use the asset.
Task SDK Interface Changes
""""""""""""""""""""""""""
Removed Functions
The following functions have been removed from the task-sdk (``airflow.sdk.definitions.taskgroup``) and moved to server-side API services:

- ``get_task_group_children_getter``
- ``task_group_to_dict``

These functions are now internal to Airflow's API layer and should not be imported directly by users.
Reduce default API server workers to 1
""""""""""""""""""""""""""""""""""""""
The default number of API server workers (``[api] workers``) has been reduced from 4 to 1.
With FastAPI, sync code runs in external thread pools, making multiple workers within a single process less necessary. Additionally, with uvicorn's spawn behavior instead of fork, there is no shared copy-on-write memory between workers, so horizontal scaling with multiple API server instances is now the recommended approach for better resource utilization and fault isolation.
A good starting point for the number of workers is to set it to the number of CPU cores available. If you do have multiple CPU cores available for the API server, consider deploying multiple API server instances instead of increasing the number of workers.
Airflow now uses `structlog <https://www.structlog.org/en/stable/>`_ everywhere
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Most users should not notice the difference, but it is now possible to emit structured log key/value pairs from tasks.
If your class subclasses ``LoggingMixin`` (which ``BaseHook`` and ``BaseOperator`` do -- i.e. all hooks and
operators) then ``self.log`` is now a structlog logger.
The advantage of structured logging is that it is much easier to find specific information about a log
message, especially when using a central store such as OpenSearch, Elasticsearch, Splunk, etc.
You don't have to make any changes, but you can now take advantage of this.
.. code-block:: python
# Inside a Task/Hook etc.
# Before:
# self.log.info("Registering adapter %r", item.name)
# Now:
self.log.info("Registering adapter", name=item.name)
This will produce a log that (in the UI) will look something like this::
[2025-09-16 10:36:13] INFO - Registering adapter name="adapter1"
or in JSON (i.e. the log files on disk)::
{"timestamp": "2025-09-16T10:36:13Z", "log_level": "info", "event": "Registering adapter", "name": "adapter1"}
You can also use structlog loggers at the top level of modules; both structlog and stdlib loggers continue to work:
.. code-block:: python
import logging
import structlog
log1 = logging.getLogger(__name__)
log2 = structlog.get_logger(__name__)
log1.info("Loading something from %s", __name__)
log2.info("Loading something", source=__name__)
(You can't add arbitrary key/value pairs to stdlib loggers, but the normal percent-formatter approaches still work fine.)
Serialization Interface Changes
"""""""""""""""""""""""""""""""
The deserializer interface in airflow.serialization.serializers has changed for improved security.
Before 3.1.0:

.. code-block:: python

    def deserialize(classname: str, version: int, data: Any)

Starting with 3.1.0:

.. code-block:: python

    def deserialize(cls: type, version: int, data: Any)
The class loading is now handled in ``serde.py``, and the deserializer receives the loaded class directly rather than a classname string.
This update avoids the use of ``import_string`` in the deserializer, making deserialization more secure.
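To illustrate the new contract, here is a minimal deserializer written against the 3.1.0 signature; the ``Coordinate`` class and its serialized layout are hypothetical and only stand in for a real serializer module:

```python
from typing import Any


class Coordinate:
    """Hypothetical user type with a simple serialized form."""

    def __init__(self, x: float, y: float) -> None:
        self.x = x
        self.y = y


def deserialize(cls: type, version: int, data: Any):
    # serde.py has already resolved and validated the class,
    # so there is no import_string call here.
    if version > 1:
        raise TypeError(f"Unsupported serialization version: {version}")
    if cls is Coordinate:
        return cls(x=data["x"], y=data["y"])
    raise TypeError(f"No deserializer for {cls.__name__}")


point = deserialize(Coordinate, 1, {"x": 1.0, "y": 2.0})
```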
New Features
^^^^^^^^^^^^

- SQLAlchemy 2.0 support with various compatibility fixes for Python 3.13 (#52233, #52518, #54940)
- ``psycopg3`` postgres driver (#52976)
- XCom browsing with filtering and improved navigation (#54049)
- (``HITLOperator``, ``ApprovalOperator``, ``HITLEntryOperator``) for human decision workflows (#52868)
- ``has_import_errors`` filter to Core API ``GET /dags`` endpoint (#54563)
- ``/plugins`` API with warnings for invalid plugins (#55673)
- ``dag_display_name`` aliases for improved API consistency (#50332, #50065, #50014, #49933, #49641)
- XCom validation to prevent empty keys in ``XCom.set()`` and ``XCom.get()`` operations (#46929)
- ``iframe_views`` to backend plugin support (#51003)

Bug Fixes
^^^^^^^^^

- ``QUEUED`` runs with null ``start_date`` (#52668)
- ``ti_successes`` and related metrics in Airflow 3.0 Task SDK (#55322)
- ``clearTaskInstances`` API: Restore ``include_past``/``future`` support on UI (#54416)
- XCom access in DAG processor callbacks for notifiers (#55542)
- ``default_timezone`` is not UTC (#54431)
- ``lineno`` of logger calls are present in Task Logs (#55581)
- ``serialized_dag`` table (#54972)
- ``LocalExecutor`` race condition where tasks could start before database state was committed (#56010)

Miscellaneous
^^^^^^^^^^^^^

- ``dag_stale_not_seen_duration`` (#55601, #55684)
- ``dag_id`` column in DAG Runs and Task Instances pages for better navigation (#55648)

Doc Only Changes
^^^^^^^^^^^^^^^^

- ``--preview`` flag from ``ruff check`` instructions for Airflow 3 upgrade path (#55516)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``TriggerDagRunLink`` broken page when clicking "Triggered DAG" button (#54760)
- ``task_queued_timeout`` not working after first DAG run by properly resetting ``queued_by_job_id`` (#54604)
- (``KRB5CCNAME`` env) when running tasks with user impersonation (#54672)
- ``max_active_tasks`` persisting after removal from DAG code (#54639)

Miscellaneous
"""""""""""""

- ``axios`` UI dependency from 1.8.0 to 1.11.0 (#54733)
- ``pluggy`` to 1.6.0 (#54728, #54730)

Doc Only Changes
""""""""""""""""

- ``get_parsing_context`` function (#54802)

This release has been yanked.
Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``get_previous_dagrun`` functionality for task context (#53655)
- ``DetachedInstanceError`` when processing executor events (#54334)
- ``DetachedInstanceError`` when accessing ``DagRun.created_dag_version`` (#54362)
- "/" in the name in the UI (#54268)
- ``extra`` field in connections UI and API (#53963, #54034, #54235)
- ``MappedOperators`` within TaskGroups (#53532)
- XCom backends not being used when ``BaseXCom.get_all()`` is called (#53814)
- ``xcom_pull`` ignoring ``include_prior_dates`` parameter when ``map_indexes`` is not specified (#53809)
- ``AttributeError`` when reading logs for previous task attempts with ``TaskInstanceHistory`` (#54114)
- ``end_date`` and ``duration`` by populating ``TaskInstance`` data before invoking callbacks (#54458)
- ``AssetEvent`` queries in scheduler to maintain consistent event processing order (#52231)
- ``map_index`` for upstream tasks (#54249)

Miscellaneous
"""""""""""""

- ``common.messaging`` to 1.0.3 (#54176)

Doc Only Changes
""""""""""""""""

- ``dag.log`` (#54463)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``DetachedInstanceError`` crashes (#53838) (#53858)
- ``on_kill`` functionality not working when tasks are killed externally in TaskSDK (#53718) (#53832)
- ``task_success_overtime`` configuration option not being configurable (#53342) (#53351)
- ``group_by`` clause in event logs query for performance (#53733) (#53807)
- ``apps`` flags for API server command configuration (#52929) (#53775)
- ``BaseOperator.executor`` attribute (#53496) (#53519)
- ``start_from_trigger`` functionality (#53744) (#53750)

Miscellaneous
"""""""""""""

- ``~=`` used in ``requires-python`` configuration (#52985) (#52987)

Doc Only Changes
""""""""""""""""

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``xcom_pull`` to cover different scenarios for mapped tasks (#51568)
- (``run_as_user``) support for task execution (#51780)
- ``exception`` to context for task callbacks (#52066)
- EOF is missed (#51180) (#51970)
- ``EventsTimetable``'s description during serialization (#51926)
- EOF detection of subprocesses in Dag Processor (#51895)
- ``dag.test`` (#51673)
- ``dag.test`` consistent with ``airflow dags test`` CLI command (#51476)
- No Status Filter (#52154)
- ``MappedOperator`` (#52681)
- ``AssetEventOperations.get`` to use ``alias_name`` when specified (#52324)
- ``start_from_trigger`` is ``True`` (#52873)
- ``example_external_task_parent_deferrable.py`` imports (#52957)
- ``no_status`` and ``duration`` for grid summaries (#53092)
- ``ti.log_url`` not in Task Context (#50376)
- ``XCom.get_all()`` method (#53102)

Miscellaneous
"""""""""""""

- ``connections_test`` CLI to use ``Connection`` instead of ``BaseHook`` (#51834) (#51917)
- ``libcst`` 1.8.1 for Python 3.9 (#51609)

Doc Only Changes
""""""""""""""""

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``sys.path`` in task runner (#51318)
- ``sys.path`` in dag processor (#50385)
- ``SIGSEGV`` signals during DAG file imports (#51171)
- ``dag.test()`` (#51182)
- ``ForwardRef`` error by reordering discriminated union definitions (#50688)
- ``BaseNotifier`` (#50340)
- ``upstream_mapped_index`` when xcom access is needed (#50641)
- ``/run`` API endpoint for older Task SDK clients
- ``FlexibleForm`` component (#50845)
- ``+1 more`` when tags exceed the display limit by one (#50669)
- ``default_args`` handling in operator ``.partial()`` to prevent ``TypeError`` when unused keys are present (#50525)
- ``airflow tasks clear`` command (#49631)
- ``--local`` flag in ``dag list`` and ``dag list-import-errors`` CLI commands (#49380)
- ``DagProcessor`` stats log to show the correct parse duration (#50316)
- ``get_log`` API (#50547)
- ``logical_date`` check when validating inlets and outlets (#51464)
- ti update state and set task to fail if exception encountered (#51295)

Miscellaneous
"""""""""""""

- ``example_dags`` in standard provider to ``example_dags`` in sources (#51275)
- ``airflow-core`` package (#51192)
- ``task.test`` to Task SDK (#50827)
- ``dag.test`` to Task SDK (#50300, #50419)
- ``ti.run`` to Task SDK execution path (#50141)
- ``airflow dags test`` from local files (#50420)
- ``execution_time`` module (#50940)
- ``dagrun`` value for list display (#50834)
- ``secret_key`` config to ``api`` section (#50839)
- ``webserver`` configs to fab provider (#50774, #50269, #50208, #50896)
- ``dag_run`` nullable in Details page (#50719)
- ``ab_user`` table (#50343)
- ``owner_links`` field to ``DAGDetailsResponse`` for enhanced owner metadata in the API (#50557)

Doc Only Changes
""""""""""""""""

- ``AssetAlias`` for alias in Asset Metadata example (#50768)
- ``schedule_interval`` in tutorial dags (#50947)
- ``PythonOperator`` in tutorial dag (#50962)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``dag_code`` records with no serialized dag (#49478)
- ``dag_code`` and ``serialized_dag`` tables on 3.0 upgrade (#49563)
- ``scheduler_interval`` field on downgrade (#49583)
- ``base_url`` in api server (#49545)
- ``max_active_tis_per_dag`` is not respected by dynamically mapped tasks (#49708)
- ``SimpleAuthManager`` (#49697) (#49866)
- ``bundle_version`` to ``DagRun`` response (#49726)
- ``task_ids`` in ``xcom_pull`` the same as multiple when provided as part of a list (#49692)
- ``DAGModel`` stale and associate bundle on import errors to aid migration from 2.10.5 (#49769)
- ``pip`` with avoiding resolution too deep issues in Python 3.12 (#49853)
- ``BashSensor`` (#49935)
- ``map_index_template`` on task completion (#49809)
- ``ContinuousTimetable`` false triggering when last run ends in future (#45175)
- ``Stats`` (#50088)
- ``TaskGroup`` (#49996)
- ``mapIndex`` to clear the relevant task instances (#50256)

Miscellaneous
"""""""""""""

- ``STRAIGHT_JOIN`` prefix for MySQL query optimization in ``get_sorted_triggers`` (#46303)
- ``sqlalchemy[asyncio]`` extra is in core deps (#49452)
- ``HANDLER_SUPPORTS_TRIGGERER`` (#49370)
- ``gitpython`` as a core dependency (#49537)
- ``@babel/runtime`` from 7.26.0 to 7.27.0 (#49479)
- ``get_current_context`` (#49630)
- ``RunBackfillForm`` (#49609)
- ``backfill_id`` (#49691) (#49716)
- ``SimpleAllAdminMiddleware`` to allow api usage without auth header in request (#49599)
- ``react-router`` and ``react-router-dom`` from 7.4.0 to 7.5.2 (#49742)
- ``root_dag_id`` in dagbag and restore logic (#49668)
- ``airflow.cfg`` files across all containers in default docker-compose.yaml (#49681)
- ``/airflow-core`` (#49512)
- ``apache-airflow`` meta package (#49846)
- ``@task.kuberenetes_cmd`` (#46913)
- ``vite`` from 5.4.17 to 5.4.19 for Airflow UI (#49162) (#50074)
- ``map_index`` filter option to ``GetTICount`` and ``GetTaskStates`` (#49818)
- stats ui endpoint (#49985)
- ``state`` attribute to ``RuntimeTaskInstance`` for easier ``ti.state`` access in Task Context (#50031)
- ``dag_run_conf`` to ``RunBackfillForm`` (#49763)
- ``dateInterval`` validation and error handling (#50072)
- ``Task Instances [{map_index}]`` tab to mapped task details (#50085)

Doc Only Changes
""""""""""""""""
- ``security/api.rst`` (#49675)
- ``max_consecutive_failed_dag_runs`` default value to zero in TaskSDK dag (#49795) (#49803)
- (``example_params_ui_tutorial``) (#49905)

We are proud to announce the General Availability of Apache Airflow 3.0 — the most significant release in the project's history. This version introduces a service-oriented architecture, a stable DAG authoring interface, expanded support for event-driven and ML workflows, and a fully modernized UI built on React. Airflow 3.0 reflects years of community investment and lays the foundation for the next era of scalable, modular orchestration.
Highlights
^^^^^^^^^^
- **Service-Oriented Architecture**: A new Task Execution API and ``airflow api-server`` enable task execution in remote environments with improved isolation and flexibility (AIP-72).
- **Edge Executor**: A new executor that supports distributed, event-driven, and edge-compute workflows (AIP-69), now generally available.
- **Stable Authoring Interface**: DAG authors should now use the new ``airflow.sdk`` namespace to import core DAG constructs like ``@dag``, ``@task``, and ``DAG``.
- **Scheduler-Managed Backfills**: Backfills are now scheduled and tracked like regular DAG runs, with native UI and API support (AIP-78).
- **DAG Versioning**: Airflow now tracks structural changes to DAGs over time, enabling inspection of historical DAG definitions via the UI and API (AIP-66).
- **Asset-Based Scheduling**: The dataset model has been renamed and redesigned as assets, with a new ``@asset`` decorator and cleaner event-driven DAG definition (AIP-74, AIP-75).
- **Support for ML and AI Workflows**: DAGs can now run with ``logical_date=None``, enabling use cases such as model inference, hyperparameter tuning, and non-interval workflows (AIP-83).
- **Removal of Legacy Features**: SLAs, SubDAGs, DAG and XCom pickling, and several internal context variables have been removed. Use the upgrade tools to detect deprecated usage.
- **Split CLI and API Changes**: The CLI has been split into ``airflow`` and ``airflowctl`` (AIP-81), and the REST API now defaults to ``logical_date=None`` when triggering a new DAG run.
- **Modern React UI**: A complete UI overhaul built on React and FastAPI includes version-aware views, backfill management, and improved DAG and task introspection (AIP-38, AIP-84).
- **Migration Tooling**: Use ``ruff`` and ``airflow config update`` to validate DAGs and configurations. Upgrade requires Airflow 2.7 or later and Python 3.9–3.12.
Significant Changes
^^^^^^^^^^^^^^^^^^^
Airflow 3.0 introduces the most significant set of changes since the 2.0 release, including architectural shifts, new execution models, and improvements to DAG authoring and scheduling.
Task Execution API & Task SDK (AIP-72)
""""""""""""""""""""""""""""""""""""""
Airflow now supports a service-oriented architecture, enabling tasks to be executed remotely via a new Task Execution API. This API decouples task execution from the scheduler and introduces a stable contract for running tasks outside of Airflow's traditional runtime environment.
To support this, Airflow introduces the Task SDK — a lightweight runtime environment for running Airflow tasks in external systems such as containers, edge environments, or other runtimes. This lays the groundwork for language-agnostic task execution and brings improved isolation, portability, and extensibility to Airflow-based workflows.
Airflow 3.0 also introduces a new ``airflow.sdk`` namespace that exposes the core authoring interfaces for defining DAGs
and tasks. DAG authors should now import objects like ``DAG``, ``@dag``, and ``@task`` from ``airflow.sdk`` rather than
internal modules. This new namespace provides a stable, forward-compatible interface for DAG authoring across future
versions of Airflow.
Edge Executor (AIP-69)
""""""""""""""""""""""
Airflow 3.0 introduces the Edge Executor as a generally available feature, enabling execution of tasks in distributed or remote compute environments. Designed for event-driven and edge-compute use cases, the Edge Executor integrates with the Task Execution API to support task orchestration beyond the traditional Airflow runtime. This advancement facilitates hybrid and cross-environment orchestration patterns, allowing task workers to operate closer to data or application layers.
Scheduler-Managed Backfills (AIP-78)
""""""""""""""""""""""""""""""""""""
Backfills are now fully managed by the scheduler, rather than being launched as separate command-line jobs. This change unifies backfill logic with regular DAG execution and ensures that backfill runs follow the same scheduling, versioning, and observability models as other DAG runs.
Airflow 3.0 also introduces native UI and REST API support for initiating and monitoring backfills, making them more accessible and easier to integrate into automated workflows. These improvements lay the foundation for smarter, safer historical reprocessing — now available directly through the Airflow UI and API.
DAG Versioning (AIP-66)
"""""""""""""""""""""""
Airflow 3.0 introduces native DAG versioning. DAG structure changes (e.g., renamed tasks, dependency shifts) are now tracked directly in the metadata database. This allows users to inspect historical DAG structures through the UI and API, and lays the foundation for safer backfills, improved observability, and runtime-determined DAG logic.
Note: DAG bundles are not initialized in the triggerer. In practice, this means that triggers cannot come from a
DAG bundle. This is because the triggerer does not deal with changes in trigger code over time, as everything happens
in the main process. Triggers can come from anywhere else on ``sys.path`` instead.
React UI Rewrite (AIP-38, AIP-84)
"""""""""""""""""""""""""""""""""
Airflow 3.0 ships with a completely redesigned user interface built on React and FastAPI. This modern architecture improves responsiveness, enables more consistent navigation across views, and unlocks new UI capabilities — including support for DAG versioning, asset-centric DAG definitions, and more intuitive filtering and search.
The new UI replaces the legacy Flask-based frontend and introduces a foundation for future extensibility and community contributions.
Asset-Based Scheduling & Terminology Alignment (AIP-74, AIP-75)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The concept of Datasets has been renamed to Assets, unifying terminology with common practices in the modern data ecosystem. The internal model has also been reworked to better support future features like asset partitions and validations.
The ``@asset`` decorator and related changes to the DAG parser enable clearer, asset-centric DAG definitions, allowing
Airflow to more naturally support event-driven and data-aware scheduling patterns.
This renaming impacts modules, classes, functions, configuration keys, and internal models. Key changes include:
- ``Dataset`` → ``Asset``
- ``DatasetEvent`` → ``AssetEvent``
- ``DatasetAlias`` → ``AssetAlias``
- ``airflow.datasets.*`` → ``airflow.sdk.*``
- ``airflow.timetables.simple.DatasetTriggeredTimetable`` → ``airflow.timetables.simple.AssetTriggeredTimetable``
- ``airflow.timetables.datasets.DatasetOrTimeSchedule`` → ``airflow.timetables.assets.AssetOrTimeSchedule``
- ``airflow.listeners.spec.dataset.on_dataset_created`` → ``airflow.listeners.spec.asset.on_asset_created``
- ``airflow.listeners.spec.dataset.on_dataset_changed`` → ``airflow.listeners.spec.asset.on_asset_changed``
- ``core.dataset_manager_class`` → ``core.asset_manager_class``
- ``core.dataset_manager_kwargs`` → ``core.asset_manager_kwargs``

Unified Scheduling Field
""""""""""""""""""""""""
Airflow 3.0 removes the legacy ``schedule_interval`` and ``timetable`` parameters. DAGs must now use the unified
``schedule`` field for all time- and event-based scheduling logic. This simplifies DAG definition and improves
consistency across scheduling paradigms.
Updated Scheduling Defaults
"""""""""""""""""""""""""""
Airflow 3.0 changes the default behavior for new DAGs by setting ``catchup_by_default = False`` in the configuration
file. This means DAGs that do not explicitly set ``catchup=...`` will no longer backfill missed intervals by default.
This change reduces confusion for new users and better reflects the growing use of on-demand and event-driven workflows.
The default DAG schedule has been changed from ``@once`` to ``None``.
Restricted Metadata Database Access
"""""""""""""""""""""""""""""""""""
Task code can no longer directly access the metadata database. Interactions with DAG state, task history, or DAG runs must be performed via the Airflow REST API or exposed context. This change improves architectural separation and enables remote execution.
Future Logical Dates No Longer Supported
""""""""""""""""""""""""""""""""""""""""
Airflow no longer supports triggering DAG runs with a logical date in the future. This change aligns with the logical
execution model and removes ambiguity in backfills and event-driven DAGs. Use ``logical_date=None`` to trigger runs with
the current timestamp.
Context Behavior for Asset and Manually Triggered DAGs
""""""""""""""""""""""""""""""""""""""""""""""""""""""
For DAG runs triggered by an Asset event or through the REST API without specifying a logical_date, Airflow now sets
logical_date=None by default. These DAG runs do not have a data interval, and attempting to access
data_interval_start, data_interval_end, or logical_date from the task context will raise a KeyError.
DAG authors should use dag_run.logical_date and perform appropriate checks or fallbacks if supporting multiple
trigger types. This change improves consistency with event-driven semantics but may require updates to existing DAGs
that assume these values are always present.
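For instance, a minimal fallback pattern could look like the following sketch. It is pure Python: context stands in for the Airflow task context, and resolve_run_timestamp is a hypothetical helper name:

```python
from datetime import datetime, timezone


def resolve_run_timestamp(context):
    """Return dag_run.logical_date when set, else the current UTC time.

    Hypothetical helper: `context` is any mapping shaped like the Airflow
    task context, whose "dag_run" entry exposes a `logical_date` attribute
    that is None for asset-triggered and manual runs in Airflow 3.
    """
    logical_date = getattr(context.get("dag_run"), "logical_date", None)
    return logical_date if logical_date is not None else datetime.now(timezone.utc)
```

Tasks that support both scheduled and event-driven runs can call such a helper instead of reading logical_date from the context directly.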
Improved Callback Behavior
""""""""""""""""""""""""""
Airflow 3.0 refines task callback behavior to improve clarity and consistency. In particular, on_success_callback is
no longer executed when a task is marked as SKIPPED, aligning it more closely with expected semantics.
Updated Default Configuration
"""""""""""""""""""""""""""""
Several default configuration values have been updated in Airflow 3.0 to better reflect modern usage patterns and simplify onboarding:

- catchup_by_default is now set to False by default. DAGs will not automatically backfill unless explicitly configured to do so.
- create_cron_data_intervals is now set to False by default. As a result, cron expressions will be interpreted using the CronTriggerTimetable instead of the legacy CronDataIntervalTimetable.
- SimpleAuthManager is now the default auth_manager. To continue using Flask AppBuilder-based authentication, install the apache-airflow-providers-fab provider and explicitly set auth_manager = airflow.providers.fab.auth_manager.FabAuthManager.

These changes represent the most significant evolution of the Airflow platform since the release of 2.0, setting the stage for more scalable, event-driven, and language-agnostic orchestration in the years ahead.
Executor & Scheduler Updates
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Airflow 3.0 introduces several important improvements and behavior changes in how DAGs and tasks are scheduled, prioritized, and executed.
Standalone DAG Processor Required
"""""""""""""""""""""""""""""""""
Airflow 3.0 now requires the standalone DAG processor to parse DAGs. This dedicated process improves scheduler performance, isolation, and observability. It also simplifies architecture by clearly separating DAG parsing from scheduling logic. This change may affect custom deployments that previously used embedded DAG parsing.
Priority Weight Capped by Pool Slots
""""""""""""""""""""""""""""""""""""
The priority_weight value on a task is now capped by the number of available pool slots. This ensures that resource
availability remains the primary constraint in task execution order, preventing high-priority tasks from starving others
when resource contention exists.
Teardown Task Handling During DAG Termination
"""""""""""""""""""""""""""""""""""""""""""""
Teardown tasks will now be executed even when a DAG run is terminated early. This ensures that cleanup logic is respected, improving reliability for workflows that use teardown tasks to manage ephemeral infrastructure, temporary files, or downstream notifications.
Improved Scheduler Fault Tolerance
""""""""""""""""""""""""""""""""""
Scheduler components now use run_with_db_retries to handle transient database issues more gracefully. This enhances
Airflow's fault tolerance in high-volume environments and reduces the likelihood of scheduler restarts due to temporary
database connection problems.
Mapped Task Stats Accuracy
""""""""""""""""""""""""""
Airflow 3.0 fixes a bug that caused incorrect task statistics to be reported for dynamic task mapping. Stats now accurately reflect the number of mapped task instances and their statuses, improving observability and debugging for dynamic workflows.
SequentialExecutor has been removed
"""""""""""""""""""""""""""""""""""""""
SequentialExecutor was primarily used for local testing but is now redundant, as LocalExecutor
supports SQLite with WAL mode and provides better performance with parallel execution.
Users should switch to LocalExecutor or CeleryExecutor as alternatives.
DAG Authoring Enhancements
^^^^^^^^^^^^^^^^^^^^^^^^^^
Airflow 3.0 includes several changes that improve consistency, clarity, and long-term stability for DAG authors.
New Stable DAG Authoring Interface: airflow.sdk
"""""""""""""""""""""""""""""""""""""""""""""""""""
Airflow 3.0 introduces a new, stable public API for DAG authoring under the airflow.sdk namespace,
available via the apache-airflow-task-sdk package.
The goal of this change is to decouple DAG authoring from Airflow internals (Scheduler, API Server, etc.), providing a forward-compatible, stable interface for writing and maintaining DAGs across Airflow versions.
DAG authors should now import core constructs from airflow.sdk rather than internal modules.
Key Imports from airflow.sdk:
Classes:

- Asset
- BaseNotifier
- BaseOperator
- BaseOperatorLink
- BaseSensorOperator
- Connection
- Context
- DAG
- EdgeModifier
- Label
- ObjectStoragePath
- Param
- TaskGroup
- Variable

Decorators and Functions:

- @asset
- @dag
- @setup
- @task
- @task_group
- @teardown
- chain
- chain_linear
- cross_downstream
- get_current_context
- get_parsing_context

For an exhaustive list of available classes, decorators, and functions, check airflow.sdk.__all__.
All DAGs should update imports to use airflow.sdk instead of referencing internal Airflow modules directly.
Legacy import paths (e.g., airflow.models.dag.DAG, airflow.decorators.task) are deprecated and
will be removed in a future Airflow version. Some additional utilities and helper functions
that DAGs sometimes use from airflow.utils.* and others will be progressively migrated to the Task SDK in future
minor releases.
These future changes aim to complete the decoupling of DAG authoring constructs
from internal Airflow services. DAG authors should expect continued improvements
to airflow.sdk with no backwards-incompatible changes to existing constructs.
For example, update:

.. code-block:: python

    # Old (Airflow 2.x)
    from airflow.models import DAG
    from airflow.decorators import task

    # New (Airflow 3.x)
    from airflow.sdk import DAG, task
Renamed Parameter: fail_stop → fail_fast
"""""""""""""""""""""""""""""""""""""""""""""""""
The DAG argument fail_stop has been renamed to fail_fast for improved clarity. This parameter controls whether a
DAG run should immediately stop execution when a task fails. DAG authors should update any code referencing
fail_stop to use the new name.
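For code that must run on both Airflow 2.x and 3.x during a migration window, one hedged approach is a small version-aware shim (a hypothetical helper, not an Airflow API):

```python
def dag_failure_kwarg(stop_on_first_failure: bool, airflow_major_version: int) -> dict:
    """Hypothetical shim: return {'fail_fast': ...} for Airflow 3+ and
    {'fail_stop': ...} for Airflow 2, suitable for splatting into DAG(**kwargs)."""
    key = "fail_fast" if airflow_major_version >= 3 else "fail_stop"
    return {key: stop_on_first_failure}


# e.g. DAG("my_dag", **dag_failure_kwarg(True, airflow_major_version=3))
```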
Context Cleanup and Parameter Removal
"""""""""""""""""""""""""""""""""""""
Several legacy context variables have been removed or may no longer be available in certain types of DAG runs, including:

- conf
- execution_date
- dag_run.external_trigger

In asset-triggered and manually triggered DAG runs with logical_date=None, data interval fields such as
data_interval_start and data_interval_end may not be present in the task context. DAG authors should use
explicit references such as dag_run.logical_date and conditionally check for the presence of interval-related fields
where applicable.
Task Context Utilities Moved
""""""""""""""""""""""""""""
Internal task context functions such as get_parsing_context have been moved to a more appropriate location (e.g.,
airflow.sdk.get_parsing_context). DAG authors using these utilities directly should update import paths accordingly.
Trigger Rule Restrictions
"""""""""""""""""""""""""
The TriggerRule.ALWAYS rule can no longer be used with teardown tasks or tasks that are expected to honor upstream
dependency semantics. DAG authors should ensure that teardown logic is defined with the appropriate trigger rules for
consistent task resolution behavior.
Asset Aliases for Reusability
"""""""""""""""""""""""""""""
A new utility function, create_asset_aliases(), allows DAG authors to define reusable aliases for frequently
referenced Assets. This improves modularity and reuse across DAG files and is particularly helpful for teams adopting
asset-centric DAGs.
Operator Links interface changed
""""""""""""""""""""""""""""""""
Operator Extra Links, which can be defined either via plugins or custom operators, no longer execute user code in the Airflow UI. Instead, the "full" links are pushed to the XCom backend, and the link is fetched from the XCom backend when viewing task details, for example from the Grid view.
Example for users with custom links class:

.. code-block:: python

    @attr.s(auto_attribs=True)
    class CustomBaseIndexOpLink(BaseOperatorLink):
        """Custom Operator Link for Google BigQuery Console."""

        index: int = attr.ib()

        @property
        def name(self) -> str:
            return f"BigQuery Console #{self.index + 1}"

        @property
        def xcom_key(self) -> str:
            return f"bigquery_{self.index + 1}"

        def get_link(self, operator, *, ti_key):
            search_query = XCom.get_one(
                task_id=ti_key.task_id, dag_id=ti_key.dag_id, run_id=ti_key.run_id, key="search_query"
            )
            return f"https://console.cloud.google.com/bigquery?j={search_query}"
The link class defines an xcom_key, which determines how the link is stored in the XCom backend: the key is the value of
xcom_key and the value is the entire link, in this case https://console.cloud.google.com/bigquery?j=search
Plugins no longer support adding executors, operators & hooks
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Operators (including Sensors), Executors & Hooks can no longer be registered or imported via Airflow's plugin mechanism. These classes are treated as plain Python classes by Airflow, so there is no need to register them with Airflow. They can be imported directly from their respective provider packages.
Before:

.. code-block:: python

    from airflow.hooks.my_plugin import MyHook
You should instead import it as:

.. code-block:: python

    from my_plugin import MyHook
Support for ML & AI Use Cases (AIP-83)
""""""""""""""""""""""""""""""""""""""
Airflow 3.0 expands the types of DAGs that can be expressed by removing the constraint that each DAG run must correspond to a unique data interval. This change, introduced in AIP-83, enables support for workflows that don't operate on a fixed schedule — such as model training, hyperparameter tuning, and inference tasks.
These ML- and AI-oriented DAGs often run ad hoc, are triggered by external systems, or need to execute multiple times
with different parameters over the same dataset. By allowing multiple DAG runs with logical_date=None, Airflow now
supports these scenarios natively without requiring workarounds.
Config & Interface Changes
^^^^^^^^^^^^^^^^^^^^^^^^^^
Airflow 3.0 introduces several configuration and interface updates that improve consistency, clarify ownership of core utilities, and remove legacy behaviors that were no longer aligned with modern usage patterns.
Default Value Handling
""""""""""""""""""""""
Airflow no longer silently updates configuration options that retain deprecated default values. Users are now required to explicitly set any config values that differ from the current defaults. This change improves transparency and prevents unintentional behavior changes during upgrades.
Refactored Config Defaults
""""""""""""""""""""""""""
Several configuration defaults have changed in Airflow 3.0 to better reflect modern usage patterns:

- catchup_by_default is now False. DAGs will not backfill missed intervals unless explicitly configured to do so.
- create_cron_data_intervals is now False. Cron expressions are now interpreted using the CronTriggerTimetable instead of the legacy CronDataIntervalTimetable. This change simplifies interval logic and aligns with the future direction of Airflow's scheduling system.

Refactored Internal Utilities
"""""""""""""""""""""""""""""
Several core components have been moved to more intuitive or stable locations:

- The SecretsMasker class has been relocated to airflow.sdk.execution_time.secrets_masker.
- The ObjectStoragePath utility previously located under airflow.io is now available via airflow.sdk.

These changes simplify imports and reflect broader efforts to stabilize utility interfaces across the Airflow codebase.
Improved inlet_events, outlet_events, and triggering_asset_events
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Asset event mappings in the task context are improved to better support asset use cases, including new features introduced in AIP-74.
Events of an asset or asset alias are now accessed directly by a concrete object to avoid ambiguity. Using a str to access events is
no longer supported. Use an Asset or AssetAlias object, or Asset.ref to refer to an entity explicitly instead, such as::
    outlet_events[Asset.ref(name="myasset")]  # Get events for asset named "myasset".
    outlet_events[AssetAlias(name="myalias")]  # Get events for asset alias named "myalias".
Alternatively, two helpers for_asset and for_asset_alias are added as shortcuts::
    outlet_events.for_asset(name="myasset")  # Get events for asset named "myasset".
    outlet_events.for_asset_alias(name="myalias")  # Get events for asset alias named "myalias".
The internal representation of asset event triggers now also includes an explicit uri field, simplifying traceability and
aligning with the broader asset-aware execution model introduced in Airflow 3.0. DAG authors interacting directly with
inlet_events may need to update logic that assumes the previous structure.
Behaviour change in xcom_pull
"""""""""""""""""""""""""""""""""
In Airflow 2, the xcom_pull() method allowed pulling XComs by key without specifying task_ids, despite the fact that the underlying
DB model defines task_id as part of the XCom primary key. This created ambiguity: if two tasks pushed XComs with the same key,
xcom_pull() would pull whichever one happened to be first, leading to unpredictable behavior.
Airflow 3 resolves this inconsistency by requiring task_ids when pulling by key. This change aligns with the task-scoped nature of
XComs as defined by the schema, ensuring predictable and consistent behavior.
DAG authors should update their DAGs to pass task_ids if they previously used xcom_pull without it, such as::

    kwargs["ti"].xcom_pull(key="key")
Should be updated to::
    kwargs["ti"].xcom_pull(task_ids="task1", key="key")
Removed Configuration Keys
""""""""""""""""""""""""""
As part of the deprecation cleanup, several legacy configuration options have been removed. These include:

- [scheduler] allow_trigger_in_future
- [scheduler] use_job_schedule
- [scheduler] use_local_tz
- [scheduler] processor_poll_interval
- [logging] dag_processor_manager_log_location
- [logging] dag_processor_manager_log_stdout
- [logging] log_processor_filename_template

All the webserver configurations have also been removed, since the API server now replaces the webserver, so configurations like the following have no effect:

- [webserver] allow_raw_html_descriptions
- [webserver] cookie_samesite
- [webserver] error_logfile
- [webserver] access_logformat
- [webserver] web_server_master_timeout

Several configuration options previously located under the [webserver] section have
been moved to the new [api] section. The following configuration keys have been moved:

- [webserver] web_server_host → [api] host
- [webserver] web_server_port → [api] port
- [webserver] workers → [api] workers
- [webserver] web_server_worker_timeout → [api] worker_timeout
- [webserver] web_server_ssl_cert → [api] ssl_cert
- [webserver] web_server_ssl_key → [api] ssl_key
- [webserver] access_logfile → [api] access_logfile

The following DAG parsing configuration options were moved to the new [dag_processor] section:

- [core] dag_file_processor_timeout → [dag_processor] dag_file_processor_timeout
- [scheduler] parsing_processes → [dag_processor] parsing_processes
- [scheduler] file_parsing_sort_mode → [dag_processor] file_parsing_sort_mode
- [scheduler] max_callbacks_per_loop → [dag_processor] max_callbacks_per_loop
- [scheduler] min_file_process_interval → [dag_processor] min_file_process_interval
- [scheduler] stale_dag_threshold → [dag_processor] stale_dag_threshold
- [scheduler] print_stats_interval → [dag_processor] print_stats_interval

Users should review their airflow.cfg files or use the airflow config lint command to identify outdated or
removed options.
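Because these moves are mechanical, a migration script can translate them. The sketch below is hypothetical helper code, and its MOVED_KEYS table only transcribes the [webserver] → [api] entries listed above, not every relocated option:

```python
# Mapping transcribed from the [webserver] → [api] moves listed above
# (illustrative subset, not a complete inventory of relocated options).
MOVED_KEYS = {
    ("webserver", "web_server_host"): ("api", "host"),
    ("webserver", "web_server_port"): ("api", "port"),
    ("webserver", "workers"): ("api", "workers"),
    ("webserver", "web_server_worker_timeout"): ("api", "worker_timeout"),
    ("webserver", "web_server_ssl_cert"): ("api", "ssl_cert"),
    ("webserver", "web_server_ssl_key"): ("api", "ssl_key"),
    ("webserver", "access_logfile"): ("api", "access_logfile"),
}


def new_location(section: str, key: str) -> tuple:
    """Return the (section, key) this option lives under in Airflow 3,
    or the input unchanged if it was not relocated."""
    return MOVED_KEYS.get((section, key), (section, key))
```

Running airflow config update performs this kind of rewrite for you; the table is shown only to make the relocation explicit.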
Upgrade Tooling
"""""""""""""""
Airflow 3.0 includes improved support for upgrade validation. Use the following tools to proactively catch incompatible configs or deprecated usage patterns:

- airflow config lint: Identifies removed or invalid config keys
- ruff check --select AIR30 --preview: Flags removed interfaces and common migration issues

CLI & API Changes
^^^^^^^^^^^^^^^^^
Airflow 3.0 introduces changes to both the CLI and REST API interfaces to better align with service-oriented deployments and event-driven workflows.
Split CLI Architecture (AIP-81)
"""""""""""""""""""""""""""""""
The Airflow CLI has been split into two distinct interfaces:

- The airflow CLI now handles only local functionality (e.g., airflow tasks test, airflow dags list).
- Remote functionality is provided by airflowctl, distributed via the apache-airflow-client package.

This change improves security and modularity for deployments that use Airflow in a distributed or API-first context.
REST API v2 replaces v1
"""""""""""""""""""""""
The legacy REST API v1, previously built with Connexion and Marshmallow, has been replaced by a modern FastAPI-based REST API v2.
This new implementation improves performance, aligns more closely with web standards, and provides a consistent developer experience across the API and UI.
Key changes include stricter validation (422 errors instead of 400), the removal of the execution_date parameter in favor of logical_date, and more consistent query parameter handling.
The v2 API is now the stable, fully supported interface for programmatic access to Airflow, and also powers the new UI - achieving full feature parity between the UI and API.
For details, see the :doc:`Airflow REST API v2 </stable-rest-api-ref>` documentation.
REST API: DAG Trigger Behavior Updated
""""""""""""""""""""""""""""""""""""""
The behavior of the POST /dags/{dag_id}/dagRuns endpoint has changed. If a logical_date is not explicitly
provided when triggering a DAG via the REST API, it now defaults to None.
This aligns with event-driven DAGs and manual runs in Airflow 3.0, but may break backward compatibility with scripts or
tools that previously relied on Airflow auto-generating a timestamped logical_date.
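For illustration, a request body under the new default might look like this (the run id and conf values are invented; consult your deployment's API reference for the exact endpoint path):

```python
import json

# Illustrative request body for POST .../dags/{dag_id}/dagRuns. Omitting
# "logical_date" (or sending null) now produces a run with logical_date=None;
# scripts that relied on an auto-generated timestamp must opt in explicitly.
payload = {
    "dag_run_id": "manual__2025-01-01T00:00:00",  # illustrative run id
    "logical_date": None,  # explicit: this run has no data interval
    "conf": {"triggered_by": "external-system"},  # optional run configuration
}
body = json.dumps(payload)
```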
Removed CLI Flags and Commands
""""""""""""""""""""""""""""""
Several deprecated CLI arguments and commands that were marked for removal in earlier versions have now been cleaned up
in Airflow 3.0. Run airflow --help to review the current set of available commands and arguments.
The deprecated --ignore-depends-on-past CLI option has been replaced by --depends-on-past ignore.
The --tree flag for the airflow tasks list command has been removed. The output produced with that flag can be
expensive to generate and extremely large, depending on the DAG. airflow dags show is a better way to
visualize the relationship of tasks in a DAG.
The dag_id argument of the dags list-runs CLI command has changed from a flag (-d, --dag-id) to a positional argument.
The airflow db init and airflow db upgrade commands have been removed. Use airflow db migrate instead
to initialize or migrate the metadata database. If you would like to create default connections use
airflow connections create-default-connections.
The airflow api-server command has replaced the airflow webserver CLI command.
Provider Refactor & Standardization
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Airflow 3.0 completes the migration of several core operators, sensors, hooks, and triggers into the new
apache-airflow-providers-standard package. This package now includes commonly used components such as:

- PythonOperator, BashOperator
- ExternalTaskSensor, FileSensor
- ShortCircuitOperator, LatestOnlyOperator
- SubprocessHook, FilesystemHook
- DateTimeTrigger, TimeDeltaTrigger, FileTrigger

These operators, sensors, hooks, and triggers were previously bundled inside airflow-core but are now treated as provider-managed components to
improve modularity, testability, and lifecycle independence.
This change enables more consistent versioning across providers and prepares Airflow for a future where all integrations — including "standard" ones — follow the same interface model.
To maintain compatibility with existing DAGs, the apache-airflow-providers-standard package is installable on both
Airflow 2.x and 3.x. Users upgrading from Airflow 2.x are encouraged to begin updating import paths and testing provider
installation in advance of the upgrade.
Legacy imports such as airflow.operators.python.PythonOperator are deprecated and will be removed soon. They should be
replaced with:

.. code-block:: python

    from airflow.providers.standard.operators.python import PythonOperator
The SimpleHttpOperator has been migrated to apache-airflow-providers-http and renamed to HttpOperator.
UI & Usability Improvements
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Airflow 3.0 introduces a modernized user experience that complements the new React-based UI architecture (see Significant Changes). Several areas of the interface have been enhanced to improve visibility, consistency, and navigability.
New Home Page
"""""""""""""
The Airflow Home page now provides a high-level operational overview of your environment. It includes health checks for core components (Scheduler, Triggerer, DAG Processor), summary stats for DAG and task instance states, and a real-time feed of asset-triggered events. This view helps users quickly identify pipeline health, recent activity, and potential failures.
Unified DAG List View
"""""""""""""""""""""
The DAG List page has been refreshed with a cleaner layout and improved responsiveness. Users can browse DAGs by name, tags, or owners. While full-text search has not yet been integrated, filters and navigation have been refined for clarity in large deployments.
Version-Aware Graph and Grid Views
""""""""""""""""""""""""""""""""""
The Graph and Grid views now display task information in the context of the DAG version that was used at runtime. This improves traceability for DAGs that evolve over time and provides more accurate debugging of historical runs.
Expanded DAG Graph Visualization
""""""""""""""""""""""""""""""""
The Graph view now supports visualizing the full chain of asset and task dependencies, including assets consumed or produced across DAG boundaries. This allows users to inspect upstream and downstream lineage in a unified view, making it easier to trace data flows, debug triggering behavior, and understand conditional dependencies between assets and tasks.
DAG Code View
"""""""""""""
The "Code" tab now displays the exact DAG source as parsed by the scheduler for the selected DAG version. This allows users to inspect the precise code that was executed, even for historical runs, and helps debug issues related to versioned DAG changes.
Improved Task Log Access
""""""""""""""""""""""""
Task log access has been streamlined across views. Logs are now easier to access from both the Grid and Task Instance pages, with cleaner formatting and reduced visual noise.
Enhanced Asset and Backfill Views
"""""""""""""""""""""""""""""""""
New UI components support asset-centric DAGs and backfill workflows.
These improvements make Airflow more accessible to operators, data engineers, and stakeholders working across both time-based and event-driven workflows.
Deprecations & Removals
^^^^^^^^^^^^^^^^^^^^^^^
A number of deprecated features, modules, and interfaces have been removed in Airflow 3.0, completing long-standing migrations and cleanups.
Users are encouraged to review the following removals to ensure compatibility:
SubDag support has been removed entirely, including the SubDagOperator and the related CLI and API interfaces. TaskGroups are now the recommended alternative for nested DAG structures.
SLAs have been removed: The legacy SLA feature, including SLA callbacks and metrics, has been removed. A more flexible replacement mechanism, DeadlineAlerts, is planned for a future version of Airflow. Users who relied on SLA-based notifications should consider implementing custom alerting using task-level success/failure hooks or external monitoring integrations.
Pickling support has been removed: All legacy features related to DAG pickling have been fully removed. This includes the PickleDag CLI/API, as well as implicit behaviors around store_serialized_dags = False. DAGs must now be serialized using the JSON-based serialization system. Ensure any custom Python objects used in DAGs are JSON-serializable.
Context parameter cleanup: Several previously available context variables have been removed from the task execution context, including conf, execution_date, and dag_run.external_trigger. These values are either no longer applicable or have been renamed (e.g., use dag_run.logical_date instead of execution_date). DAG authors should ensure that templated fields and Python callables do not reference these deprecated keys.
Deprecated core imports have been fully removed. Any use of airflow.operators.*, airflow.hooks.*, or similar legacy import paths should be updated to import from their respective providers.
Configuration cleanup: Several legacy config options have been removed, including:

- scheduler.allow_trigger_in_future: DAG runs can no longer be triggered with a future logical date. Use logical_date=None instead.
- scheduler.use_job_schedule and scheduler.use_local_tz have also been removed. These options were deprecated and no longer had any effect.

Deprecated utility methods such as those in airflow.utils.helpers, airflow.utils.process_utils, and airflow.utils.timezone have been removed. Equivalent functionality can now be found in the standard Python library or Airflow provider modules.
Removal of deprecated CLI flags and behavior: Several CLI entrypoints and arguments that were marked for removal in earlier versions have been cleaned up.
To assist with the upgrade, tools like ruff (e.g., rule AIR302) and airflow config lint can help identify
obsolete imports and configuration keys. These utilities are recommended for locating and resolving common
incompatibilities during migration. Please see the :doc:`Upgrade Guide <installation/upgrading_to_airflow3>` for more
information.
Summary of Removed Features
"""""""""""""""""""""""""""
The following table summarizes user-facing features removed in 3.0 and their recommended replacements. Not all of these are called out individually above.
+-------------------------------------------+----------------------------------------------------------+
| Feature | Replacement / Notes |
+===========================================+==========================================================+
| SubDagOperator / SubDAGs | Use TaskGroups |
+-------------------------------------------+----------------------------------------------------------+
| SLA callbacks / metrics | Deadline Alerts (planned post-3.0) |
+-------------------------------------------+----------------------------------------------------------+
| DAG Pickling | Use JSON serialization; pickling is no longer supported |
+-------------------------------------------+----------------------------------------------------------+
| Xcom Pickling | Use custom Xcom backend; pickling is no longer supported |
+-------------------------------------------+----------------------------------------------------------+
| execution_date context var | Use dag_run.logical_date |
+-------------------------------------------+----------------------------------------------------------+
| conf and dag_run.external_trigger | Removed from context; use DAG params or dag_run APIs |
+-------------------------------------------+----------------------------------------------------------+
| Core EmailOperator | Use EmailOperator from the smtp provider |
+-------------------------------------------+----------------------------------------------------------+
| none_failed_or_skipped rule | Use none_failed_min_one_success |
+-------------------------------------------+----------------------------------------------------------+
| dummy trigger rule | Use always |
+-------------------------------------------+----------------------------------------------------------+
| fail_stop argument | Use fail_fast |
+-------------------------------------------+----------------------------------------------------------+
| store_serialized_dags=False | DAGs are always serialized; config has no effect |
+-------------------------------------------+----------------------------------------------------------+
| Deprecated core imports | Import from appropriate provider package |
+-------------------------------------------+----------------------------------------------------------+
| SequentialExecutor & DebugExecutor        | Use LocalExecutor for testing                            |
+-------------------------------------------+----------------------------------------------------------+
| .airflowignore regex | Uses glob syntax by default |
+-------------------------------------------+----------------------------------------------------------+
Migration Tooling & Upgrade Process
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Airflow 3 was designed with migration in mind. Many Airflow 2 DAGs will work without changes, especially if deprecation
warnings were addressed in earlier releases. To support the upgrade, Airflow 3 includes validation tools such as ruff
and airflow config update, as well as a simplified startup model.
For a step-by-step upgrade process, see the :doc:`Upgrade Guide <installation/upgrading_to_airflow3>`.
Minimum Supported Versions
""""""""""""""""""""""""""
To upgrade to Airflow 3.0, you must be running Airflow 2.7 or later.
Airflow 3.0 supports Python 3.9, 3.10, 3.11, and 3.12.
Earlier versions of Airflow or Python are not supported due to architectural changes and updated dependency requirements.
DAG Compatibility Checks
""""""""""""""""""""""""
Airflow now includes a Ruff-based linter with custom rules to detect DAG patterns and interfaces that are no longer
compatible with Airflow 3.0. These checks are packaged under the AIR30x rule series. Example usage:

.. code-block:: bash

    ruff check dags/ --select AIR301 --preview
    ruff check dags/ --select AIR301 --fix --preview
These checks can automatically fix many common issues such as renamed arguments, removed imports, or legacy context variable usage.
Configuration Migration
"""""""""""""""""""""""
Airflow 3.0 introduces a new utility to validate and upgrade your Airflow configuration file:

.. code-block:: bash

    airflow config update
    airflow config update --fix
This utility detects removed or deprecated configuration options and, if desired, updates them in-place.
Additional validation is available via:

.. code-block:: bash

    airflow config lint
This command surfaces obsolete configuration keys and helps align your environment with Airflow 3.0 requirements.
Metadata Database Upgrade
"""""""""""""""""""""""""
As with previous major releases, the Airflow 3.0 upgrade includes schema changes to the metadata database. Before upgrading, it is strongly recommended that you back up your database and optionally run:

.. code-block:: bash

    airflow db clean
to remove old task instance, log, or XCom data. To apply the new schema:

.. code-block:: bash

    airflow db migrate
Startup Behavior Changes
""""""""""""""""""""""""
Airflow components are now started explicitly. For example:

.. code-block:: bash

    airflow api-server     # Replaces airflow webserver
    airflow dag-processor  # Required in all environments
These changes reflect Airflow's new service-oriented architecture.
Resources
^^^^^^^^^

- :doc:`Upgrade Guide <installation/upgrading_to_airflow3>`
- `Airflow AIPs <https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals>`_

Airflow 3.0 represents more than a year of collaboration across hundreds of contributors and dozens of organizations. We thank everyone who helped shape this release through design discussions, code contributions, testing, documentation, and community feedback. For full details, migration guidance, and upgrade best practices, refer to the official Upgrade Guide and join the conversation on the Airflow dev and user mailing lists.
Significant Changes
^^^^^^^^^^^^^^^^^^^
Python 3.9 support removed
""""""""""""""""""""""""""
Support for Python 3.9 has been removed, as it has reached end-of-life. Airflow 2.11.1 requires Python 3.10, 3.11, or 3.12. Note that this is unusual to remove Python version support in patch-level release of Airflow, but since Python 3.9 is already end-of-life, many libraries do not support it any more, and Airflow 2.11.1 is focused on improving security by upgrading dependencies, so we decided to remove Python 3.9 support in this patch release, to improve security of the release. Python 3.10 and 3.11 had almost no backward-incompatible changes, so you should be able to upgrade to Python 3.10 or 3.11 easily. If you were using Python 3.9 before, it is recommended to first upgrade Python version in existing installation and then upgrade to Airflow 2.11.1.
Publishing timer and timing metrics in seconds is now deprecated
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

In Airflow 3.0, the ``timer_unit_consistency`` setting in the ``metrics`` section will be
enabled by default and the setting itself will be removed. This standardizes all timer and timing metrics to
milliseconds across all metric loggers.
Users integrating with Datadog, OpenTelemetry, or other metric backends should enable this setting. Users of
StatsD are not affected by this change.
If you need backward compatibility, you can leave this setting disabled temporarily, but enabling
``timer_unit_consistency`` is encouraged to future-proof your metrics setup. (#39908)
Retrieving historical log templates is disabled in Airflow 2.11.1
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

When you change the log template in Airflow 2.11.1, historical log templates are no longer retrieved. This means that existing logs generated under a different log template will not be accessible via the new log template.
This change addresses a potential security issue: retrieving historical log templates allowed Dag authors to execute arbitrary code in the webserver when logs were retrieved. By disabling the retrieval of historical log templates, Airflow 2.11.1 closes this avenue for deployments where such code execution matters.
Users who need to access historical logs generated with a different log template will need to manually rename their historical log files to match the latest log template configured in the Airflow configuration, or they can set the ``core.use_historical_filename_templates`` configuration option to ``True`` to re-enable the retrieval of historical log templates, if they accept that Dag authors may then be able to execute arbitrary code in the webserver when retrieving logs. (#61880)
Updated dependencies
""""""""""""""""""""

Airflow 2.11.1 updates a number of dependencies, including connexion, Flask-Session, and Werkzeug, that could not be upgraded before because no versions compatible with Airflow 2.11.0 existed. We worked together with the community to update them. Many thanks to the connexion team and a number of community members for helping with the updates, which allowed us to move to newer versions and drop dependency versions with known security vulnerabilities. (#51681)
Bug fixes
"""""""""

Significant Changes
^^^^^^^^^^^^^^^^^^^
DeltaTriggerTimetable for trigger-based scheduling (#47074)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
This change introduces DeltaTriggerTimetable, a new built-in timetable that complements the existing suite of Airflow timetables by supporting delta-based trigger schedules without relying on data intervals.
Airflow currently has two major types of timetables:
- Data interval-based (e.g., CronDataIntervalTimetable, DeltaDataIntervalTimetable)
- Trigger-based (e.g., CronTriggerTimetable)
However, there was no equivalent trigger-based option for delta intervals like timedelta(days=1).
As a result, even simple schedules like ``schedule=timedelta(days=1)`` were interpreted through a data interval
lens, adding unnecessary complexity for users who don't care about upstream/downstream data dependencies.
This feature is backported to Airflow 2.11.0 to help users begin transitioning before upgrading to Airflow 3.0.
- In Airflow 2.11, ``schedule=timedelta(...)`` still defaults to ``DeltaDataIntervalTimetable``.
- A new config option ``[scheduler] create_delta_data_intervals`` (default: ``True``) allows opting in to ``DeltaTriggerTimetable``.
- In Airflow 3.0, this config defaults to ``False``, meaning ``DeltaTriggerTimetable`` becomes the default for timedelta schedules.
By flipping this config in 2.11, users can preview and adopt the new scheduling behavior in advance, minimizing surprises during upgrade.
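The difference between the two semantics can be sketched without Airflow internals. The following is an illustrative plain-Python sketch, not Airflow's actual timetable code; the helper names are hypothetical:

```python
from datetime import datetime, timedelta

def next_trigger_run(last_run, delta, now):
    """Trigger-based semantics: the next run simply fires at last_run + delta.
    No data interval is attached (illustrative simplification)."""
    if last_run is None:
        return now  # assume the first run fires immediately
    return last_run + delta

def next_data_interval_run(last_interval_start, delta):
    """Data-interval semantics: the run with logical date T covers [T, T + delta)
    and only executes after the whole interval has elapsed."""
    start = last_interval_start + delta
    end = start + delta
    return start, end  # the run for [start, end) executes at `end`
```

For a ``timedelta(days=1)`` schedule, the trigger-based form fires daily with no interval bookkeeping, which is what ``DeltaTriggerTimetable`` provides.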
Consistent timing metrics across all backends (#39908, #43966)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Previously, Airflow reported timing metrics in milliseconds for StatsD but in seconds for other backends
such as OpenTelemetry and Datadog. This inconsistency made it difficult to interpret or compare
timing metrics across systems.
Airflow 2.11 introduces a new config option:
``[metrics] timer_unit_consistency`` (default: ``False`` in 2.11; always ``True`` and removed in Airflow 3.0). When enabled, all timing metrics are consistently reported in milliseconds, regardless of the backend.
This setting has become mandatory and always True in Airflow 3.0 (the config will be removed), so
enabling it in 2.11 allows users to migrate early and avoid surprises during upgrade.
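The effect of the flag amounts to a unit conversion applied before a timer value reaches the backend. A minimal illustrative sketch (hypothetical helper, not Airflow's internal code):

```python
def timing_for_backend(duration_seconds, timer_unit_consistency):
    """With the flag enabled, every backend receives milliseconds.
    With it disabled, non-StatsD backends historically received seconds."""
    if timer_unit_consistency:
        return duration_seconds * 1000.0
    return duration_seconds

# A 1.5 s task duration is reported as 1500.0 ms once the flag is on.
```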
Ease migration to Airflow 3
"""""""""""""""""""""""""""

This release introduces several changes to help users prepare for upgrading to Airflow 3:

- Models using ``execution_date`` now also include a ``logical_date`` field. Airflow 3 drops ``execution_date`` entirely in favor of ``logical_date`` (#44283)
- ``airflow config lint`` and ``airflow config update`` commands added in 2.11 to help audit and migrate configs for Airflow 3.0. (#45736, #50353, #46757)

Python 3.8 support removed
""""""""""""""""""""""""""

Support for Python 3.8 has been removed, as it has reached end-of-life. Airflow 2.11 requires Python 3.9, 3.10, 3.11, or 3.12.
New Features
""""""""""""

- ``DeltaTriggerTimetable`` (#47074)
- ``airflow config update`` and ``airflow config lint`` changes to ease migration to Airflow 3 (#45736, #50353)

Bug Fixes
"""""""""

- ``ti.log_url`` timestamp format from "%Y-%m-%dT%H:%M:%S%z" to "%Y-%m-%dT%H:%M:%S.%f%z" (#50306)
- ``airflow.cfg`` contains a random ``fernet_key`` and ``secret_key`` (#47755)
- ``rendered_map_index`` via internal api (#49057)
- ``TaskInstancePydantic`` into ``TaskInstance`` (#48571)
- ``log_url`` property on ``TaskInstancePydantic`` (Internal API) (#50560)
- ``TypeError`` when deserializing task with ``execution_timeout`` set to ``None`` (#46822)
- ``check_query_exists`` returns a bool (#46707)
- ``/xcom/list`` got exception when applying filter on the value column (#46053)

Miscellaneous
"""""""""""""

- ``logical_date`` to models using ``execution_date`` (#44283)
- ``BaseOperatorLink.get_link`` signature (#46448)

Doc Only Changes
""""""""""""""""

- ``airflow.cfg`` variable (#48084)
- XCom docs to show examples of pushing multiple XComs (#46284, #47068)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Ensure teardown tasks are executed when DAG run is set to failed (#45530)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Previously, when a DAG run was manually set to the "failed" or "success" state, that terminal state was applied to all of its tasks. This left a gap for DAGs that define setup and teardown tasks: if teardown was used to clean up infrastructure or other resources, it was skipped as well, and resources could stay allocated.
Now, if setup tasks have already executed and the DAG run is manually set to "failed" or "success", the teardown tasks are executed. Teardown tasks are skipped if the setup was also skipped.
As a side effect, if the DAG contains teardown tasks, manually marking the DAG run as "failed" or "success" keeps it in the running state to ensure the teardown tasks are scheduled; they would not be scheduled if the DAG run were set directly to "failed" or "success".
Bug Fixes
"""""""""

- ``trigger_rule=TriggerRule.ALWAYS`` in a task-generated mapping within bare tasks (#44751)
- ONE_DONE) in a mapped task group (#44937)
- ``FileTaskHandler`` only read from default executor (#46000)
- ``skip_if`` and ``run_if`` decorators before TaskFlow virtualenv tasks are run (#41832) (#45680)
- ``rendered_map_index`` (#45109) (#45122)
- ``max_form_parts``, ``max_form_memory_size`` (#46243) (#45749)
- ``execute`` safeguard mechanism (#44646) (#46280)

Miscellaneous
"""""""""""""
- ``conf`` from Task Context (#44993)

Significant Changes
^^^^^^^^^^^^^^^^^^^
TaskInstance priority_weight is capped in 32-bit signed integer ranges (#43611)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Some database engines are limited to 32-bit integer values. As some users reported errors where the
weight rolled over to negative values, we decided to cap the value to the 32-bit integer range. Even
though Python internally supports smaller and larger 64-bit values, ``priority_weight`` is
capped and only stores values from -2147483648 to 2147483647.
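The capping behavior amounts to a simple clamp before storage. An illustrative sketch (hypothetical helper, not Airflow's exact code):

```python
# Bounds of a 32-bit signed integer, the limit some database engines impose.
INT32_MIN = -2147483648
INT32_MAX = 2147483647

def clamp_priority_weight(weight):
    """Cap an arbitrary Python int to the 32-bit signed range before storing it."""
    return max(INT32_MIN, min(INT32_MAX, weight))
```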
Bug Fixes
^^^^^^^^^

- ``trigger_rule="always"`` in a dynamic mapped task (#43810)
- ``trigger_rule=TriggerRule.ALWAYS`` in a task-generated mapping within bare tasks (#44751)

Doc Only Changes
""""""""""""""""

Miscellaneous
"""""""""""""

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- stringified objects to UI via xcom if pickling is active (#42388) (#42486)
- ``selectinload`` instead of ``joinedload`` (#40487) (#42351)
- ``TrySelector`` for Mapped Tasks in Logs and Details Grid Panel (#43566)
- ``scheduler_loop_duration`` (#42886) (#43544)

Miscellaneous
"""""""""""""

- ``dompurify`` from 2.2.9 to 2.5.6 in /airflow/www (#42263) (#42270)
- 4.5.2 (#43309) (#43318)

Doc Only Changes
""""""""""""""""
Significant Changes
^^^^^^^^^^^^^^^^^^^

No significant changes.

Bug Fixes
"""""""""

- ``renderedTemplates`` as keys to skip camelCasing (#42206) (#42208)
- camelcase xcom entries (#42182) (#42187)

Miscellaneous
"""""""""""""

- 0.2.4 as it breaks our integration (#42101)
- ``LibCST`` (#42089)
- ``--tree`` flag for ``tasks list`` cli command (#41965)

Doc Only Changes
""""""""""""""""

- ``security_model.rst`` to clear unauthenticated endpoints exceptions (#42085)

Significant Changes
^^^^^^^^^^^^^^^^^^^

No significant changes.

Bug Fixes
"""""""""

- ``__name__`` (#41699)
- ``tojson`` filter to ``example_inlet_event_extra`` example dag (#41890)

Miscellaneous
"""""""""""""

Doc Only Changes
""""""""""""""""

- keycloak (#41791)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Scarf based telemetry: Airflow now collects telemetry data (#39510)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Airflow integrates Scarf to collect basic usage data during operation. Deployments can opt-out of data collection by
setting the [usage_data_collection]enabled option to False, or the SCARF_ANALYTICS=false environment variable.
Datasets no longer trigger inactive DAGs (#38891)
"""""""""""""""""""""""""""""""""""""""""""""""""

Previously, when a DAG was paused or removed, incoming dataset events would still trigger it, and the DAG would run when it was unpaused or added back in a DAG file. This has been changed; a DAG's dataset schedule can now only be satisfied by events that occur while the DAG is active. While this is a breaking change, the previous behavior is considered a bug.
The behavior of time-based scheduling is unchanged, including the timetable part
of DatasetOrTimeSchedule.
try_number is no longer incremented during task execution (#39336)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Previously, the try number (try_number) was incremented at the beginning of task execution on the worker. This was problematic for many reasons.
For one it meant that the try number was incremented when it was not supposed to, namely when resuming from reschedule or deferral. And it also resulted in
the try number being "wrong" when the task had not yet started. The workarounds for these two issues caused a lot of confusion.
Now, instead, the try number for a task run is determined at the time the task is scheduled, and does not change in flight, and it is never decremented. So after the task runs, the observed try number remains the same as it was when the task was running; only when there is a "new try" will the try number be incremented again.
One consequence of this change is, if users were "manually" running tasks (e.g. by calling ti.run() directly, or command line airflow tasks run),
try number will no longer be incremented. Airflow assumes that tasks are always run after being scheduled by the scheduler, so we do not regard this as a breaking change.
/logout endpoint in FAB Auth Manager is now CSRF protected (#40145)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The /logout endpoint's method in FAB Auth Manager has been changed from GET to POST in all existing
AuthViews (AuthDBView, AuthLDAPView, AuthOAuthView, AuthOIDView, AuthRemoteUserView), and
now includes CSRF protection to enhance security and prevent unauthorized logouts.
OpenTelemetry Traces for Apache Airflow (#37948)
""""""""""""""""""""""""""""""""""""""""""""""""

This new feature adds the capability for Apache Airflow to emit 1) Airflow system traces of the scheduler, triggerer, executor, and processor, and 2) DAG run traces for deployed DAG runs, in OpenTelemetry format. Previously, only metrics were emitted in OpenTelemetry. This feature adds richer data for users who rely on the OpenTelemetry standard to emit and send their trace data to OTLP-compatible endpoints.
Decorators for TaskFlow (``@skip_if``, ``@run_if``) to make it simple to decide whether or not to skip a Task (#41116)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

This feature adds decorators that make it simple to conditionally skip a Task.
Using Multiple Executors Concurrently (#40701)
""""""""""""""""""""""""""""""""""""""""""""""
Previously known as hybrid executors, this new feature allows Airflow to use multiple executors concurrently. DAGs, or even individual tasks, can be configured
to use the specific executor that suits their needs best. A single DAG can contain tasks that all use different executors. Note: this feature is still experimental. See the `documentation on executors <https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/executor/index.html#using-multiple-executors-concurrently>`_ for a more detailed description.
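As a sketch of what such a setup might look like (illustrative values; see the linked documentation for the authoritative syntax), multiple executors are listed in the configuration, with the first acting as the environment default:

.. code-block:: ini

   # airflow.cfg -- illustrative; the first executor listed is the environment default
   [core]
   executor = LocalExecutor,CeleryExecutor

Individual DAGs or tasks can then opt into one of the configured executors, as described in the documentation above.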
New Features
""""""""""""

- (`AIP-61 <https://github.com/apache/airflow/pulls?q=is%3Apr+label%3Aarea%3Ahybrid-executors+is%3Aclosed+milestone%3A%22Airflow+2.10.0%22>`_)
- (`AIP-62 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-62+milestone%3A%22Airflow+2.10.0%22>`_)
- (`AIP-64 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-64+milestone%3A%22Airflow+2.10.0%22>`_)
- (`AIP-44 <https://github.com/apache/airflow/pulls?q=is%3Apr+label%3AAIP-44+milestone%3A%22Airflow+2.10.0%22+is%3Aclosed>`_)
- accessors to read dataset events defined as inlet (#39367)
- ``dag test`` (#40010)
- ``endDate`` in task instance tooltip. (#39547)
- accessors to read dataset events defined as inlet (#39367, #39893)
- ``run_if`` & ``skip_if`` decorators (#41116)

Improvements
""""""""""""

- ``renderedjson`` component (#40964)
- ``get_extra_dejson`` method with ``nested`` parameter which allows you to specify if you want the nested json as string to be also deserialized (#39811)
- ``__getattr__`` to task decorator stub (#39425)
- ``RemovedIn20Warning`` in ``airflow task`` command (#39244)
- ``db migrate`` error messages (#39268)
- ``suppress_and_warn`` warning (#39263)
- ``declarative_base`` from ``sqlalchemy.orm`` instead of ``sqlalchemy.ext.declarative`` (#39134)
- ``on_task_instance_failed`` access to the error that caused the failure (#38155)
- ``output_processor`` parameter to ``BashProcessor`` (#40843)

Bug Fixes
"""""""""

- ``never_fail`` in ``BaseSensor`` (#40915)
- ``start_date`` (#40878)
- ``external_task_group_id`` to ``WorkflowTrigger`` (#39617)
- ``BaseSensorOperator`` introduce ``skip_policy`` parameter (#40924)
- ``__init__`` (#41086)

Miscellaneous
"""""""""""""

- OTel Traces (#40874)
- ``pydocstyle`` rules to pyproject.toml (#40569)
- ``pydocstyle`` rule D213 in ruff. (#40448, #40464)
- ``Dag.test()`` to run with an executor if desired (#40205)
- ``AirflowInternalRuntimeError`` for raise non catchable errors (#38778)
- ``pytest`` to 8.0+ (#39450)
- ``back_populates`` between ``DagScheduleDatasetReference.dag`` and ``DagModel.schedule_dataset_references`` (#39392)
- B028 (no-explicit-stacklevel) in core (#39123)
- ``ImportError`` to ``ParseImportError`` for avoid shadowing with builtin exception (#39116)
- ``SubDagOperator`` examples warnings (#39057)
- ``model_dump`` instead of ``dict`` for serialize Pydantic V2 model (#38933)
- ``hatchling`` to latest version (1.22.5) (#38780)
- ``ws`` from 7.5.5 to 7.5.10 in /airflow/www (#40288)

Doc Only Changes
""""""""""""""""

- ``filesystems`` and ``dataset-uris`` to "how to create your own provider" page (#40801)
- ``otel_on`` to True in example airflow.cfg (#40712)
- ``task_id`` from ``send_email`` to ``send_email_notification`` in taskflow.rst (#41060)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Time unit for scheduled_duration and queued_duration changed (#37936)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The ``scheduled_duration`` and ``queued_duration`` metrics are now emitted in milliseconds instead of seconds.
By convention, all StatsD metrics should be emitted in milliseconds; this is expected downstream, e.g. by the Prometheus statsd-exporter.
Support for OpenTelemetry Metrics is no longer "Experimental" (#40286)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Experimental support for OpenTelemetry was added in 2.7.0. Since then, fixes and improvements have been added, and we now announce the feature as stable.
Bug Fixes
"""""""""

- ``[webserver]update_fab_perms`` to deprecated configs (#40317)
- ``httpx`` to ``requests`` in ``file_task_handler`` (#39799)

Doc Only Changes
""""""""""""""""

Miscellaneous
"""""""""""""

- ``hatchling`` as build dependency (#40387)
- ``SchedulerJobRunner._process_executor_events`` (#40563)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``AirflowSecurityManagerV2`` leave transactions in the idle in transaction state (#39935)
- ``SafeDogStatsdLogger`` to use ``get_validator`` to enable pattern matching (#39370)
- ``has_access`` (#39421)
- ``execution_date`` in ``@apply_lineage`` (#39327)
- ``sql_alchemy_engine_args`` config example (#38971)

Miscellaneous
"""""""""""""

- ``yandex`` provider to avoid mypy errors (#39990)
- ``provider_info_cache`` decorator (#39750)
- ``defer`` (#39742)
- ``idx_last_scheduling_decision`` on ``dag_run`` table (#39275)

Doc Only Changes
""""""""""""""""

- ``CronDataIntervalTimetable`` (#39780)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Stackdriver logging bugfix requires Google provider 10.17.0 or later (#38071)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
If you use Stackdriver logging, you must use Google provider version 10.17.0 or later. Airflow 2.9.1 now passes gcp_log_name to the StackdriverTaskHandler instead of name, and this will fail on earlier provider versions.
This fixes a bug where the log name configured in [logging] remove_base_log_folder was overridden when Airflow configured logging, resulting in task logs going to the wrong destination.
Bug Fixes
"""""""""

- ``href`` for nav bar (#39282)
- firefox (#39261)
- ``log_url`` (#39183)
- UX (#39119)
- ux in react dag page (#39122)
- ``AUTH_ROLE_PUBLIC`` is set in ``check_authentication`` (#39012)
- ``map_index_template`` so it renders for failed tasks as long as it was defined before the point of failure (#38902)
- Undeprecate ``BaseXCom.get_one`` method for now (#38991)
- ``inherit_cache`` attribute for ``CreateTableAs`` custom SA Clause (#38985)
- SAWarning 'Coercing Subquery object into a select() for use in IN()' (#38926)
- cartesian product in ``AirflowSecurityManagerV2`` (#38913)
- ``methodtools.lru_cache`` instead of ``functools.lru_cache`` in class methods (#37757)
- ``airflow dags backfill`` only if ``-I`` / ``--ignore-first-depends-on-past`` provided (#38676)

Miscellaneous
"""""""""""""

- ``TriggerDagRunOperator`` deprecate ``execution_date`` in favor of ``logical_date`` (#39285)
- ``@deprecated`` decorator (#39205)
- ``is_authorized_custom_view`` from auth manager to handle custom actions (#39167)
- minischeduler skip (#38976)
- ``undici`` from 5.28.3 to 5.28.4 in /airflow/www (#38751)

Doc Only Changes
""""""""""""""""

- ``PythonOperator`` ``op_kwargs`` (#39242)
- ``user`` and ``role`` commands (#39224)
- k8s 1.29 to supported version in docs (#39168)
- ``DagBag`` class docstring to include all params (#38814)

Significant Changes
^^^^^^^^^^^^^^^^^^^
The following Listener API methods are considered stable and can be used in production systems (they were an experimental feature in older Airflow versions) (#36376):
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Lifecycle events:

- ``on_starting``
- ``before_stopping``

DagRun State Change Events:

- ``on_dag_run_running``
- ``on_dag_run_success``
- ``on_dag_run_failed``

TaskInstance State Change Events:

- ``on_task_instance_running``
- ``on_task_instance_success``
- ``on_task_instance_failed``

Support for Microsoft SQL-Server for Airflow Meta Database has been removed (#36514)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
After a `discussion <https://lists.apache.org/thread/r06j306hldg03g2my1pd4nyjxg78b3h4>`__
and a `voting process <https://lists.apache.org/thread/pgcgmhf6560k8jbsmz8nlyoxosvltph2>`__,
the Airflow PMC members and Committers have reached a resolution to no longer maintain MsSQL as a
supported Database Backend.
As of Airflow 2.9.0 support of MsSQL has been removed for Airflow Database Backend.
A migration script which can help migrate the database before upgrading to Airflow 2.9.0 is available in the
`airflow-mssql-migration repo on GitHub <https://github.com/apache/airflow-mssql-migration>`_.
Note that the migration script is provided without support and warranty.
This does not affect the existing provider packages (operators and hooks), DAGs can still access and process data from MsSQL.
Dataset URIs are now validated on input (#37005)
""""""""""""""""""""""""""""""""""""""""""""""""
Datasets must use a URI that conforms to the rules laid down in AIP-60, and the value
will be automatically normalized when the DAG file is parsed. See the
`documentation on Datasets <https://airflow.apache.org/docs/apache-airflow/2.9.0/authoring-and-scheduling/datasets.html>`_ for
a more detailed description of the rules.
You may need to change your Dataset identifiers if they look like a URI, but are used in a less mainstream way, such as relying on the URI's auth section, or have a case-sensitive protocol name.
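A hedged sketch of what this kind of validation and normalization involves (illustrative only, not Airflow's actual implementation; the rules shown are a subset of AIP-60):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_dataset_uri(uri):
    """Normalize a dataset URI: lower-case the scheme and host, and reject
    URIs that carry an auth section (illustrative subset of the AIP-60 rules)."""
    parts = urlsplit(uri)
    if parts.username or parts.password:
        raise ValueError("dataset URIs must not rely on the URI's auth section")
    netloc = parts.hostname or ""  # urlsplit already lower-cases the host
    if parts.port:
        netloc = f"{netloc}:{parts.port}"
    return urlunsplit((parts.scheme.lower(), netloc, parts.path, parts.query, parts.fragment))
```

This illustrates why identifiers that merely look like URIs can change meaning: case differences in the scheme or host are normalized away, and auth sections are rejected outright.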
The method get_permitted_menu_items in BaseAuthManager has been renamed to filter_permitted_menu_items (#37627)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Add REST API actions to Audit Log events (#37734)
"""""""""""""""""""""""""""""""""""""""""""""""""

The Audit Log event name for REST API events will be prepended with ``api.`` or ``ui.``, depending on whether it came from the Airflow UI or externally.
Official support for Python 3.12 (#38025)
"""""""""""""""""""""""""""""""""""""""""

There are a few caveats though:
- Pendulum 2 does not support Python 3.12. For Python 3.12 you need to use
  `Pendulum 3 <https://pendulum.eustace.io/blog/announcing-pendulum-3-0-0.html>`_
- The minimum SQLAlchemy version supported when Pandas is installed for Python 3.12 is 1.4.36, released in
  April 2022. Airflow 2.9.0 increases the minimum supported version of SQLAlchemy to 1.4.36 for all
  Python versions.
- Not all Providers support Python 3.12. At the initial release of Airflow 2.9.0 the following providers are released without support for Python 3.12:

  - ``apache.beam`` - pending on `Apache Beam support for 3.12 <https://github.com/apache/beam/issues/29149>`_
  - ``papermill`` - pending on releasing a Python 3.12 compatible papermill client version,
    including `this merged issue <https://github.com/nteract/papermill/pull/771>`_

Prevent large string objects from being stored in the Rendered Template Fields (#38094)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
There's now a limit to the length of data that can be stored in the Rendered Template Fields.
The limit is set to 4096 characters. If the data exceeds this limit, it will be truncated. You can change this limit
by setting the [core]max_template_field_length configuration option in your airflow config.
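The truncation rule can be sketched as follows (illustrative helper, not Airflow's exact code; the limit mirrors the 4096-character default of ``[core]max_template_field_length``):

```python
DEFAULT_MAX_TEMPLATE_FIELD_LENGTH = 4096  # default of [core]max_template_field_length

def truncate_rendered_field(value, max_length=DEFAULT_MAX_TEMPLATE_FIELD_LENGTH):
    """Return the rendered template value, truncated if it exceeds the limit."""
    if len(value) <= max_length:
        return value
    return value[:max_length]
```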
Change xcom table column value type to longblob for MySQL backend (#38401)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The Xcom table column ``value`` type has changed from ``blob`` to ``longblob``. This allows you to store relatively big data in Xcom, but the process can take a significant amount of time if you have a lot of large data stored in Xcom.
To downgrade from revision b4078ac230a1, ensure that you don't have Xcom values larger than 65,535 bytes. Otherwise, you'll need to clean those rows or run ``airflow db clean xcom`` to clean the Xcom table.
Stronger validation for key parameter defaults in taskflow context variables (#38015)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Because the taskflow implementation, in conjunction with context variable defaults, could generate invalid
parameter orders, taskflow functions defined with context key parameter defaults
other than ``None`` are no longer accepted (and are now validated). If you have done this before, you will most likely see a broken DAG and an error message like
``Error message: Context key parameter my_param can't have a default other than None``.
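The kind of check involved can be sketched with the standard library. Everything here is illustrative: the helper name is hypothetical and ``CONTEXT_KEYS`` is a made-up subset; Airflow's real validation differs in detail:

```python
import inspect

CONTEXT_KEYS = frozenset({"ti", "dag_run", "my_param"})  # illustrative subset only

def has_invalid_context_defaults(func, context_keys=CONTEXT_KEYS):
    """Return True if any context-key parameter has a default other than None."""
    for name, param in inspect.signature(func).parameters.items():
        if name in context_keys and param.default not in (inspect.Parameter.empty, None):
            return True
    return False
```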
New Features
""""""""""""

- Matomo as an option for ``analytics_tool``. (#38221)
- hashable (#37465)
- ``queuedEvent`` endpoint to get/delete ``DatasetDagRunQueue`` (#37176)
- ``DatasetOrTimeSchedule`` (#36710)
- ``on_skipped_callback`` to ``BaseOperator`` (#36374)
- ``@task.bash`` TaskFlow decorator (#30176, #37875)

Improvements
""""""""""""

- ``ExternalPythonOperator`` use version from ``sys.version_info`` (#38377)
- ``run_id`` column to log table (#37731)
- ``tryNumber`` to grid task instance tooltip (#37911)
- ``ExternalPythonOperator`` (#37409)
- Pathlike (#36947)
- ``nowait`` and ``skip_locked`` into ``with_row_locks`` (#36889)
- ``dag/dagRun`` in the REST API (#36641)
- Connexion from auth manager interface (#36209)

Bug Fixes
"""""""""

- ``total_entries`` count on the event logs endpoint (#38625)
- ``tz`` in next run ID info (#38482)
- chakra styles to keep dropdowns in filter bar (#38456)
- ``__exit__`` is called in decorator context managers (#38383)
- ``BaseAuthManager.is_authorized_custom_view`` abstract (#37915)
- ``/get_logs_with_metadata`` endpoint (#37756)
- encoding to the SQL engine in SQLAlchemy v2 (#37545)
- ``consuming_dags`` attr eagerly before dataset listener (#36247)

Miscellaneous
"""""""""""""

- ``importlib_metadata`` with compat to Python 3.10/3.12 stdlib (#38366)
- ``__new__`` magic method of ``BaseOperatorMeta`` to avoid bad mixing classic and decorated operators (#37937)
- ``sys.version_info`` for determine Python Major.Minor (#38372)
- ``blinker`` add where it requires (#38140)
- > 39.0.0 (#38112)
- ``assert`` outside of the tests (#37718)
- ``flask._request_ctx_stack`` (#37522)
- ``login`` attribute in ``airflow.__init__.py`` (#37565)
- ``datetime.datetime.utcnow`` by ``airflow.utils.timezone.utcnow`` in core (#35448)
- ``is_authorized_cluster_activity`` from auth manager (#36175)

Doc Only Changes
""""""""""""""""

- ``exception`` to templates ref list (#36656)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
"""""""""

- ``FixedTimezone`` (#38139)
- ``ObjectStoragePath`` (#37769)

Miscellaneous
"""""""""""""

- ``importlib_resources`` as it breaks ``pytest_rewrites`` (#38095, #38139)
- ``pandas`` to <2.2 (#37748)
- ``croniter`` to fix an issue with 29 Feb cron expressions (#38198)

Doc Only Changes
""""""""""""""""

Significant Changes
^^^^^^^^^^^^^^^^^^^

The smtp provider is now pre-installed when you install Airflow. (#37713)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Bug Fixes
"""""""""

Miscellaneous
"""""""""""""

- ``airflow_pre_installed_providers.txt`` artifact (#37679)

Doc Only Changes
""""""""""""""""

- ``BranchDayOfWeekOperator`` (#37813)
- ERD generating doc improvement (#37808)

Significant Changes
^^^^^^^^^^^^^^^^^^^
The allowed_deserialization_classes flag now follows a glob pattern (#36147).
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
For example, if one wants to add the class ``airflow.tests.custom_class`` to the
``allowed_deserialization_classes`` list, it can be done by writing the full class
name (``airflow.tests.custom_class``) or a pattern such as the ones used in glob
search (e.g., ``airflow.*``, ``airflow.tests.*``).
If you currently use a custom regexp path make sure to rewrite it as a glob pattern.
Alternatively, if you still wish to match it as a regexp pattern, add it under the new
list allowed_deserialization_classes_regexp instead.
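The glob matching behavior can be sketched with the standard library's ``fnmatch`` module (illustrative only; the helper name is hypothetical and Airflow's real implementation may differ):

```python
from fnmatch import fnmatch

def is_class_allowed(classname, allowed_patterns):
    """Match a full dotted class name against glob-style allow-list entries."""
    return any(fnmatch(classname, pattern) for pattern in allowed_patterns)
```

Note that a glob ``*`` matches across dots, so ``airflow.*`` covers any class anywhere under the ``airflow`` namespace.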
The audit_logs permissions have been updated for heightened security (#37501).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
This was done under the policy that we do not want users like Viewer, Ops, and other users apart from Admin to have access to audit_logs. The intention behind this change is to restrict users with fewer permissions from viewing user details like First Name, Email etc. from the audit_logs when they are not permitted to.
The impact of this change is that existing users with non-admin rights won't be able to view or access the audit_logs, either from the Browse tab or from the DAG run.
AirflowTimeoutError is no longer caught by default through Exception (#35653).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The AirflowTimeoutError now inherits from BaseException instead of
AirflowException -> Exception.
See https://docs.python.org/3/library/exceptions.html#exception-hierarchy
This prevents code catching Exception from accidentally
catching AirflowTimeoutError and continuing to run.
AirflowTimeoutError is an explicit intent to cancel the task, and should not
be caught in attempts to handle the error and return some default value.
Catching AirflowTimeoutError is still possible by explicitly excepting
AirflowTimeoutError or BaseException.
This is discouraged, as it may allow the code to continue running even after
such cancellation requests.
Code that previously depended on performing strict cleanup in every situation
after catching Exception is advised to use finally blocks or
context managers, so that only the cleanup is performed and the exception is
then automatically re-raised.
See similar considerations about catching KeyboardInterrupt in
https://docs.python.org/3/library/exceptions.html#KeyboardInterrupt
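The effect can be demonstrated with a stand-in class (``TimeoutSketch`` here is hypothetical; the real class is ``AirflowTimeoutError``):

```python
class TimeoutSketch(BaseException):
    """Stand-in for AirflowTimeoutError, which now subclasses BaseException."""

def broad_handler_swallows_timeout():
    """Return True if `except Exception` catches the timeout (the old behavior)."""
    try:
        try:
            raise TimeoutSketch("task timed out")
        except Exception:
            return True  # unreachable now: a BaseException subclass is not an Exception
    except BaseException:
        return False
```

Because the broad ``except Exception`` no longer matches, the timeout propagates past it, which is exactly the cancellation behavior described above.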
Bug Fixes
"""""""""

- ``IMPORT_ERROR`` from DAG related permissions to view related permissions (#37292)
- ``AirflowTaskTimeout`` to inherit ``BaseException`` (#35653)
- namedtuple (#37168)
- Treeview function (#37162)
- ``access_entity`` is specified (#37290)
- ``dateTimeAttrFormat`` constant (#37285)
- ``@Sentry.enrich_errors`` (#37002)
- dryrun auto-fetch (#36941)
- ``/variables`` endpoint (#36820)
- ``pendulum.from_timestamp`` usage (#37160)

Miscellaneous
"""""""""""""

- CLI instead of specific one (#37651)
- ``undici`` from 5.26.3 to 5.28.3 in /airflow/www (#37493)
- 3.12 exclusions in providers/pyproject.toml (#37404)
- markdown from core dependencies (#37396)
- ``pageSize`` method. (#37319)
- Python 3.11 and 3.12 deprecations (#37478)
- ``airflow_pre_installed_providers.txt`` into sdist distribution (#37388)
- ``universal-pathlib`` to < 0.2.0 (#37311)
- ``queue_when`` (#36997)
- config.yml for environment variable ``sql_alchemy_connect_args`` (#36526)
- Alembic to 1.13.1 (#36928)
- flask-session to <0.6 (#36895)

Doc Only Changes
""""""""""""""""

- CLI flags available (#37231)
- otel config descriptions (#37229)
- Objectstore tutorial with prereqs section (#36983)
- package/module names (#36927)
- ``__init__`` of operators automatically (#33786)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Target version for core dependency pendulum package set to 3 (#36281).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Support for pendulum 2.1.2 will be kept for a while, presumably until the next feature version of Airflow.
It is advised to upgrade user code to use pendulum 3 as soon as possible.
Pendulum 3 introduced some subtle incompatibilities that your code might rely on - for example
the default rendering of dates is missing the T separator in the rendered date representation, which is not ISO8601
compliant. If you rely on the default rendering of dates, you might need to adjust your code to use the
isoformat() method to render dates in ISO8601 format.
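The same distinction can be illustrated with the stdlib datetime type (used here only because it shows the analogous space-vs-T behaviour; pendulum itself is not imported in this sketch):

```python
from datetime import datetime

dt = datetime(2024, 1, 1, 12, 30)

# Default str() rendering uses a space separator, which is not ISO8601-compliant:
print(str(dt))         # 2024-01-01 12:30:00
# isoformat() renders with the "T" separator, as ISO8601 requires:
print(dt.isoformat())  # 2024-01-01T12:30:00
```

If templates or downstream parsers expect the "T" form, call isoformat() explicitly rather than relying on default rendering.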
Airflow packaging specification follows modern Python packaging standards (#36537).
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
We standardized Airflow dependency configuration to follow the latest developments in Python packaging by
using pyproject.toml. Airflow is now compliant with these accepted PEPs:
- `PEP-440 Version Identification and Dependency Specification <https://www.python.org/dev/peps/pep-0440/>`__
- `PEP-517 A build-system independent format for source trees <https://www.python.org/dev/peps/pep-0517/>`__
- `PEP-518 Specifying Minimum Build System Requirements for Python Projects <https://www.python.org/dev/peps/pep-0518/>`__
- `PEP-561 Distributing and Packaging Type Information <https://www.python.org/dev/peps/pep-0561/>`__
- `PEP-621 Storing project metadata in pyproject.toml <https://www.python.org/dev/peps/pep-0621/>`__
- `PEP-660 Editable installs for pyproject.toml based builds (wheel based) <https://www.python.org/dev/peps/pep-0660/>`__
- `PEP-685 Comparison of extra names for optional distribution dependencies <https://www.python.org/dev/peps/pep-0685/>`__

We also implement multiple license files support coming from a draft, not yet accepted (but supported by hatchling) PEP:

- `PEP 639 Improving License Clarity with Better Package Metadata <https://peps.python.org/pep-0639/>`__

This has almost no noticeable impact on users who are using modern Python packaging and development tools. Generally
speaking, Airflow should behave as it did before when installing it from PyPI, and it should be much easier to install
it for development purposes using pip install -e ".[devel]".
The differences from the user side are:
- Extras now use - (following PEP-685) instead of _ and .
  (as it was before in some extras). When you install airflow with such extras (for example dbt.core or
  all_dbs) you should use - instead of _ and .. In most modern tools this will work in a backwards-compatible way, but in some old versions of those tools you might need to
  replace _ and . with -. You can also get warnings that the extra you are installing does not exist - usually
  this warning is harmless and the extra is installed anyway. It is, however, recommended to switch to - in extras in your dependency
  specifications for all Airflow extras.
- The released airflow package does not contain the devel, devel-*, doc and docs-gen extras.
  Those extras are only available when you install Airflow from sources in --editable mode. This is
  because those extras are only used for development and documentation building purposes and are not needed
  when you install Airflow for production use. Those dependencies had unspecified and varying behaviour for
  released packages anyway and you were not supposed to use them in released packages.
- The all and all-* extras were not always working correctly when installing Airflow using constraints,
  because they were also considered development-only dependencies. With this change, those dependencies
  now properly handle constraints and will install correctly with constraints, pulling the right set
  of providers and dependencies when constraints are used.
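The PEP-685 extra-name normalization mentioned above can be sketched in a few lines (an illustrative helper, not part of Airflow or pip; the function name is hypothetical):

```python
import re


def normalize_extra(name: str) -> str:
    # PEP 685: lowercase the name and collapse runs of "-", "_" and "."
    # into a single "-", so variant spellings compare equal.
    return re.sub(r"[-_.]+", "-", name).lower()


print(normalize_extra("dbt.core"))  # -> dbt-core
print(normalize_extra("all_dbs"))   # -> all-dbs
```

This is why pip install "apache-airflow[dbt-core]" is the recommended spelling, while older tools may still accept dbt.core with a warning.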
Graphviz dependency is now an optional one, not a required one (#36647).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The graphviz dependency has been problematic as a required Airflow dependency - especially for
ARM-based installations. Graphviz packages require binary graphviz libraries - which is already a
limitation - but they also require the graphviz Python bindings to be built and installed.
This does not work for older Linux installations and - more importantly - when you try to install
the graphviz bindings for Python 3.8 or 3.9 on ARM M1 MacBooks, the packages fail to install because
Python bindings compilation for M1 can only work for Python 3.10+.
Technically, this is not a breaking change - the CLI to render the DAGs is still there, and if you already have graphviz installed, it will continue working as it did before. The only case where it does not work is when you do not have graphviz installed: Airflow will raise an error and inform you that you need it.
Graphviz will remain installed for most users:
The only change affects fresh installations of new versions of Airflow, where graphviz will need to be specified as an extra or installed separately in order to enable DAG rendering.
Bug Fixes
"""""""""

- taskinstance list (#36693)
- AUTH_ROLE_PUBLIC=admin (#36750)
- op subtypes (#35536)
- typing.Union in _infer_multiple_outputs for Python 3.10+ (#36728)
- multiple_outputs is inferred correctly even when using TypedDict (#36652)
- Dagrun.update_state (#36712)
- EventsTimetable schedule past events if catchup=False (#36134)
- tis_query in _process_executor_events (#36655)
- call_regular_interval (#36608)
- DagRun fails while running dag test (#36517)
- _manage_executor_state by refreshing TIs in batch (#36502)
- MAX_CONTENT_LENGTH (#36401)
- kubernetes decorator type annotation consistent with operator (#36405)
- api/dag/*/dagrun from anonymous user (#36275)
- DAG.is_fixed_time_schedule (#36370)

Miscellaneous
"""""""""""""

- httpx import in file_task_handler for performance (#36753)
- hatchling build backend (#36537)
- pyarrow-hotfix for CVE-2023-47248 (#36697)
- graphviz dependency optional (#36647)
- pandas dependency to 1.2.5 for all providers and airflow (#36698)
- /airflow/www (#36700)
- docker decorator type annotations (#36406)
- batch_is_authorized_dag to check if user has permission to read DAGs (#36279)

Doc Only Changes
""""""""""""""""

- numpy example with practical exercise demonstrating top-level code (#35097)
- dags.rst with information on DAG pausing (#36540)
- metrics.rst for param dagrun.schedule_delay (#36404)

Significant Changes
^^^^^^^^^^^^^^^^^^^

Raw HTML code in DAG docs and DAG params descriptions is disabled by default (#35460)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
To ensure that no malicious javascript can be injected via DAG descriptions or trigger UI forms by DAG authors,
a new parameter webserver.allow_raw_html_descriptions was added with a default value of False.
If you trust your DAG authors' code and want to allow using raw HTML in DAG descriptions and params, you can restore the previous
behavior by setting the configuration value to True.
To ensure Airflow is secure by default, the raw HTML support in the trigger UI has been superseded by markdown support via
the description_md attribute. If you have been using description_html, please migrate to description_md.
The custom_html_form is now deprecated.
New Features
""""""""""""

- AIP-58 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-58+milestone%3A%22Airflow+2.8.0%22>_)
- prev_end_date_success method access (#34528)
- List Task Instances view (#34529)
- clear_number to track DAG run being cleared (#34126)

Improvements
""""""""""""

- multiselect to run state in grid view (#35403)
- Connection.get_hook in case of ImportError (#36005)
- taskinstance (#35810)
- AIRFLOW_CONFIG path (#35818)
- JSON-string connection representation generator (#35723)
- BaseOperatorLink into the separate module (#35032)
- cbreak in execute_interactive and handle SIGINT (#35602)
- synchronize_log_template function (#35366)
- BaseOperatorLink.operators (#35003)
- SA2-compatible syntax for TaskReschedule (#33720)
- EventScheduler (#34808)
- update_forward_refs (#34657)
- Dataset from airflow package in codebase (#34610)
- airflow.datasets.Dataset in examples and tests (#34605)
- version top-level element from docker compose files (#33831)
- NOT EXISTS subquery instead of tuple_not_in_condition (#33527)
- triggerer_heartbeat (#33320)
- airflow variables export to print to stdout (#33279)

Bug Fixes
"""""""""

- reset_user_sessions to work from either CLI or web (#36056)
- overscroll behaviour to auto (#35717)
- borderWidthRight to grid for Firefox scrollbar (#35346)
- processor_subdir in serialized_dag table (#35661)
- get_dag_by_pickle util function (#35339)
- mappedoperator (#35257)
- Literal from typing_extensions (#33794)

Miscellaneous
"""""""""""""

- 4.3.10 (#35991)
- Connection.to_json_dict to Connection.to_dict (#35894)
- moto version to >= 4.2.9 (#35687)
- pyarrow-hotfix to mitigate CVE-2023-47248 (#35650)
- axios from 0.26.0 to 1.6.0 in /airflow/www/ (#35624)
- navbar_text_color and rm condition in style (#35553)
- dag_next_execution (#35539)
- TCH004 and TCH005 rules (#35475)
- AirflowException from airflow (#34541)
- postcss from 8.4.25 to 8.4.31 in /airflow/www (#34770)
- airflow.models.dag.DAG in examples (#34617)

Doc Only Changes
""""""""""""""""

- re2 regex engine in the .airflowignore documentation. (#35663)
- best-practices.rst (#35692)
- dag-run.rst to mention Airflow's support for extended cron syntax through croniter (#35342)
- webserver.rst to include information of supported OAuth2 providers (#35237)
- rst code block format (#34708)

Significant Changes
^^^^^^^^^^^^^^^^^^^

No significant changes.
Bug Fixes
"""""""""

- codemirror and extra (#35122)
- get_plugin_info for class based listeners. (#35022)
- all_skipped trigger rule as skipped if any task is in upstream_failed state (#34392)

Misc/Internal
"""""""""""""

- pendulum requirement to <3.0 (#35336)
- sentry_sdk to 1.33.0 (#35298)
- @babel/traverse from 7.16.0 to 7.23.2 in /airflow/www (#34988)
- undici from 5.19.1 to 5.26.3 in /airflow/www (#34971)
- SchedulerJobRunner (#34810)
- max_tis per query > parallelism (#34742)
- connexion<3.0 upper bound (#35218)
- < 3.12 (#35123)
- 3.1.0 (#34943)

Doc Only Changes
""""""""""""""""

- conn.extras (#35165)
- mysql-connector-python from recommended MySQL driver (#34287)
- set_downstream example (#35075)
- airflow_local_settings.py template (#34826)
- '>' in provider section name (#34813)

Significant Changes
^^^^^^^^^^^^^^^^^^^

No significant changes.
Bug Fixes
"""""""""

- taskgroup is mapped (#34587)
- cluster_activity view not loading due to standaloneDagProcessor templating (#34274)
- loglevel=DEBUG in 'Not syncing DAG-level permissions' (#34268)
- access_control={} (#34114)
- ab_user table in the CLI session (#34120)
- next_run_datasets_summary endpoint (#34143)
- _run_task_session in mapped render_template_fields (#33309)
- version_added (#34011)

Doc Only Changes
""""""""""""""""

- AUTH_REMOTE_USER from FAB in WSGI middleware example (#34721)

Misc/Internal
"""""""""""""

- astroid version < 3 (#34658)
- os.path.splitext to Path.* (#34352, #33669)
- pyproject.toml (#34014)
- isinstance in fab_security manager (#33760)
- isinstance calls for the same object in a single call (#33767)
- str.splitlines() to split lines (#33592)
- len() (#33454)

Significant Changes
^^^^^^^^^^^^^^^^^^^

CronTriggerTimetable is now less aggressive when trying to skip a run (#33404)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
When setting catchup=False, CronTriggerTimetable no longer skips a run if
the scheduler does not query the timetable immediately after the previous run
has been triggered.
This should not affect scheduling in most cases, but it can change the behaviour if
a DAG is paused and unpaused to manually skip a run. Previously, the timetable (with
catchup=False) would only start a run after the DAG was unpaused, but with this
change the scheduler will try to look a little bit back to schedule the
previous run that covers a part of the period when the DAG was paused. This
means you will need to keep a DAG paused longer (namely, for the entire cron
period to pass) to really skip a run.
Note that this is also the behaviour exhibited by various other cron-based
scheduling tools, such as anacron.
conf.set() becomes case insensitive to match conf.get() behavior (#33452)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Also, conf.get() will now break if used with non-string parameters.
conf.set(section, key, value) used to be case sensitive, i.e. conf.set("SECTION", "KEY", value)
and conf.set("section", "key", value) were stored as two distinct configurations.
This was inconsistent with the behavior of conf.get(section, key), which was always converting the section and key to lower case.
As a result, configuration options set with upper case characters in the section or key were unreachable.
That's why we are now converting section and key to lower case in conf.set too.
We also changed the behavior of conf.get() slightly. It used to allow objects that are not strings as the section or key.
Doing this will now result in an exception. For instance, conf.get("section", 123) needs to be replaced with conf.get("section", "123").
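The new behaviour can be sketched with a minimal case-insensitive store (an illustrative model of the documented semantics, not Airflow's actual AirflowConfigParser):

```python
class CaseInsensitiveConf:
    """Sketch: section and key are lowercased on both set and get,
    and get() rejects non-string section/key arguments."""

    def __init__(self):
        self._store = {}

    def set(self, section: str, key: str, value):
        # Lowercasing here is the 2.7.1 change: set() now matches get().
        self._store[(section.lower(), key.lower())] = value

    def get(self, section: str, key: str):
        if not isinstance(section, str) or not isinstance(key, str):
            raise TypeError("section and key must be strings, e.g. get('section', '123')")
        return self._store[(section.lower(), key.lower())]
```

With this model, set("SECTION", "KEY", v) and set("section", "key", v) refer to the same option, instead of two distinct ones.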
Bug Fixes
"""""""""

- MappedTaskGroup tasks not respecting upstream dependency (#33732)
- SECURITY_MANAGER_CLASS should be a reference to class, not a string (#33690)
- get_url_for_login in security manager (#33660)
- 2.7.0 db migration job errors (#33652)
- groupby in TIS duration calculation (#33535)
- dialect.name in custom SA types (#33503)
- end_date is less than utcnow (#33488)
- formatDuration method (#33486)
- conf.set case insensitive (#33452)
- soft_fail argument when poke is called (#33401)
- processor_subdir (#33357)
- text in Provider's view (#33326)
- soft_fail argument when ExternalTaskSensor runs in deferrable mode (#33196)
- expand_kwargs method (#32272)

Misc/Internal
"""""""""""""

- Pydantic 1 compatibility (#34081, #33998)
- Pydantic 2 (#33956)
- devel_only extra in Airflow's setup.py (#33907)
- FAB to 4.3.4 in order to fix issues with filters (#33931)
- sqlalchemy to 1.4.24 (#33892)
- OrderedDict with plain dict (#33508)
- Pydantic warning about orm_mode rename (#33220)
- Pydantic limitation for version < 2 (#33507)

Doc only changes
""""""""""""""""

Significant Changes
^^^^^^^^^^^^^^^^^^^

Remove Python 3.7 support (#30963)
""""""""""""""""""""""""""""""""""
As of now, Python 3.7 is no longer supported by the Python community. Therefore, to use Airflow 2.7.0, you must ensure your Python version is 3.8, 3.9, 3.10, or 3.11.
Old Graph View is removed (#32958)
""""""""""""""""""""""""""""""""""
The old Graph View is removed. The new Graph View is now the default view.
The trigger UI form is skipped in web UI if no parameters are defined in a DAG (#33351)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
If you are using the dag_run.conf dictionary and the web UI JSON entry to run your DAG you should either:

- Add params to your DAG <https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/params.html#use-params-to-provide-a-trigger-ui-form>_
- Enable show_trigger_form_if_no_params to bring back the old behaviour

The "db init", "db upgrade" commands and "[database] load_default_connections" configuration options are deprecated (#33136).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Instead, you should use the "airflow db migrate" command to create or upgrade the database. This command will not create default connections. In order to create default connections you need to run "airflow connections create-default-connections" explicitly, after running "airflow db migrate".
In case of SMTP SSL connection, the context now uses the "default" context (#33070)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The "default" context is Python's default_ssl_context instead of the previously used "none". The
default_ssl_context provides a balance between security and compatibility, but in some cases,
when certificates are old, self-signed or misconfigured, it might not work. This can be configured
by setting "ssl_context" in the "email" configuration of Airflow.
Setting it to "none" brings back the "none" setting that was used in Airflow 2.6 and before, but it is not recommended for security reasons, as this setting disables validation of certificates and allows MITM attacks.
Disable default allowing the testing of connections in UI, API and CLI (#32052)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
For security reasons, the test connection functionality is disabled by default across Airflow UI,
API and CLI. The availability of the functionality can be controlled by the
test_connection flag in the core section of the Airflow
configuration (airflow.cfg). It can also be controlled by the
environment variable AIRFLOW__CORE__TEST_CONNECTION.
The following values are accepted for this config param:

1. Disabled: Disables the test connection functionality and disables the Test Connection button in the UI. This is also the default value set in the Airflow configuration.
2. Enabled: Enables the test connection functionality and activates the Test Connection button in the UI.
3. Hidden: Disables the test connection functionality and hides the Test Connection button in the UI.

For more information on capabilities of users, see the documentation:
https://airflow.apache.org/docs/apache-airflow/stable/security/security_model.html#capabilities-of-authenticated-ui-users
It is strongly advised to not enable the feature until you make sure that only highly trusted UI/API users have "edit connection" permissions.
The xcomEntries API disables support for the deserialize flag by default (#32176)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
For security reasons, the /dags/*/dagRuns/*/taskInstances/*/xcomEntries/*
API endpoint now disables the deserialize option to deserialize arbitrary
XCom values in the webserver. For backward compatibility, server admins may set
the [api] enable_xcom_deserialize_support config to True to enable the
flag and restore backward compatibility.
However, it is strongly advised to not enable the feature, and perform deserialization at the client side instead.
Change of the default Celery application name (#32526)
""""""""""""""""""""""""""""""""""""""""""""""""""""""
Default name of the Celery application changed from airflow.executors.celery_executor to airflow.providers.celery.executors.celery_executor.
You should change both your configuration and Health check command to use the new name:
- For the configuration (celery_app_name in the celery section) use airflow.providers.celery.executors.celery_executor
- For the Health check command use airflow.providers.celery.executors.celery_executor.app

The default value for scheduler.max_tis_per_query is changed from 512 to 16 (#32572)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
This change is expected to make the Scheduler more responsive.
scheduler.max_tis_per_query needs to be lower than core.parallelism.
If both were left to their default value previously, the effective default value of scheduler.max_tis_per_query was 32
(because it was capped at core.parallelism).
To keep the behavior as close as possible to the old config, one can set scheduler.max_tis_per_query = 0,
in which case it'll always use the value of core.parallelism.
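The capping rule above can be sketched as a small helper (an illustrative sketch of the documented behaviour, not Airflow's scheduler code; the function name is hypothetical):

```python
def effective_max_tis_per_query(max_tis_per_query: int, parallelism: int) -> int:
    # Sketch of the documented behaviour: 0 means "use core.parallelism",
    # otherwise the configured value is capped at core.parallelism.
    if max_tis_per_query == 0:
        return parallelism
    return min(max_tis_per_query, parallelism)


# Old defaults: 512 capped at parallelism 32 -> effective value was 32.
print(effective_max_tis_per_query(512, 32))  # 32
# New default of 16 is below the cap, so it applies as-is.
print(effective_max_tis_per_query(16, 32))   # 16
```

Setting scheduler.max_tis_per_query = 0 therefore reproduces the old effective behaviour of always tracking core.parallelism.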
Some executors have been moved to corresponding providers (#32767)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
In order to use the executors, you need to install the providers:
- apache-airflow-providers-celery package >= 3.3.0
- apache-airflow-providers-cncf-kubernetes package >= 7.4.0
- apache-airflow-providers-daskexecutor package in any version

You can also achieve it by installing airflow with the [celery], [cncf.kubernetes], [daskexecutor] extras respectively.
Users who base their images on the apache/airflow reference image (not slim) should be unaffected - the base
reference image comes with all three providers installed.
Improvement Changes
^^^^^^^^^^^^^^^^^^^

PostgreSQL only improvement: Added index on taskinstance table (#30762)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
This index seems to have a great positive effect in setups with tens of millions of such rows.
New Features
""""""""""""

- AIP-49 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-49+milestone%3A%22Airflow+2.7.0%22>_)
- AIP-51 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-51+milestone%3A%22Airflow+2.7.0%22>_)
- AIP-52 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-52+milestone%3A%22Airflow+2.7.0%22>_)
- AIP-53 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+milestone%3A%22Airflow+2.7.0%22+label%3Aprovider%3Aopenlineage>_)
- BranchExternalPythonOperator (#32787, #33360)
- Per-LocalTaskJob Configuration (#32313)
- AirflowClusterPolicySkipDag exception (#32013)
- reactflow for datasets graph (#31775)
- chain which doesn't require matched lists (#31927)
- --retry and --retry-delay to airflow db check (#31836)
- section query param in get config rest API (#30936)
- Scheduled->Queued->Running task state transition times (#30612)

Improvements
""""""""""""

- db upgrade to db migrate and add connections create-default-connections (#32810, #33136)
- <= parallelism (#32572)
- isdisjoint instead of not intersection (#32616)
- dag_processor status. (#32382)
- [triggers.running] (#32050)
- TriggerDagRunOperator: Add wait_for_completion to template_fields (#31122)
- PythonVirtualenvOperator termination log in alert (#31747)
- airflow db commands to SQLAlchemy 2.0 style (#31486)
- validators into their own modules (#30802)
- get_log api (#30729)

Bug Fixes
"""""""""

- Gantt chart: Use earliest/oldest ti dates if different than dag run start/end (#33215)
- virtualenv detection for Python virtualenv operator (#33223)
- chmod airflow.cfg (#33118)
- max_active_runs reached its upper limit. (#31414)
- get_task_instances query (#33054)
- $ref (#32887)
- PythonOperator sub-classes extend its decorator (#32845)
- virtualenv is installed in PythonVirtualenvOperator (#32939)
- __iter__ in is_container() (#32850)
- dagRunTimeout (#32565)
- /blocked endpoint (#32571)
- cli.dags.trigger command output (#32548)
- whitespaces from airflow connections form (#32292)
- readonly property in our API (#32510)
- resizer would not expanse grid view (#31581)
- type_ arg to drop_constraint (#31306)
- drop_constraint call in migrations (#31302)
- requirepass redis sentinel (#30352)
- /config (#31057)

Misc/Internal
"""""""""""""

- dag_processing (#33161)
- Pydantic to < 2.0.0 (#33235)
- cncf.kubernetes provider (#32767, #32891)
- pydocstyle check - core Airflow only (#31297)
- 1.2.3 to 1.2.4 in /airflow/www (#32680)
- 6.3.0 to 6.3.1 in /airflow/www (#32506)
- 4.18.0 (#32445)
- stylelint from 13.13.1 to 15.10.1 in /airflow/www (#32435)
- 4.0.0 to 4.1.3 in /airflow/www (#32443)
- Pydantic 2 (#32366)
- enums (#31735)
- 0.272 (#31966)
- asynctest (#31664)
- 2.0 style (#31569, #31772, #32350, #32339, #32474, #32645)
- 3.7 support (#30963)
- 0.0.262 (#30809)
- 1.2.0 (#30687)

Docs only changes
"""""""""""""""""

- DAGRun / DAG / Task in templates-ref.rst (#33013)

Significant Changes
^^^^^^^^^^^^^^^^^^^

Default allowed pattern of a run_id has been changed to ^[A-Za-z0-9_.~:+-]+$ (#32293).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Previously, there was no validation of the run_id string. There is now a validation regex that
can be set by configuring allowed_run_id_pattern in the scheduler section.
Bug Fixes
"""""""""

- DagRun.run_id and allow flexibility (#32293)
- executor_class from Job - fixing backfill for custom executors (#32219)
- mapIndex to display extra links per mapped task. (#32154)
- re2 for matching untrusted regex (#32060)
- dag_dependencies in serialized dag (#32037)
- None if an XComArg fails to resolve in a multiple_outputs Task (#32027)
- rendered-templates when map index is not found (#32011)
- ExternalTaskSensor when there is no task group TIs for the current execution date (#32009)
- operator_extra_links property serialization in mapped tasks (#31904)
- xcom_pull and inlets (#31128)
- on_failure_callback is not invoked when task failed during testing dag. (#30965)
- ExternalPythonOperator and debug logging level (#30367)

Misc/Internal
"""""""""""""

- task.sensor annotation in type stub (#31954)
- Pydantic to < 2.0.0 until we solve 2.0.0 incompatibilities (#32312)
- Pydantic 2 pickiness about model definition (#32307)

Doc only changes
""""""""""""""""

- ™ to Airflow in prominent places (#31977)

Significant Changes
^^^^^^^^^^^^^^^^^^^

No significant changes.
Bug Fixes
^^^^^^^^^

- map_index to the xcom key when skipping downstream tasks (#31541)
- /health endpoint (#31529)
- max_active_tis_per_dagrun for Dynamic Task Mapping (#31406)
- url_for_asset fallback and 404 on DAG Audit Log (#31233)
- default_args in nested task groups (#31608)
- [secrets] backend_kwargs as a sensitive config (#31788)

Misc/Internal
"""""""""""""

Doc only changes
^^^^^^^^^^^^^^^^

Significant Changes
^^^^^^^^^^^^^^^^^^^

Clarifications of the external Health Check mechanism and using Job classes (#31277).
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
In the past, SchedulerJob and other *Job classes were known to have been used to perform
external health checks for Airflow components. Those are, however, Airflow DB ORM related classes.
The DB models and database structure of Airflow are considered an internal implementation detail, following the
public interface <https://airflow.apache.org/docs/apache-airflow/stable/public-airflow-interface.html>_.
Therefore, they should not be used for external health checks. Instead, you should use the
airflow jobs check CLI command (introduced in Airflow 2.1) for that purpose.
Bug Fixes
^^^^^^^^^

- job_type column (#31182)
- api_client_retry_configuration (#31174)
- interleave_timestamp_parser config to the logging section (#31102)
- MappedTaskGroup import in taskinstance file (#31100)
- apache-hive extra so it installs the correct package (#31068)
- airflow providers get command output (#30978)
- pandas.DataFrame (#30943)
- order_by request in list DAG rest api (#30926)
- state and start_date from being reset when clearing a task in a running DagRun (#30125)

Misc/Internal
"""""""""""""

Doc only changes
""""""""""""""""

- dag_processing.processes metric (#30891)
- config directory in docker compose (#30662)
- version_added config field for might_contain_dag and metrics_allow_list (#30969)

Significant Changes
^^^^^^^^^^^^^^^^^^^

Default permissions of file task handler log directories and files have been changed to "owner + group" writeable (#29506).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The default setting handles the case where impersonation is needed and both users (airflow and the impersonated user)
have the same group set as their main group. Previously the default was also other-writeable; users may still
choose the other-writeable setting if they wish by configuring file_task_handler_new_folder_permissions
and file_task_handler_new_file_permissions in the logging section.
SLA callbacks no longer add files to the dag processor manager's queue (#30076)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
This stops SLA callbacks from keeping the dag processor manager permanently busy. It means reduced CPU usage, and fixes issues where SLAs stopped the system from seeing changes to existing dag files. Additional metrics were added to help track queue state.
The cleanup() method in BaseTrigger is now defined as asynchronous (following async/await) pattern (#30152).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
This is potentially a breaking change for any custom trigger implementations that override the cleanup()
method and use synchronous code. However, using synchronous operations in cleanup was technically wrong,
because the method was executed in the main loop of the Triggerer and it introduced unnecessary delays
impacting other triggers. The change is unlikely to affect any existing trigger implementations.
The gauge scheduler.tasks.running no longer exists (#30374)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The gauge never worked and its value has always been 0. Having an accurate
value for this metric is complex, so it has been decided that removing this gauge makes
more sense than fixing it with no certainty of the correctness of its value.
Consolidate handling of tasks stuck in queued under new task_queued_timeout config (#30375)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Logic for handling tasks stuck in the queued state has been consolidated, and all configurations
responsible for timing out stuck queued tasks have been deprecated and merged into
[scheduler] task_queued_timeout. The configurations that have been deprecated are
[kubernetes] worker_pods_pending_timeout, [celery] stalled_task_timeout, and
[celery] task_adoption_timeout. If any of these configurations are set, the longest timeout will be
respected. For example, if [celery] stalled_task_timeout is 1200, and [scheduler] task_queued_timeout
is 600, Airflow will set [scheduler] task_queued_timeout to 1200.
Improvement Changes
^^^^^^^^^^^^^^^^^^^

Display only the running configuration in configurations view (#28892)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The configurations view now only displays the running configuration. Previously, the default configuration
was displayed at the top but it was not obvious whether this default configuration was overridden or not.
Consequently, the non-documented endpoint /configuration?raw=true is deprecated and will be removed in
Airflow 3.0. The HTTP response now returns an additional Deprecation header. The /config endpoint on
the REST API is the standard way to fetch Airflow configuration programmatically.
Explicit skipped states list for ExternalTaskSensor (#29933)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
ExternalTaskSensor now has an explicit skipped_states list.
Miscellaneous Changes
^^^^^^^^^^^^^^^^^^^^^

Handle OverflowError on exponential backoff in next_run_calculation (#28172)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The maximum task retry delay is set to 24h (86400s) by default. You can change it globally via the core.max_task_retry_delay
parameter.
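A sketch of capped exponential backoff with OverflowError handled as "use the cap" (an illustrative model of the documented fix, not Airflow's actual next_run_calculation code; the function name is hypothetical):

```python
def next_retry_delay(base_delay: float, try_number: int,
                     max_delay: float = 86400.0) -> float:
    # Sketch: exponential backoff capped at the 24h (86400s) default.
    # Very large exponents can overflow float arithmetic; treat that
    # as "use the cap" instead of crashing, per the OverflowError fix.
    try:
        delay = base_delay * (2 ** try_number)
    except OverflowError:
        return max_delay
    return min(delay, max_delay)


print(next_retry_delay(1.0, 5))      # 32.0
print(next_retry_delay(300.0, 10))   # capped at 86400.0
```

An extreme try_number (e.g. 10000) would overflow the float multiplication, and the helper falls back to the configured cap rather than raising.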
Move Hive macros to the provider (#28538)
"""""""""""""""""""""""""""""""""""""""""
The Hive macros (hive.max_partition, hive.closest_ds_partition) are available only when the Hive provider is
installed. Please install Hive Provider > 5.1.0 when using those macros.
Updated app to support configuring the caching hash method for FIPS v2 (#30675)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Various updates for FIPS-compliance when running Airflow in Python 3.9+. This includes a new webserver option, caching_hash_method,
for changing the default flask caching method.
New Features
^^^^^^^^^^^^

- max_active_tis_per_dagrun for Dynamic Task Mapping (#29094)
- TriggerDagRunOperator (#30292)
- Blocklist to disable specific metric tags or metric names (#29881)
- check_migrations config (#29714)
- cli.dags.trigger (#29224)
- db export-archived command. (#29485)
- airflow db drop-archived command (#29309)
- FileTrigger (#29265)
- connections import CLI command (#28738)

Improvements
""""""""""""

- AIP-51 <https://github.com/apache/airflow/pulls?q=is%3Apr+is%3Amerged+label%3AAIP-51+milestone%3A%22Airflow+2.6.0%22>__)
- UX in grid view (#30373)
- select() to new style (#30515)
- metrics_*_list (#30174)
- on_*_callback/sla_miss_callbacks (#28469)
- renamed and previous_name in config sections (#28324)
- triggerer status (#27755)

Bug Fixes
"""""""""

- too old resource version exception by retrieving the latest resource_version (#30425)
- TriggerDagRunOperator with deferrable parameter (#30406)
- example_sensor_decorator DAG (#30513)

Misc/Internal
"""""""""""""

- skip_exit_code in BashOperator (#30734)
- scheduler.tasks.running (#30374)
- /airflow/www (#30568)
- /airflow/www (#30319)
- /airflow/www (#30316)
- dag.fileloc instead of dag.full_filepath in exception message (#30610)
- importlib-metadata backport to < 5.0.0 (#29924)
- importlib.metadata to get Version for speed (#29723)
- db export-cleaned to db export-archived (#29450)
- freezegun with time-machine (#28193)
- airflow/kubernetes/* (#28212)

Doc only changes
""""""""""""""""

- ``audit_logs.rst`` (#30405)
- ``*_lookup_pattern`` parameters (#29580)

Significant Changes
^^^^^^^^^^^^^^^^^^^
No significant changes.
Bug Fixes
^^^^^^^^^
- ``dag.partial_subset`` doesn't mutate task group properties (#30129)
- ``airflow dags next-execution`` CLI command (#30117)
- ``TriggerRuleDep`` when the mapped tasks count is 0 (#30084)

Misc/Internal
^^^^^^^^^^^^^
Doc only changes
^^^^^^^^^^^^^^^^
Significant Changes
^^^^^^^^^^^^^^^^^^^
The date-time fields passed as API parameters or Params should be RFC3339-compliant (#29395)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
In API calls, it was previously possible for a "+" passed as part of a date-time field to escape URL-encoding,
and such date-time fields could still pass validation. Such date-time parameters must now be URL-encoded (as ``%2B``).
For Params, ISO 8601-compliant date-times are still allowed (so, for example, a space may be used instead of ``T``
to separate the date from the time, and the timezone may be omitted), but a deprecation warning is raised.
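As an illustration (standard library only; the endpoint path and parameter name below are hypothetical), a ``+`` in a timezone offset must be percent-encoded before being placed in a query string, otherwise the server decodes it as a space:

```python
from urllib.parse import parse_qs, quote, urlsplit

# A timezone-aware timestamp: "+" must become "%2B" in a URL.
ts = "2023-02-18T12:00:00+00:00"
encoded = quote(ts, safe="")  # every reserved character percent-encoded
url = f"/api/v1/dags/my_dag/dagRuns?start_date_gte={encoded}"

# Round-trip check: decoding the query string restores the original value.
value = parse_qs(urlsplit(url).query)["start_date_gte"][0]
assert value == ts
```

Without the encoding, ``parse_qs`` (like most servers) would have turned the ``+`` into a space and the timestamp would no longer be RFC3339-compliant.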
Default for [webserver] expose_hostname changed to False (#29547)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The default for ``[webserver] expose_hostname`` has been changed to ``False``, instead of ``True``. This means administrators must opt in to expose webserver hostnames to end users.
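Administrators who still want the hostname shown can opt back in explicitly in ``airflow.cfg``:

```ini
[webserver]
# Opt in to exposing the webserver hostname to end users (default is now False)
expose_hostname = True
```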
Bug Fixes
^^^^^^^^^
- /dagRuns API should 404 if dag not active (#29860)
- OpenAPI spec responses by adding additional return type (#29600)
- ``prev_logical_date`` variable offset-aware (#29454)
- ``EdgeModifier`` refactoring w/ labels in ``TaskGroup`` edge case (#29410)
- ``airflow connections add`` (#28922)

Misc/Internal
^^^^^^^^^^^^^
- ``undici`` from 5.9.1 to 5.19.1 (#29583)
- v67.2.0 (#29465)
- ``ua-parser-js`` from 0.7.31 to 0.7.33 in /airflow/www (#29172)
- ``pytest`` (#29086)
- ``run_id`` URL param when linking to graph/gantt views (#29066)
- ``python_callable`` (#28932)
- ``swagger-ui-dist`` from 3.52.0 to 4.1.3 in /airflow/www (#28824)
- ``importlib-metadata`` backport to < 5.0.0 (#29924, #30069)

Doc only changes
^^^^^^^^^^^^^^^^
- ``merge_data()`` task (#29158)
- ``notes`` param from ``TriggerDagRunOperator`` docstring (#29298)
- ``schedule`` param rather than ``timetable`` in Timetables docs (#29255)

Significant Changes
^^^^^^^^^^^^^^^^^^^
Trigger gevent monkeypatching via environment variable (#28283)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
If you are using gevent for your webserver deployment and used local settings to monkeypatch gevent,
you might want to replace local settings patching with an ``_AIRFLOW_PATCH_GEVENT`` environment variable
set to ``1`` in your webserver. This ensures gevent patching is done as early as possible.
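For example, in the environment of the webserver process (sketch; how you set the variable depends on your deployment, e.g. a ``docker-compose`` ``environment:`` entry or a systemd unit):

```shell
# Enable early gevent monkeypatching before the webserver starts
export _AIRFLOW_PATCH_GEVENT=1
# then start the webserver as usual, e.g.:
# airflow webserver
```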
Bug Fixes
^^^^^^^^^
- ``swagger-ui-dist`` via npm package (#28788)
- ``UIAlert`` ``should_show`` when ``AUTH_ROLE_PUBLIC`` set (#28781)
- ``external_task_ids`` of ``ExternalTaskSensor`` (#28692)
- ``DetachedInstanceError`` when finding zombies in DAG parsing process (#28198)
- divs to fix dagid copy nit on dag.html (#28643)
- ``setNote`` endpoints under ``TaskInstance`` in OpenAPI (#28566)
- ``CronTriggerTimetable`` (#28532)
- ``ensure_ascii=False`` in trigger dag run API (#28451)
- ``ti._try_number`` for deferred and up_for_reschedule tasks (#26993)
- ``callModal`` from dag.js (#28410)
- monkeypatching via environment variable (#28283)
- ``LazyXComAccess`` (#28191)
- ``@dag`` decorator are reported in dag file (#28153)
- ``dagbag_size`` metric decreases when files are deleted (#28135)
- ``airflow.api.auth.backend.session`` to backend sessions in compose (#28094)
- ``next_dagruns_to_examine``, add MySQL index hint (#27821)

Misc/Internal
^^^^^^^^^^^^^
- ``dnspython`` after ``eventlet`` got fixed (#29004)
- ``dnspython`` to < 2.3.0 until ``eventlet`` incompatibility is solved (#28962)
- SQLAlchemy to below 2.0 (#28725)
- ``json5`` from 1.0.1 to 1.0.2 in /airflow/www (#28715)
- Enums (#28627)
- ``Connection.get_extra`` type (#28594)
- ``conf.get*`` from the right source location (#28543)
- ``purge_inactive_dag_warnings`` (#28481)
- ``test_task_command`` to Pytest and unquarantine tests in it (#28247)
- 0.2.0 to 0.2.2 in /airflow/www (#28080)
- subgraph logic (#27987)
- ``map_index`` (#27904)
- ``LocalTaskJob`` (#27381)

Doc only changes
^^^^^^^^^^^^^^^^
Significant Changes
^^^^^^^^^^^^^^^^^^^
airflow dags test no longer performs a backfill job (#26400)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
In order to make ``airflow dags test`` more useful as a testing and debugging tool, we no
longer run a backfill job and instead run a "local task runner". Users can still backfill
their DAGs using the ``airflow dags backfill`` command.
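In practice the two commands now serve different purposes (the DAG id and dates below are placeholders):

```console
# Debug a single DAG run locally, without creating a backfill job:
airflow dags test my_dag 2022-11-01

# Backfilling a date range still uses the dedicated command:
airflow dags backfill my_dag --start-date 2022-11-01 --end-date 2022-11-07
```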
Airflow config section kubernetes renamed to kubernetes_executor (#26873)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
``KubernetesPodOperator`` no longer considers any core ``kubernetes`` config params, so this section now only applies to the Kubernetes executor. Renaming it reduces potential for confusion.
AirflowException is now thrown as soon as any dependent tasks of ExternalTaskSensor fails (#27190)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
``ExternalTaskSensor`` no longer hangs indefinitely when ``failed_states`` is set, an ``execution_date_fn`` is used, and some but not all of the dependent tasks fail.
Instead, an ``AirflowException`` is thrown as soon as any of the dependent tasks fail.
Any code handling this failure in addition to timeouts should move to catching the ``AirflowException`` base class and not only the ``AirflowSensorTimeout`` subclass.
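The difference can be sketched with stand-in exception classes mirroring the real hierarchy (``AirflowSensorTimeout`` subclasses ``AirflowException`` in ``airflow.exceptions``; the classes below are local stand-ins, not imports):

```python
class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException."""

class AirflowSensorTimeout(AirflowException):
    """Stand-in for airflow.exceptions.AirflowSensorTimeout."""

def handle(error: AirflowException) -> str:
    # Catch the more specific timeout first, then fall back to the
    # base class, which also covers the new fail-fast behaviour of
    # ExternalTaskSensor when a dependent task fails.
    try:
        raise error
    except AirflowSensorTimeout:
        return "timeout"
    except AirflowException:
        return "dependent task failed"

assert handle(AirflowSensorTimeout()) == "timeout"
assert handle(AirflowException()) == "dependent task failed"
```

Code that only caught ``AirflowSensorTimeout`` would let the new failure propagate unhandled.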
The Airflow config option ``scheduler.deactivate_stale_dags_interval`` has been renamed to ``scheduler.parsing_cleanup_interval`` (#27828).
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
The old option will continue to work but will issue deprecation warnings, and will be removed entirely in Airflow 3.
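A minimal migration in ``airflow.cfg`` (the value shown is the default; keep whatever interval you already use):

```ini
[scheduler]
# Old (deprecated, removed in Airflow 3):
# deactivate_stale_dags_interval = 60
# New name:
parsing_cleanup_interval = 60
```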
New Features
^^^^^^^^^^^^
- TaskRunner: notify of component start and finish (#27855)
- ``AirflowModelViews`` (Variables/Connection) (#24079, #27994, #27923)
- Is /not Null filter for value is None on webui (#26584)
- ``one_done`` trigger rule (#26146)

Improvements
^^^^^^^^^^^^
- ``TI.xcom_pull()`` with explicit ``task_ids`` and ``map_indexes`` (#27699)
- ``urlsplit`` (#27389)
- ``branch_task_ids`` into ``SkipMixin`` (#27434)
- ``.first()`` to ``.scalar()`` (#27323)
- howtos about sensors (#27333)
- ``extra__conn_type__`` prefix required for UI behaviors (#26995)
- crashloopbackoff when using ``hostname_callable`` (#24999)
- ``__future__.annotations`` automatically by isort (#26383)

Bug Fixes
^^^^^^^^^
- ``V1Pod`` in task callback (#27609)
- ``taskInstance`` errors and split into two tables (#26575)
- autoregistered DAGs if there are any import errors (#26398)
- ``from airflow import version`` lazy import (#26239)

Misc/Internal
^^^^^^^^^^^^^
- ``is_mapped`` attribute (#27881)
- ``airflow/callbacks/*`` ``airflow/cli/*`` (#27721)
- ``airflow/api_connexion/*`` directory (#27718)
- ``airflow/listener/*`` directory (#27731)
- ``airflow/lineage/*`` directory (#27732)
- ``airflow/api/*`` directory (#27716)
- ``minimatch`` from 3.0.4 to 3.0.8 in /airflow/www (#27688)
- 1.4.1 to 1.4.2 in /airflow/www (#27697)
- ``BaseTrigger.run`` (#27416)
- 2022.10.1 (#27383)
- ``memray`` files to gitignore / dockerignore (#27001)
- ``sphinx-autoapi`` (#26743)
- ``RTIF.delete_old_records()`` (#26667)
- ``pyupgrade`` edge cases (#26384)

Doc only changes
^^^^^^^^^^^^^^^^
- docker-compose.yaml (#26726)

.. spelling::

    nvd
    lineChart