
chore(deps): update dependency apache-airflow to v3 #13315


Open

renovate-bot wants to merge 1 commit into GoogleCloudPlatform:main from renovate-bot:renovate/apache-airflow-3.x

Conversation

renovate-bot (Contributor) commented Apr 22, 2025 (edited)

This PR contains the following updates:

| Package | Change |
| --- | --- |
| apache-airflow (changelog) | `>= 2.0.0, < 3.0.0` -> `>=3.0.1, <3.1.0` |
| apache-airflow (changelog) | `==2.9.2` -> `==3.0.1` |
| apache-airflow (changelog) | `==2.6.3` -> `==3.0.1` |
| apache-airflow (changelog) | `==1.10.15` -> `==3.0.1` |

Warning

Some dependencies could not be looked up. Check the Dependency Dashboard for more information.


Release Notes

apache/airflow (apache-airflow)

v3.0.1

Compare Source

Significant Changes
^^^^^^^^^^^^^^^^^^^

No significant changes.

Bug Fixes
"""""""""

  • Improves the handling of value masking when setting Airflow variables for enhanced security (#43123)
  • Make entire task box clickable to select the task (#49299)
  • Vertically align task log header components in full screen mode (#49569)
  • Remove `dag_code` records with no serialized dag (#49478)
  • Clear out the `dag_code` and `serialized_dag` tables on 3.0 upgrade (#49563)
  • Remove extra slash so that the runs tab is selected (#49600)
  • Null out the `scheduler_interval` field on downgrade (#49583)
  • Logout functionality should respect `base_url` in api server (#49545)
  • Fix bug with showing invalid credentials on Login UI (#49556)
  • Fix Dag Code text selection when dark mode is enabled (#49649)
  • Bugfix: `max_active_tis_per_dag` is not respected by dynamically mapped tasks (#49708)
  • Fix infinite redirect caused by mistakenly setting token cookie as secure (#49721)
  • Better handle safe url redirects in login form for `SimpleAuthManager` (#49697) (#49866)
  • API: Add missing `bundle_version` to DagRun response (#49726)
  • Display bundle version in Dag details tab (#49787)
  • Fix gcp remote log module import in airflow local settings (#49788)
  • Bugfix: Grid view stops loading when there is a pending task to be expanded (#49772)
  • Treat single `task_ids` in `xcom_pull` the same as multiple when provided as part of a list (#49692)
  • UI: Auto refresh Home page stats (#49830)
  • UI: Error alert overflows out of the alert box (#49880)
  • Show backfill banner after creating a new backfill (#49666)
  • Mark `DAGModel` stale and associate bundle on import errors to aid migration from 2.10.5 (#49769)
  • Improve detection and handling of timed out DAG processor processes (#49868)
  • Fix editing port for connections (#50002)
  • Improve & Fix grid endpoint response time (#49969)
  • Update time duration format (#49914)
  • Fix Dashboard overflow and handle no status tasks (#49964)
  • Fix timezone setting for logical date input on Trigger Run form (#49662)
  • Help `pip` with avoiding resolution too deep issues in Python 3.12 (#49853)
  • Bugfix: backfill dry run does not use same timezone as create backfill (#49911)
  • Fix Edit Connection when connection is imported (#49989)
  • Bugfix: Filtering items from a mapped task is broken (#50011)
  • Fix Dashboard for queued DagRuns (#49961)
  • Fix backwards-compat import path for `BashSensor` (#49935)
  • Apply task group sorting based on webserver config in grid structure response (#49418)
  • Render custom `map_index_template` on task completion (#49809)
  • Fix `ContinuousTimetable` false triggering when last run ends in future (#45175)
  • Make Trigger Dag form warning more obvious (#49981)
  • Restore task hover and selection indicators in the Grid view (#50050)
  • Fix datetime validation for backfills (#50116)
  • Fix duration charts (#50094)
  • Fix DAG node selections (#50095)
  • UI: Fix date range field alignment (#50086)
  • Add auto-refresh for `Stats` (#50088)
  • UI: Fixes validation error and adds error indicator for Params form (#50127)
  • fix: wrap overflowing texts of asset events (#50173)
  • Add audit log extra to table and improve UX (#50100)
  • Handle map indexes for MappedTaskGroup (#49996)
  • Do not use introspection in migration to fix offline SQL generation (#49873)
  • Fix operator extra links for mapped tasks (#50238)
  • Fix backfill form (#50249) (#50243)
  • UI: Fix operator overflow in graph (#50252)
  • UI: Pass `mapIndex` to clear the relevant task instances. (#50256)
  • Fix markdown rendering on dag docs (#50142)

Miscellaneous
"""""""""""""

  • Add `STRAIGHT_JOIN` prefix for MySQL query optimization in `get_sorted_triggers` (#46303)
  • Ensure `sqlalchemy[asyncio]` extra is in core deps (#49452)
  • Remove unused constant `HANDLER_SUPPORTS_TRIGGERER` (#49370)
  • Remove sort indicators on XCom table to avoid confusion (#49547)
  • Remove `gitpython` as a core dependency (#49537)
  • Bump `@babel/runtime` from 7.26.0 to 7.27.0 (#49479)
  • Add backwards compatibility shim for `get_current_context` (#49630)
  • AIP-38: enhance layout for `RunBackfillForm` (#49609)
  • AIP-38: merge Backfill and Trigger Dag Run (#49490)
  • Add count to Stats Cards in Dashboard (#49519)
  • Add auto-refresh to health section for live updates. (#49645)
  • Tweak Execution API OpenAPI spec to improve code Generation (#49700)
  • Stricter validation for `backfill_id` (#49691) (#49716)
  • Add `SimpleAllAdminMiddleware` to allow api usage without auth header in request (#49599)
  • Bump `react-router` and `react-router-dom` from 7.4.0 to 7.5.2 (#49742)
  • Remove reference to `root_dag_id` in dagbag and restore logic (#49668)
  • Fix a few SqlAlchemy deprecation warnings (#49477)
  • Make default execution server URL be relative to API Base URL (#49747) (#49782)
  • Common `airflow.cfg` files across all containers in default `docker-compose.yaml` (#49681)
  • Add redirects for old operators location to standard provider (#49776)
  • Bump packaging from 24.2 to 25.0 in `/airflow-core` (#49512)
  • Move some non-core dependencies to the `apache-airflow` meta package (#49846)
  • Add more lower-bind limits to address resolution too deep (#49860)
  • UI: Add counts to pool bar (#49894)
  • Add type hints for `@task.kubernetes_cmd` (#46913)
  • Bump `vite` from 5.4.17 to 5.4.19 for Airflow UI (#49162) (#50074)
  • Add `map_index` filter option to `GetTICount` and `GetTaskStates` (#49818)
  • Add `stats` ui endpoint (#49985)
  • Add link to tag to filter dags associated with the tag (#49680)
  • Add keyboard shortcut for full screen and wrap in logs. (#50008)
  • Update graph node styling to decrease border width on tasks in UI (#50047) (#50073)
  • Allow non-string valid JSON values in Variable import. (#49844)
  • Bump min versions of crucial providers (#50076)
  • Add `state` attribute to `RuntimeTaskInstance` for easier `ti.state` access in Task Context (#50031)
  • Move SQS message queue to Amazon provider (#50057)
  • Execution API: Improve task instance logging with structlog context (#50120)
  • Add `dag_run_conf` to `RunBackfillForm` (#49763)
  • Refactor Dashboard to enhance layout (#50026)
  • Add the download button on the assets page (#50045)
  • Add `dateInterval` validation and error handling (#50072)
  • Add `Task Instances [{map_index}]` tab to mapped task details (#50085)
  • Add focus view on grid and graph on second click (#50125)
  • Add formatted extra to asset events (#50124)
  • Move webserver expose config to api section (#50209)

Doc Only Changes
""""""""""""""""

v3.0.0

Compare Source

We are proud to announce the General Availability of Apache Airflow 3.0 — the most significant release in the project's
history. This version introduces a service-oriented architecture, a stable DAG authoring interface, expanded support for
event-driven and ML workflows, and a fully modernized UI built on React. Airflow 3.0 reflects years of community
investment and lays the foundation for the next era of scalable, modular orchestration.

Highlights
^^^^^^^^^^

  • Service-Oriented Architecture: A new Task Execution API and `airflow api-server` enable task execution in remote environments with improved isolation and flexibility (AIP-72).

  • Edge Executor: A new executor that supports distributed, event-driven, and edge-compute workflows (AIP-69), now generally available.

  • Stable Authoring Interface: DAG authors should now use the new `airflow.sdk` namespace to import core DAG constructs like `@dag`, `@task`, and `DAG`.

  • Scheduler-Managed Backfills: Backfills are now scheduled and tracked like regular DAG runs, with native UI and API support (AIP-78).

  • DAG Versioning: Airflow now tracks structural changes to DAGs over time, enabling inspection of historical DAG definitions via the UI and API (AIP-66).

  • Asset-Based Scheduling: The dataset model has been renamed and redesigned as assets, with a new `@asset` decorator and cleaner event-driven DAG definition (AIP-74, AIP-75).

  • Support for ML and AI Workflows: DAGs can now run with `logical_date=None`, enabling use cases such as model inference, hyperparameter tuning, and non-interval workflows (AIP-83).

  • Removal of Legacy Features: SLAs, SubDAGs, DAG and Xcom pickling, and several internal context variables have been removed. Use the upgrade tools to detect deprecated usage.

  • Split CLI and API Changes: The CLI has been split into `airflow` and `airflowctl` (AIP-81), and the REST API now defaults to `logical_date=None` when triggering a new DAG run.

  • Modern React UI: A complete UI overhaul built on React and FastAPI includes version-aware views, backfill management, and improved DAG and task introspection (AIP-38, AIP-84).

  • Migration Tooling: Use `ruff` and `airflow config update` to validate DAGs and configurations. Upgrade requires Airflow 2.7 or later and Python 3.9–3.12.

Significant Changes
^^^^^^^^^^^^^^^^^^^

Airflow 3.0 introduces the most significant set of changes since the 2.0 release, including architectural shifts, new
execution models, and improvements to DAG authoring and scheduling.

Task Execution API & Task SDK (AIP-72)
""""""""""""""""""""""""""""""""""""""

Airflow now supports a service-oriented architecture, enabling tasks to be executed remotely via a new Task Execution
API. This API decouples task execution from the scheduler and introduces a stable contract for running tasks outside of
Airflow's traditional runtime environment.

To support this, Airflow introduces the Task SDK — a lightweight runtime environment for running Airflow tasks in
external systems such as containers, edge environments, or other runtimes. This lays the groundwork for
language-agnostic task execution and brings improved isolation, portability, and extensibility to Airflow-based
workflows.

Airflow 3.0 also introduces a new `airflow.sdk` namespace that exposes the core authoring interfaces for defining DAGs
and tasks. DAG authors should now import objects like `DAG`, `@dag`, and `@task` from `airflow.sdk` rather than
internal modules. This new namespace provides a stable, forward-compatible interface for DAG authoring across future
versions of Airflow.

Edge Executor (AIP-69)
""""""""""""""""""""""

Airflow 3.0 introduces the Edge Executor as a generally available feature, enabling execution of tasks in
distributed or remote compute environments. Designed for event-driven and edge-compute use cases, the Edge Executor
integrates with the Task Execution API to support task orchestration beyond the traditional Airflow runtime. This
advancement facilitates hybrid and cross-environment orchestration patterns, allowing task workers to operate closer to
data or application layers.

Scheduler-Managed Backfills (AIP-78)
""""""""""""""""""""""""""""""""""""

Backfills are now fully managed by the scheduler, rather than being launched as separate command-line jobs. This change
unifies backfill logic with regular DAG execution and ensures that backfill runs follow the same scheduling, versioning,
and observability models as other DAG runs.

Airflow 3.0 also introduces native UI and REST API support for initiating and monitoring backfills, making them more
accessible and easier to integrate into automated workflows. These improvements lay the foundation for smarter, safer
historical reprocessing — now available directly through the Airflow UI and API.

DAG Versioning (AIP-66)
"""""""""""""""""""""""

Airflow 3.0 introduces native DAG versioning. DAG structure changes (e.g., renamed tasks, dependency shifts) are now
tracked directly in the metadata database. This allows users to inspect historical DAG structures through the UI and API,
and lays the foundation for safer backfills, improved observability, and runtime-determined DAG logic.

React UI Rewrite (AIP-38, AIP-84)
"""""""""""""""""""""""""""""""""

Airflow 3.0 ships with a completely redesigned user interface built on React and FastAPI. This modern architecture
improves responsiveness, enables more consistent navigation across views, and unlocks new UI capabilities — including
support for DAG versioning, asset-centric DAG definitions, and more intuitive filtering and search.

The new UI replaces the legacy Flask-based frontend and introduces a foundation for future extensibility and community
contributions.

Asset-Based Scheduling & Terminology Alignment (AIP-74, AIP-75)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

The concept of `Datasets` has been renamed to `Assets`, unifying terminology with common practices in the modern
data ecosystem. The internal model has also been reworked to better support future features like asset partitions and
validations.

The `@asset` decorator and related changes to the DAG parser enable clearer, asset-centric DAG definitions, allowing
Airflow to more naturally support event-driven and data-aware scheduling patterns.

This renaming impacts modules, classes, functions, configuration keys, and internal models. Key changes include:

  • `Dataset` → `Asset`
  • `DatasetEvent` → `AssetEvent`
  • `DatasetAlias` → `AssetAlias`
  • `airflow.datasets.*` → `airflow.sdk.*`
  • `airflow.timetables.simple.DatasetTriggeredTimetable` → `airflow.timetables.simple.AssetTriggeredTimetable`
  • `airflow.timetables.datasets.DatasetOrTimeSchedule` → `airflow.timetables.assets.AssetOrTimeSchedule`
  • `airflow.listeners.spec.dataset.on_dataset_created` → `airflow.listeners.spec.asset.on_asset_created`
  • `airflow.listeners.spec.dataset.on_dataset_changed` → `airflow.listeners.spec.asset.on_asset_changed`
  • `core.dataset_manager_class` → `core.asset_manager_class`
  • `core.dataset_manager_kwargs` → `core.asset_manager_kwargs`
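To make the renames concrete, the mapping above can be sketched as a small lookup table in plain Python. This is only an illustration of the migration, not the official tooling; `ruff check --select AIR30` performs these rewrites properly.

```python
# Illustrative sketch of the Dataset-era -> Asset-era renames listed above.
# Not the official migration tool; use `ruff check --select AIR30` for real code.
DATASET_TO_ASSET_RENAMES = {
    "airflow.datasets.Dataset": "airflow.sdk.Asset",
    "airflow.datasets.DatasetAlias": "airflow.sdk.AssetAlias",
    "airflow.timetables.datasets.DatasetOrTimeSchedule": "airflow.timetables.assets.AssetOrTimeSchedule",
    "airflow.listeners.spec.dataset.on_dataset_created": "airflow.listeners.spec.asset.on_asset_created",
}


def rename_import(qualified_name: str) -> str:
    """Return the Airflow 3 name for a known Airflow 2 dataset-era name."""
    return DATASET_TO_ASSET_RENAMES.get(qualified_name, qualified_name)


print(rename_import("airflow.datasets.Dataset"))  # airflow.sdk.Asset
```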

Unified Scheduling Field
""""""""""""""""""""""""

Airflow 3.0 removes the legacy `schedule_interval` and `timetable` parameters. DAGs must now use the unified
`schedule` field for all time- and event-based scheduling logic. This simplifies DAG definition and improves
consistency across scheduling paradigms.
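As a minimal sketch of what the unified field accepts, the values that previously went to either `schedule_interval` or `timetable` are now all passed as `schedule` (the DAG names below are hypothetical; asset-based scheduling would pass an `Asset` object instead):

```python
from datetime import timedelta

# Common kinds of value the unified `schedule` parameter accepts in Airflow 3.0
# (hypothetical DAG ids; an Asset or a Timetable instance are also valid values):
schedule_examples = {
    "nightly_report": "0 6 * * *",      # cron expression
    "hourly_sync": timedelta(hours=1),  # fixed time delta
    "manual_only": None,                # no schedule; UI-, API-, or asset-triggered
}

for dag_id, schedule in schedule_examples.items():
    print(f"DAG(dag_id={dag_id!r}, schedule={schedule!r})")
```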

Updated Scheduling Defaults
"""""""""""""""""""""""""""

Airflow 3.0 changes the default behavior for new DAGs by setting `catchup_by_default = False` in the configuration
file. This means DAGs that do not explicitly set `catchup=...` will no longer backfill missed intervals by default.
This change reduces confusion for new users and better reflects the growing use of on-demand and event-driven workflows.

The default DAG schedule has been changed to `None` from `@once`.
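To make the behavioral difference concrete, here is a plain-Python sketch (not Airflow code) of which runs a daily schedule would produce when a DAG is unpaused three days after its start date, under each `catchup` setting:

```python
from datetime import date, timedelta


def runs_to_create(start: date, today: date, catchup: bool) -> list[date]:
    """Sketch of scheduler behavior for a daily schedule: with catchup=True,
    every missed day is backfilled; with catchup=False (the new default),
    only the most recent completed interval runs."""
    missed = []
    day = start
    while day < today:
        missed.append(day)
        day += timedelta(days=1)
    return missed if catchup else missed[-1:]


start, today = date(2025, 4, 1), date(2025, 4, 4)
print(runs_to_create(start, today, catchup=True))   # three backfilled runs
print(runs_to_create(start, today, catchup=False))  # only the latest interval
```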

Restricted Metadata Database Access
"""""""""""""""""""""""""""""""""""

Task code can no longer directly access the metadata database. Interactions with DAG state, task history, or DAG runs
must be performed via the Airflow REST API or exposed context. This change improves architectural separation and enables
remote execution.

Future Logical Dates No Longer Supported
"""""""""""""""""""""""""""""""""""""""""

Airflow no longer supports triggering DAG runs with a logical date in the future. This change aligns with the logical
execution model and removes ambiguity in backfills and event-driven DAGs. Use `logical_date=None` to trigger runs with
the current timestamp.

Context Behavior for Asset and Manually Triggered DAGs
""""""""""""""""""""""""""""""""""""""""""""""""""""""

For DAG runs triggered by an Asset event or through the REST API without specifying a `logical_date`, Airflow now sets
`logical_date=None` by default. These DAG runs do not have a data interval, and attempting to access
`data_interval_start`, `data_interval_end`, or `logical_date` from the task context will raise a `KeyError`.

DAG authors should use `dag_run.logical_date` and perform appropriate checks or fallbacks if supporting multiple
trigger types. This change improves consistency with event-driven semantics but may require updates to existing DAGs
that assume these values are always present.
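A defensive pattern for tasks that must support both scheduled and manually triggered runs might look like the following sketch, where a plain dict and `SimpleNamespace` stand in for the Airflow task context and DagRun:

```python
from datetime import datetime, timezone
from types import SimpleNamespace


def effective_date(context: dict) -> datetime:
    """Return the run's logical date, falling back to the current time for
    runs triggered with logical_date=None (manual or asset-triggered runs).
    `context` stands in for the Airflow task context mapping."""
    dag_run = context["dag_run"]
    if dag_run.logical_date is not None:
        return dag_run.logical_date
    # No logical date / data interval on this run: fall back to "now".
    return datetime.now(timezone.utc)


# Stubs standing in for a scheduled run and a manual run:
scheduled = {"dag_run": SimpleNamespace(logical_date=datetime(2025, 5, 1, tzinfo=timezone.utc))}
manual = {"dag_run": SimpleNamespace(logical_date=None)}
print(effective_date(scheduled))  # 2025-05-01 00:00:00+00:00
```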

Improved Callback Behavior
""""""""""""""""""""""""""

Airflow 3.0 refines task callback behavior to improve clarity and consistency. In particular, `on_success_callback` is
no longer executed when a task is marked as `SKIPPED`, aligning it more closely with expected semantics.

Updated Default Configuration
"""""""""""""""""""""""""""""

Several default configuration values have been updated in Airflow 3.0 to better reflect modern usage patterns and
simplify onboarding:

  • `catchup_by_default` is now set to `False` by default. DAGs will not automatically backfill unless explicitly configured to do so.
  • `create_cron_data_intervals` is now set to `False` by default. As a result, cron expressions will be interpreted using the `CronTriggerTimetable` instead of the legacy `CronDataIntervalTimetable`.
  • `SimpleAuthManager` is now the default `auth_manager`. To continue using Flask AppBuilder-based authentication, install the `apache-airflow-providers-flask-appbuilder` provider and explicitly set `auth_manager = airflow.providers.fab.auth_manager.FabAuthManager`.

These changes represent the most significant evolution of the Airflow platform since the release of 2.0 — setting the
stage for more scalable, event-driven, and language-agnostic orchestration in the years ahead.

Executor & Scheduler Updates
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Airflow 3.0 introduces several important improvements and behavior changes in how DAGs and tasks are scheduled,
prioritized, and executed.

Standalone DAG Processor Required
"""""""""""""""""""""""""""""""""

Airflow 3.0 now requires the standalone DAG processor to parse DAGs. This dedicated process improves scheduler
performance, isolation, and observability. It also simplifies architecture by clearly separating DAG parsing from
scheduling logic. This change may affect custom deployments that previously used embedded DAG parsing.

Priority Weight Capped by Pool Slots
"""""""""""""""""""""""""""""""""""""

The `priority_weight` value on a task is now capped by the number of available pool slots. This ensures that resource
availability remains the primary constraint in task execution order, preventing high-priority tasks from starving others
when resource contention exists.

Teardown Task Handling During DAG Termination
"""""""""""""""""""""""""""""""""""""""""""""

Teardown tasks will now be executed even when a DAG run is terminated early. This ensures that cleanup logic is
respected, improving reliability for workflows that use teardown tasks to manage ephemeral infrastructure, temporary
files, or downstream notifications.

Improved Scheduler Fault Tolerance
""""""""""""""""""""""""""""""""""

Scheduler components now use `run_with_db_retries` to handle transient database issues more gracefully. This enhances
Airflow's fault tolerance in high-volume environments and reduces the likelihood of scheduler restarts due to temporary
database connection problems.

Mapped Task Stats Accuracy
"""""""""""""""""""""""""""

Airflow 3.0 fixes a bug that caused incorrect task statistics to be reported for dynamic task mapping. Stats now
accurately reflect the number of mapped task instances and their statuses, improving observability and debugging for
dynamic workflows.

SequentialExecutor has been removed
"""""""""""""""""""""""""""""""""""""""

`SequentialExecutor` was primarily used for local testing but is now redundant, as `LocalExecutor`
supports SQLite with WAL mode and provides better performance with parallel execution.
Users should switch to `LocalExecutor` or `CeleryExecutor` as alternatives.

DAG Authoring Enhancements
^^^^^^^^^^^^^^^^^^^^^^^^^^

Airflow 3.0 includes several changes that improve consistency, clarity, and long-term stability for DAG authors.

New Stable DAG Authoring Interface: `airflow.sdk`
"""""""""""""""""""""""""""""""""""""""""""""""""""

Airflow 3.0 introduces a new, stable public API for DAG authoring under the `airflow.sdk` namespace,
available via the `apache-airflow-task-sdk` package.

The goal of this change is to decouple DAG authoring from Airflow internals (Scheduler, API Server, etc.),
providing a forward-compatible, stable interface for writing and maintaining DAGs across Airflow versions.

DAG authors should now import core constructs from `airflow.sdk` rather than internal modules.

Key imports from `airflow.sdk`:

  • Classes:

    • Asset
    • BaseNotifier
    • BaseOperator
    • BaseOperatorLink
    • BaseSensorOperator
    • Connection
    • Context
    • DAG
    • EdgeModifier
    • Label
    • ObjectStoragePath
    • Param
    • TaskGroup
    • Variable
  • Decorators and Functions:

    • @asset
    • @dag
    • @setup
    • @task
    • @task_group
    • @teardown
    • chain
    • chain_linear
    • cross_downstream
    • get_current_context
    • get_parsing_context

For an exhaustive list of available classes, decorators, and functions, check `airflow.sdk.__all__`.

All DAGs should update imports to use `airflow.sdk` instead of referencing internal Airflow modules directly.
Legacy import paths (e.g., `airflow.models.dag.DAG`, `airflow.decorators.task`) are deprecated and
will be removed in a future Airflow version. Some additional utilities and helper functions
that DAGs sometimes use from `airflow.utils.*` and others will be progressively migrated to the Task SDK in future
minor releases.

These future changes aim to complete the decoupling of DAG authoring constructs
from internal Airflow services. DAG authors should expect continued improvements
to `airflow.sdk` with no backwards-incompatible changes to existing constructs.

For example, update:

.. code-block:: python

    # Old (Airflow 2.x)
    from airflow.models import DAG
    from airflow.decorators import task

    # New (Airflow 3.x)
    from airflow.sdk import DAG, task

Renamed Parameter: `fail_stop` → `fail_fast`
"""""""""""""""""""""""""""""""""""""""""""""""""

The DAG argument `fail_stop` has been renamed to `fail_fast` for improved clarity. This parameter controls whether a
DAG run should immediately stop execution when a task fails. DAG authors should update any code referencing
`fail_stop` to use the new name.

Context Cleanup and Parameter Removal
"""""""""""""""""""""""""""""""""""""

Several legacy context variables have been removed or may no longer be available in certain types of DAG runs,
including:

  • conf
  • execution_date
  • dag_run.external_trigger

In asset-triggered and manually triggered DAG runs with `logical_date=None`, data interval fields such as
`data_interval_start` and `data_interval_end` may not be present in the task context. DAG authors should use
explicit references such as `dag_run.logical_date` and conditionally check for the presence of interval-related fields
where applicable.

Task Context Utilities Moved
""""""""""""""""""""""""""""

Internal task context functions such as `get_parsing_context` have been moved to a more appropriate location (e.g.,
`airflow.models.taskcontext`). DAG authors using these utilities directly should update import paths accordingly.

Trigger Rule Restrictions
"""""""""""""""""""""""""

The `TriggerRule.ALWAYS` rule can no longer be used with teardown tasks or tasks that are expected to honor upstream
dependency semantics. DAG authors should ensure that teardown logic is defined with the appropriate trigger rules for
consistent task resolution behavior.

Asset Aliases for Reusability
"""""""""""""""""""""""""""""

A new utility function, `create_asset_aliases()`, allows DAG authors to define reusable aliases for frequently
referenced Assets. This improves modularity and reuse across DAG files and is particularly helpful for teams adopting
asset-centric DAGs.

Operator Links interface changed
""""""""""""""""""""""""""""""""

Operator extra links, whether defined via plugins or on custom operators, no longer execute
any user code in the Airflow UI. Instead, the "full" link is pushed to the XCom backend,
and it is fetched from the XCom backend when viewing task details, for example from the
grid view.

Example for users with custom links class:

.. code-block:: python

    @attr.s(auto_attribs=True)
    class CustomBaseIndexOpLink(BaseOperatorLink):
        """Custom Operator Link for Google BigQuery Console."""

        index: int = attr.ib()

        @property
        def name(self) -> str:
            return f"BigQuery Console #{self.index + 1}"

        @property
        def xcom_key(self) -> str:
            return f"bigquery_{self.index + 1}"

        def get_link(self, operator, *, ti_key):
            search_query = XCom.get_one(
                task_id=ti_key.task_id, dag_id=ti_key.dag_id, run_id=ti_key.run_id, key="search_query"
            )
            return f"https://console.cloud.google.com/bigquery?j={search_query}"

The link has an `xcom_key` defined, which is how it will be stored in the XCom backend: the key is
`xcom_key` and the value is the entire link, in this case: https://console.cloud.google.com/bigquery?j=search

Plugins no longer support adding executors, operators & hooks
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Operators (including Sensors), Executors & Hooks can no longer be registered or imported via Airflow's plugin
mechanism. These types of classes are treated as plain Python classes by Airflow, so there is no need to register
them with Airflow. They can be imported directly from their respective provider packages.

Before:

.. code-block:: python

from airflow.hooks.my_plugin import MyHook

You should instead import it as:

.. code-block:: python

from my_plugin import MyHook

Support for ML & AI Use Cases (AIP-83)
"""""""""""""""""""""""""""""""""""""""

Airflow 3.0 expands the types of DAGs that can be expressed by removing the constraint that each DAG run must correspond
to a unique data interval. This change, introduced in AIP-83, enables support for workflows that don't operate on a
fixed schedule — such as model training, hyperparameter tuning, and inference tasks.

These ML- and AI-oriented DAGs often run ad hoc, are triggered by external systems, or need to execute multiple times
with different parameters over the same dataset. By allowing multiple DAG runs withlogical_date=None, Airflow now
supports these scenarios natively without requiring workarounds.

Config & Interface Changes
^^^^^^^^^^^^^^^^^^^^^^^^^^

Airflow 3.0 introduces several configuration and interface updates that improve consistency, clarify ownership of core
utilities, and remove legacy behaviors that were no longer aligned with modern usage patterns.

Default Value Handling
""""""""""""""""""""""

Airflow no longer silently updates configuration options that retain deprecated default values. Users are now required
to explicitly set any config values that differ from the current defaults. This change improves transparency and
prevents unintentional behavior changes during upgrades.

Refactored Config Defaults
"""""""""""""""""""""""""""

Several configuration defaults have changed in Airflow 3.0 to better reflect modern usage patterns:

  • The default value ofcatchup_by_default is nowFalse. DAGs will not backfill missed intervals unless explicitly configured to do so.
  • The default value ofcreate_cron_data_intervals is nowFalse. Cron expressions are now interpreted using theCronTriggerTimetable instead of the legacyCronDataIntervalTimetable. This change simplifies interval logic and aligns with the future direction of Airflow's scheduling system.

Refactored Internal Utilities
"""""""""""""""""""""""""""""

Several core components have been moved to more intuitive or stable locations:

  • The `SecretsMasker` class has been relocated to `airflow.sdk.execution_time.secrets_masker`.
  • The `ObjectStoragePath` utility previously located under `airflow.io` is now available via `airflow.sdk`.

These changes simplify imports and reflect broader efforts to stabilize utility interfaces across the Airflow codebase.

Improved `inlet_events`, `outlet_events`, and `triggering_asset_events`
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""

Asset event mappings in the task context are improved to better support asset use cases, including new features introduced in AIP-74.

Events of an asset or asset alias are now accessed directly by a concrete object to avoid ambiguity. Using a `str` to access events is
no longer supported. Use an `Asset` or `AssetAlias` object, or `Asset.ref` to refer to an entity explicitly instead, such as::

    outlet_events[Asset.ref(name="myasset")]  # Get events for asset named "myasset".
    outlet_events[AssetAlias(name="myalias")]  # Get events for asset alias named "myalias".

Alternatively, two helpers `for_asset` and `for_asset_alias` are added as shortcuts::

    outlet_events.for_asset(name="myasset")  # Get events for asset named "myasset".
    outlet_events.for_asset_alias(name="myalias")  # Get events for asset alias named "myalias".

The internal representation of asset event triggers now also includes an explicit `uri` field, simplifying traceability and
aligning with the broader asset-aware execution model introduced in Airflow 3.0. DAG authors interacting directly with
inlet_events may need to update logic that assumes the previous structure.

Behaviour change in `xcom_pull`
"""""""""""""""""""""""""""""""""

In Airflow 2, the `xcom_pull()` method allowed pulling XComs by key without specifying `task_ids`, despite the fact that the underlying
DB model defines `task_id` as part of the XCom primary key. This created ambiguity: if two tasks pushed XComs with the same key,
`xcom_pull()` would pull whichever one happened to be first, leading to unpredictable behavior.

Airflow 3 resolves this inconsistency by requiring `task_ids` when pulling by key. This change aligns with the task-scoped nature of
XComs as defined by the schema, ensuring predictable and consistent behavior.

DAG authors should update their DAGs to use `task_ids` if they used `xcom_pull` without `task_ids`, such as::

    kwargs["ti"].xcom_pull(key="key")

Should be updated to::

    kwargs["ti"].xcom_pull(task_ids="task1", key="key")

Removed Configuration Keys
"""""""""""""""""""""""""""

As part of the deprecation cleanup, several legacy configuration options have been removed. These include:

  • [scheduler] allow_trigger_in_future
  • [scheduler] use_job_schedule
  • [scheduler] use_local_tz
  • [scheduler] processor_poll_interval
  • [logging] dag_processor_manager_log_location
  • [logging] dag_processor_manager_log_stdout
  • [logging] log_processor_filename_template

All `[webserver]` configurations have also been removed, since the API server now replaces the webserver;
options like the following have no effect:

  • [webserver] allow_raw_html_descriptions
  • [webserver] cookie_samesite
  • [webserver] error_logfile
  • [webserver] access_logformat
  • [webserver] web_server_master_timeout
  • etc

Several configuration options previously located under the `[webserver]` section have
been moved to the new `[api]` section. The following configuration keys have been moved:

  • [webserver] web_server_host[api] host
  • [webserver] web_server_port[api] port
  • [webserver] workers[api] workers
  • [webserver] web_server_worker_timeout[api] worker_timeout
  • [webserver] web_server_ssl_cert[api] ssl_cert
  • [webserver] web_server_ssl_key[api] ssl_key
  • [webserver] access_logfile[api] access_logfile

Users should review their ``airflow.cfg`` files or use the ``airflow config lint`` command to identify outdated or
removed options.

Upgrade Tooling
""""""""""""""""

Airflow 3.0 includes improved support for upgrade validation. Use the following tools to proactively catch incompatible
configs or deprecated usage patterns:

  • ``airflow config lint``: Identifies removed or invalid config keys
  • ``ruff check --select AIR30 --preview``: Flags removed interfaces and common migration issues

CLI & API Changes
^^^^^^^^^^^^^^^^^

Airflow 3.0 introduces changes to both the CLI and REST API interfaces to better align with service-oriented deployments
and event-driven workflows.

Split CLI Architecture (AIP-81)
"""""""""""""""""""""""""""""""

The Airflow CLI has been split into two distinct interfaces:

  • The core ``airflow`` CLI now handles only local functionality (e.g., ``airflow tasks test``, ``airflow dags list``).
  • Remote functionality, including triggering DAGs or managing connections in service-mode environments, is now handled by a separate CLI called ``airflowctl``, distributed via the ``apache-airflow-client`` package.

This change improves security and modularity for deployments that use Airflow in a distributed or API-first context.

REST API v2 replaces v1
"""""""""""""""""""""""

The legacy REST API v1, previously built with Connexion and Marshmallow, has been replaced by a modern FastAPI-based REST API v2.

This new implementation improves performance, aligns more closely with web standards, and provides a consistent developer experience across the API and UI.

Key changes include stricter validation (422 errors instead of 400), the removal of the ``execution_date`` parameter in favor of ``logical_date``, and more consistent query parameter handling.

The v2 API is now the stable, fully supported interface for programmatic access to Airflow, and also powers the new UI - achieving full feature parity between the UI and API.

For details, see the :doc:`Airflow REST API v2 </stable-rest-api-ref>` documentation.

REST API: DAG Trigger Behavior Updated
""""""""""""""""""""""""""""""""""""""

The behavior of the ``POST /dags/{dag_id}/dagRuns`` endpoint has changed. If a ``logical_date`` is not explicitly
provided when triggering a DAG via the REST API, it now defaults to ``None``.

This aligns with event-driven DAGs and manual runs in Airflow 3.0, but may break backward compatibility with scripts or
tools that previously relied on Airflow auto-generating a timestamped ``logical_date``.
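Scripts that relied on the auto-generated timestamp now need to pass a date explicitly, or accept ``None``. The following is a minimal sketch of building the request body, assuming the v2 ``dagRuns`` schema accepts a nullable ``logical_date`` field; endpoint URL and authentication are deliberately omitted:

```python
import json
from datetime import datetime, timezone


def build_dag_run_body(logical_date=None, conf=None):
    """Build a JSON body for POST /dags/{dag_id}/dagRuns.

    In Airflow 3, an omitted logical_date defaults to None instead of
    an auto-generated timestamp, so callers that depended on the old
    behaviour must supply one themselves.
    """
    body = {
        "logical_date": logical_date.isoformat() if logical_date else None,
    }
    if conf is not None:
        body["conf"] = conf
    return json.dumps(body)
```

Callers that truly need a timestamped run can pass ``datetime.now(timezone.utc)`` explicitly; leaving it out no longer does that for them.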

Removed CLI Flags and Commands
""""""""""""""""""""""""""""""

Several deprecated CLI arguments and commands that were marked for removal in earlier versions have now been cleaned up
in Airflow 3.0. Runairflow --help to review the current set of available commands and arguments.

  • The deprecated ``--ignore-depends-on-past`` CLI option is replaced by ``--depends-on-past ignore``.

  • The ``--tree`` flag for the ``airflow tasks list`` command is removed. The output produced with that flag could be
    expensive to generate and extremely large, depending on the DAG. ``airflow dags show`` is a better way to
    visualize the relationships between tasks in a DAG.

  • ``dag_id`` changed from a flag (``-d``, ``--dag-id``) to a positional argument in the ``dags list-runs`` CLI command.

  • The ``airflow db init`` and ``airflow db upgrade`` commands have been removed. Use ``airflow db migrate`` instead
    to initialize or migrate the metadata database. If you would like to create default connections, use
    ``airflow connections create-default-connections``.

  • The ``airflow api-server`` command has replaced the ``airflow webserver`` CLI command.

Provider Refactor & Standardization
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Airflow 3.0 completes the migration of several core operators, sensors, and hooks into the new
``apache-airflow-providers-standard`` package. This package now includes commonly used components such as:

  • PythonOperator
  • BashOperator
  • EmailOperator
  • SimpleHttpOperator
  • ShortCircuitOperator

These operators were previously bundled inside ``airflow-core`` but are now treated as provider-managed components to
improve modularity, testability, and lifecycle independence.

This change enables more consistent versioning across providers and prepares Airflow for a future where all integrations
— including "standard" ones — follow the same interface model.

To maintain compatibility with existing DAGs, the ``apache-airflow-providers-standard`` package is installable on both
Airflow 2.x and 3.x. Users upgrading from Airflow 2.x are encouraged to begin updating import paths and testing provider
installation in advance of the upgrade.

Legacy imports such as ``airflow.operators.python.PythonOperator`` are deprecated and will be removed soon. They should be
replaced with:

.. code-block:: python

    from airflow.providers.standard.operators.python import PythonOperator

UI & Usability Improvements
^^^^^^^^^^^^^^^^^^^^^^^^^^^

Airflow 3.0 introduces a modernized user experience that complements the new React-based UI architecture (see
Significant Changes). Several areas of the interface have been enhanced to improve visibility, consistency, and
navigability.

New Home Page
"""""""""""""

The Airflow Home page now provides a high-level operational overview of your environment. It includes health checks for
core components (Scheduler, Triggerer, DAG Processor), summary stats for DAG and task instance states, and a real-time
feed of asset-triggered events. This view helps users quickly identify pipeline health, recent activity, and potential
failures.

Unified DAG List View
""""""""""""""""""""""

The DAG List page has been refreshed with a cleaner layout and improved responsiveness. Users can browse DAGs by name,
tags, or owners. While full-text search has not yet been integrated, filters and navigation have been refined for
clarity in large deployments.

Version-Aware Graph and Grid Views
"""""""""""""""""""""""""""""""""""

The Graph and Grid views now display task information in the context of the DAG version that was used at runtime. This
improves traceability for DAGs that evolve over time and provides more accurate debugging of historical runs.

Expanded DAG Graph Visualization
""""""""""""""""""""""""""""""""

The Graph view now supports visualizing the full chain of asset and task dependencies, including assets consumed or
produced across DAG boundaries. This allows users to inspect upstream and downstream lineage in a unified view, making
it easier to trace data flows, debug triggering behavior, and understand conditional dependencies between assets and
tasks.

DAG Code View
"""""""""""""

The "Code" tab now displays the exact DAG source as parsed by the scheduler for the selected DAG version. This allows
users to inspect the precise code that was executed, even for historical runs, and helps debug issues related to
versioned DAG changes.

Improved Task Log Access
"""""""""""""""""""""""""

Task log access has been streamlined across views. Logs are now easier to access from both the Grid and Task Instance
pages, with cleaner formatting and reduced visual noise.

Enhanced Asset and Backfill Views
""""""""""""""""""""""""""""""""""

New UI components support asset-centric DAGs and backfill workflows:

  • Asset definitions are now visible from the DAG details page, allowing users to inspect upstream and downstream asset relationships.
  • Backfills can be triggered and monitored directly from the UI, including support for scheduler-managed backfills introduced in Airflow 3.0.

These improvements make Airflow more accessible to operators, data engineers, and stakeholders working across both
time-based and event-driven workflows.

Deprecations & Removals
^^^^^^^^^^^^^^^^^^^^^^^^

A number of deprecated features, modules, and interfaces have been removed in Airflow 3.0, completing long-standing
migrations and cleanups.

Users are encouraged to review the following removals to ensure compatibility:

  • SubDag support has been removed entirely, including the ``SubDagOperator`` and the related CLI and API interfaces. TaskGroups are now the recommended alternative for nested DAG structures.

  • SLAs have been removed: The legacy SLA feature, including SLA callbacks and metrics, has been removed. A more flexible replacement mechanism, ``DeadlineAlerts``, is planned for a future version of Airflow. Users who relied on SLA-based notifications should consider implementing custom alerting using task-level success/failure hooks or external monitoring integrations.

  • Pickling support has been removed: All legacy features related to DAG pickling have been fully removed. This includes the ``PickleDag`` CLI/API, as well as implicit behaviors around ``store_serialized_dags = False``. DAGs must now be serialized using the JSON-based serialization system. Ensure any custom Python objects used in DAGs are JSON-serializable.

  • Context parameter cleanup: Several previously available context variables have been removed from the task execution context, including ``conf``, ``execution_date``, and ``dag_run.external_trigger``. These values are either no longer applicable or have been renamed (e.g., use ``dag_run.logical_date`` instead of ``execution_date``). DAG authors should ensure that templated fields and Python callables do not reference these deprecated keys.

  • Deprecated core imports have been fully removed. Any use of ``airflow.operators.*``, ``airflow.hooks.*``, or similar legacy import paths should be updated to import from their respective providers.

  • Configuration cleanup: Several legacy config options have been removed, including:

    • ``scheduler.allow_trigger_in_future``: DAG runs can no longer be triggered with a future logical date. Use ``logical_date=None`` instead.
    • ``scheduler.use_job_schedule`` and ``scheduler.use_local_tz`` have also been removed. These options were deprecated and no longer had any effect.
  • Deprecated utility methods such as those in ``airflow.utils.helpers``, ``airflow.utils.process_utils``, and ``airflow.utils.timezone`` have been removed. Equivalent functionality can now be found in the standard Python library or Airflow provider modules.

  • Removal of deprecated CLI flags and behavior: Several CLI entrypoints and arguments that were marked for removal in earlier versions have been cleaned up.

To assist with the upgrade, tools like ``ruff`` (e.g., rule ``AIR302``) and ``airflow config lint`` can help identify
obsolete imports and configuration keys. These utilities are recommended for locating and resolving common
incompatibilities during migration. Please see the :doc:`Upgrade Guide <installation/upgrading_to_airflow3>` for more
information.
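For the context cleanup in particular, task callables that read ``execution_date`` need a small rewrite. A sketch of the new pattern follows, with a stand-in ``DagRun`` class for illustration only; in a real task the object comes from Airflow's runtime context:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class FakeDagRun:
    """Stand-in for Airflow's DagRun, for illustration only."""
    logical_date: Optional[datetime]


def partition_suffix(context: dict) -> str:
    """Airflow 2 code would read context["execution_date"]; in
    Airflow 3 the value lives on the DagRun and may be None for
    manually or asset-triggered runs, so handle that case."""
    logical_date = context["dag_run"].logical_date
    if logical_date is None:
        return "adhoc"
    return logical_date.strftime("%Y-%m-%d")
```

The explicit ``None`` branch is the important part: code that assumed a timestamp always exists will now fail on event-driven runs.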

Summary of Removed Features
"""""""""""""""""""""""""""

The following table summarizes user-facing features removed in 3.0 and their recommended replacements. Not all of these
are called out individually above.

+-------------------------------------------+----------------------------------------------------------+
| Feature                                   | Replacement / Notes                                      |
+===========================================+==========================================================+
| ``SubDagOperator`` / SubDAGs              | Use TaskGroups                                           |
+-------------------------------------------+----------------------------------------------------------+
| SLA callbacks / metrics                   | Deadline Alerts (planned post-3.0)                       |
+-------------------------------------------+----------------------------------------------------------+
| DAG Pickling                              | Use JSON serialization; pickling is no longer supported  |
+-------------------------------------------+----------------------------------------------------------+
| Xcom Pickling                             | Use custom Xcom backend; pickling is no longer supported |
+-------------------------------------------+----------------------------------------------------------+
| ``execution_date`` context var            | Use ``dag_run.logical_date``                             |
+-------------------------------------------+----------------------------------------------------------+
| ``conf`` and ``dag_run.external_trigger`` | Removed from context; use DAG params or ``dag_run`` APIs |
+-------------------------------------------+----------------------------------------------------------+
| Core ``EmailOperator``                    | Use ``EmailOperator`` from the ``smtp`` provider         |
+-------------------------------------------+----------------------------------------------------------+
| ``none_failed_or_skipped`` rule           | Use ``none_failed_min_one_success``                      |
+-------------------------------------------+----------------------------------------------------------+
| ``dummy`` trigger rule                    | Use ``always``                                           |
+-------------------------------------------+----------------------------------------------------------+
| ``fail_stop`` argument                    | Use ``fail_fast``                                        |
+-------------------------------------------+----------------------------------------------------------+
| ``store_serialized_dags=False``           | DAGs are always serialized; config has no effect         |
+-------------------------------------------+----------------------------------------------------------+
| Deprecated core imports                   | Import from the appropriate provider package             |
+-------------------------------------------+----------------------------------------------------------+
| ``SequentialExecutor`` & ``DebugExecutor``| Use ``LocalExecutor`` for testing                        |
+-------------------------------------------+----------------------------------------------------------+
| ``.airflowignore`` regex                  | Uses glob syntax by default                              |
+-------------------------------------------+----------------------------------------------------------+
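Several renames in the table are one-for-one string substitutions in DAG source. A naive sketch of such a rewrite is shown below; the mapping mirrors the table, and real migrations should prefer ``ruff``'s AIR rules, which understand Python syntax rather than raw text:

```python
# Old name -> Airflow 3 replacement, taken from the table above.
RENAMES = {
    "none_failed_or_skipped": "none_failed_min_one_success",
    "fail_stop": "fail_fast",
    'trigger_rule="dummy"': 'trigger_rule="always"',
}


def rewrite_dag_source(source: str) -> str:
    """Apply the textual renames. Purely illustrative: plain string
    replacement can misfire on comments or unrelated identifiers."""
    for old, new in RENAMES.items():
        source = source.replace(old, new)
    return source
```

Run such a pass only as a first approximation, then review the diff and re-check the result with ``ruff check --select AIR30 --preview``.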


Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Never, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about these updates again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

@renovate-bot requested review from a team as code owners (April 22, 2025 17:45)
@trusted-contributions-gcf bot added the kokoro:force-run and owlbot:run labels (Apr 22, 2025)
@product-auto-label bot added the samples and api: composer labels (Apr 22, 2025)
@gemini-code-assist (bot) commented:
Hello @renovate-bot, I'm Gemini Code Assist [1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

Summary of Changes

This pull request updates the `apache-airflow` dependency to version 3.0.0 in several `requirements.txt` files and the `setup.py` file for the `dag_test_utils` package. The PR is generated by renovate-bot, and the description includes release notes for Airflow 3.0.0, highlighting significant changes such as a service-oriented architecture, a stable DAG authoring interface, and a modernized React UI.

Highlights

  • Dependency Update: The core change is updating the `apache-airflow` dependency to version 3.0.0 across multiple project files.
  • Version Constraints: The `dag_test_utils/setup.py` file updates the version constraint for `apache-airflow[google]` to `>=3.0.0, <3.1.0`.
  • Release Notes: The PR description includes detailed release notes for Airflow 3.0.0, outlining major changes and migration considerations.

Changelog

  • composer/2022_airflow_summit/requirements.txt
    • Updated `apache-airflow` from version 2.6.3 to 3.0.0.
  • composer/blog/gcp-tech-blog/data-orchestration-with-composer/requirements.txt
    • Updated `apache-airflow[google]` from version 2.6.3 to 3.0.0.
  • composer/blog/gcp-tech-blog/unit-test-dags-cloud-build/requirements.txt
    • Updated `apache-airflow[gcp]` from version 1.10.15 to 3.0.0.
  • composer/cicd_sample/requirements.txt
    • Updated `apache-airflow` from version 2.6.3 to 3.0.0.
  • composer/dag_test_utils/setup.py
    • Updated `apache-airflow[google]` version constraint from `>= 2.0.0, < 3.0.0` to `>=3.0.0, <3.1.0`.
  • composer/workflows/requirements.txt
    • Updated `apache-airflow[amazon,apache.beam,cncf.kubernetes,google,microsoft.azure,openlineage,postgres]` from version 2.9.2 to 3.0.0.


A DAG's new version,
Airflow Three takes to the sky,
Tasks now soar higher.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@gemini-code-assist (bot) left a review:

Code Review

This PR updates the dependency `apache-airflow` to v3. The changes primarily involve updating the version numbers in the `requirements.txt` files and `setup.py`. The upgrade to Airflow 3.0 is a significant one, as detailed in the release notes, and it's important to ensure compatibility and test the changes thoroughly.

Summary of Findings

  • Untested code after Airflow upgrade: The comment in `composer/blog/gcp-tech-blog/unit-test-dags-cloud-build/requirements.txt` indicates that updates beyond Airflow 1.10.15 are untested with the existing code. Upgrading to 3.0.0 requires careful consideration and testing to ensure compatibility.

Merge Readiness

The PR updates the Airflow dependency to v3. While the changes themselves are straightforward, the potential impact of upgrading to a new major version requires thorough testing. Given the comment about untested code in one of the files, I recommend that the author ensures that all code is compatible with Airflow 3.0 before merging. I am unable to approve this pull request, and users should have others review and approve this code before merging.

@kokoro-team removed the kokoro:force-run label (Apr 22, 2025)
@renovate-bot force-pushed the renovate/apache-airflow-3.x branch from b12201c to 1dab70c (May 14, 2025 06:04)
@trusted-contributions-gcf bot added the kokoro:force-run label (May 14, 2025)
@kokoro-team removed the kokoro:force-run label (May 14, 2025)
Reviewers: gemini-code-assist[bot] (requested changes). At least 1 approving review is required to merge this pull request.
Assignees: filipknapik
Labels: api: composer (Issues related to the Cloud Composer API), owlbot:run (Add this label to trigger the Owlbot post processor), samples (Issues that are directly related to samples)
Projects: none yet
Milestone: no milestone
3 participants: renovate-bot, kokoro-team, filipknapik
