
Command Line Interface and Environment Variables Reference

Command Line Interface

Airflow has a very rich command line interface that allows for many types of operation on a DAG, starting services, and supporting development and testing.

Note

For more information on using the CLI, see Using the Command Line Interface

Providers that implement executors might contribute additional commands to the CLI. Here are the commands contributed by the community providers:

Usage: airflow [-h] GROUP_OR_COMMAND ...

Positional Arguments

GROUP_OR_COMMAND

Possible choices: api-server, assets, backfill, cheat-sheet, config, connections, dag-processor, dags, db, info, jobs, kerberos, plugins, pools, providers, rotate-fernet-key, scheduler, standalone, tasks, triggerer, variables, version

Sub-commands

api-server

Start an Airflow API server instance

airflow api-server [-h] [-A ACCESS_LOGFILE] [--apps APPS] [-D] [-d] [-H HOST] [-l LOG_FILE] [--pid [PID]] [-p PORT] [--proxy-headers] [--ssl-cert SSL_CERT] [--ssl-key SSL_KEY] [--stderr STDERR] [--stdout STDOUT] [-t WORKER_TIMEOUT] [-w WORKERS]
Named Arguments
-A, --access-logfile

The logfile to store the access log. Use ‘-’ to print to stdout

Default: “-”

--apps

Applications to run (comma-separated). Default is all. Options: core, execution, all

Default: “all”

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-d, --dev

Start FastAPI in development mode

Default: False

-H, --host

Set the host on which to run the API server

Default: “0.0.0.0”

-l, --log-file

Location of the log file

--pid

PID file location

-p, --port

The port on which to run the API server

Default: 8080

--proxy-headers

Enable X-Forwarded-Proto, X-Forwarded-For, X-Forwarded-Port to populate remote address info.

Default: False

--ssl-cert

Path to the SSL certificate for the webserver

Default: “”

--ssl-key

Path to the key to use with the SSL certificate

Default: “”

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-t, --worker-timeout

The timeout for waiting on API server workers

Default: 120

-w, --workers

Number of workers to run on the API server

Default: 4
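
For example, to start the API server on port 8080 with four workers in the foreground (the values shown are illustrative):

airflow api-server --port 8080 --workers 4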

assets

Manage assets

airflow assets [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: details, list, materialize

Sub-commands
details

Show asset details

airflow assets details [-h] [--alias] [--name NAME] [-o table,json,yaml,plain] [--uri URI] [-v]
Named Arguments
--alias

Show asset alias

Default: False

--name

Asset name

Default: “”

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

--uri

Asset URI

Default: “”

-v, --verbose

Make logging output more verbose

Default: False

list

List assets

airflow assets list [-h] [--alias] [--columns COLUMNS] [-o table,json,yaml,plain] [-v]
Named Arguments
--alias

Show asset alias

Default: False

--columns

List of columns to render. (default: [‘name’, ‘uri’, ‘group’, ‘extra’])

Default: (‘name’, ‘uri’, ‘group’, ‘extra’)

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

materialize

Materialize an asset

airflow assets materialize [-h] [--name NAME] [-o table,json,yaml,plain] [--uri URI] [-v]
Named Arguments
--name

Asset name

Default: “”

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

--uri

Asset URI

Default: “”

-v, --verbose

Make logging output more verbose

Default: False

backfill

Manage backfills

airflow backfill [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: create

Sub-commands
create

Run subsections of a DAG for a specified date range.

airflow backfill create [-h] --dag-id DAG_ID [--dag-run-conf DAG_RUN_CONF] [--dry-run] --from-date FROM_DATE [--max-active-runs MAX_ACTIVE_RUNS] [--reprocess-behavior {none,completed,failed}] [--run-backwards] --to-date TO_DATE
Named Arguments
--dag-id

The dag to backfill.

--dag-run-conf

JSON dag run configuration.

--dry-run

Perform a dry run

Default: False

--from-date

Earliest logical date to backfill.

--max-active-runs

Max active runs for this backfill.

--reprocess-behavior

Possible choices: none, completed, failed

When a run exists for the logical date, controls whether new runs will be created for the date. Default is none.

--run-backwards

If set, the backfill will run tasks from the most recent logical date first. Not supported if there are tasks that depend_on_past.

Default: False

--to-date

Latest logical date to backfill.
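
For example, a dry run of a backfill over January 2025 for a hypothetical DAG id example_dag might look like:

airflow backfill create --dag-id example_dag --from-date 2025-01-01 --to-date 2025-01-31 --dry-run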

cheat-sheet

Display cheat sheet

airflow cheat-sheet [-h] [-v]
Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

config

View configuration

airflow config [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: get-value, lint, list, update

Sub-commands
get-value

Print the value of the configuration

airflow config get-value [-h] [-v] section option
Positional Arguments
section

The section name

option

The option name

Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

lint

Lint options for the configuration changes while migrating from Airflow 2.x to Airflow 3.0

airflow config lint [-h] [--ignore-option IGNORE_OPTION] [--ignore-section IGNORE_SECTION] [--option OPTION] [--section SECTION] [-v]
Named Arguments
--ignore-option

The option name(s) to ignore when linting the airflow config.

--ignore-section

The section name(s) to ignore when linting the airflow config.

--option

The option name(s) to lint in the airflow config.

--section

The section name(s) to lint in the airflow config.

-v, --verbose

Make logging output more verbose

Default: False

list

List options for the configuration

airflow config list [-h] [--color {auto,on,off}] [-c] [-a] [-p] [-d] [-V] [-e] [-s] [--section SECTION] [-v]
Named Arguments
--color

Possible choices: auto, on, off

Do emit colored output (default: auto)

Default: “auto”

-c, --comment-out-everything

Comment out all configuration options. Useful as starting point for new installation

Default: False

-a, --defaults

Show only defaults; do not include local configuration, sources, descriptions, examples, or variables. Comment out everything.

Default: False

-p, --exclude-providers

Exclude provider configuration (they are included by default)

Default: False

-d, --include-descriptions

Show descriptions for the configuration variables

Default: False

-V, --include-env-vars

Show environment variable for each option

Default: False

-e, --include-examples

Show examples for the configuration variables

Default: False

-s, --include-sources

Show source of the configuration variable

Default: False

--section

The section name

-v, --verbose

Make logging output more verbose

Default: False

update

Update options for the configuration changes while migrating from Airflow 2.x to Airflow 3.0

airflow config update [-h] [--all-recommendations] [--fix] [--ignore-option IGNORE_OPTION] [--ignore-section IGNORE_SECTION] [--option OPTION] [--section SECTION] [-v]
Named Arguments
--all-recommendations

Include non-breaking (recommended) changes along with breaking ones. (Also use with --fix)

Default: False

--fix

Automatically apply the configuration changes instead of performing a dry run. (Default: dry-run mode)

Default: False

--ignore-option

The option name(s) to ignore when updating the airflow config.

--ignore-section

The section name(s) to ignore when updating the airflow config.

--option

The option name(s) to update in the airflow config.

--section

The section name(s) to update in the airflow config.

-v, --verbose

Make logging output more verbose

Default: False

connections

Manage connections

airflow connections [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: add, create-default-connections, delete, export, get, import, list, test

Sub-commands
add

Add a connection

airflow connections add [-h] [--conn-description CONN_DESCRIPTION] [--conn-extra CONN_EXTRA] [--conn-host CONN_HOST] [--conn-json CONN_JSON] [--conn-login CONN_LOGIN] [--conn-password CONN_PASSWORD] [--conn-port CONN_PORT] [--conn-schema CONN_SCHEMA] [--conn-type CONN_TYPE] [--conn-uri CONN_URI] conn_id
Positional Arguments
conn_id

Connection id, required to get/add/delete/test a connection

Named Arguments
--conn-description

Connection description, optional when adding a connection

--conn-extra

Connection Extra field, optional when adding a connection

--conn-host

Connection host, optional when adding a connection

--conn-json

Connection JSON, required to add a connection using JSON representation

--conn-login

Connection login, optional when adding a connection

--conn-password

Connection password, optional when adding a connection

--conn-port

Connection port, optional when adding a connection

--conn-schema

Connection schema, optional when adding a connection

--conn-type

Connection type, required to add a connection without conn_uri

--conn-uri

Connection URI, required to add a connection without conn_type
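
For example, a connection can be added either from a single URI or from individual fields; the connection id, host, and credentials below are illustrative:

airflow connections add example_pg --conn-uri 'postgresql://user:pass@localhost:5432/mydb'

airflow connections add example_pg --conn-type postgres --conn-host localhost --conn-port 5432 --conn-login user --conn-password pass --conn-schema mydb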

create-default-connections

Creates all the default connections from all the providers

airflow connections create-default-connections [-h] [-v]
Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

delete

Delete a connection

airflow connections delete [-h] [--color {auto,on,off}] [-v] conn_id
Positional Arguments
conn_id

Connection id, required to get/add/delete/test a connection

Named Arguments
--color

Possible choices: auto, on, off

Do emit colored output (default: auto)

Default: “auto”

-v, --verbose

Make logging output more verbose

Default: False

export

All connections can be exported to STDOUT using the following command:

airflow connections export -

The file format is determined by the provided file extension. E.g., the following command will export the connections in JSON format:

airflow connections export /tmp/connections.json

The --file-format parameter can be used to control the file format. E.g., the default format is JSON in STDOUT mode, which can be overridden using:

airflow connections export - --file-format yaml

The --file-format parameter can also be used for files, for example:

airflow connections export /tmp/connections --file-format json

When exporting in env file format, you control whether URI format or JSON format is used to serialize the connection by passing uri or json with the option --serialization-format.

airflow connections export [-h] [--file-format {json,yaml,env}] [--format {json,yaml,env}] [--serialization-format {json,uri}] [-v] file
Positional Arguments
file

Output file path for exporting the connections

Named Arguments
--file-format

Possible choices: json, yaml, env

File format for the export

--format

Possible choices: json, yaml, env

Deprecated; use --file-format instead. File format to use for the export.

--serialization-format

Possible choices: json, uri

When exporting in .env format, defines how connections should be serialized. Default is uri.

-v, --verbose

Make logging output more verbose

Default: False

get

Get a connection

airflow connections get [-h] [--color {auto,on,off}] [-o table,json,yaml,plain] [-v] conn_id
Positional Arguments
conn_id

Connection id, required to get/add/delete/test a connection

Named Arguments
--color

Possible choices: auto, on, off

Do emit colored output (default: auto)

Default: “auto”

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

import

Connections can be imported from the output of the export command. The filetype must be json, yaml, or env and will be automatically inferred.

airflow connections import [-h] [--overwrite] [-v] file
Positional Arguments
file

Import connections from a file

Named Arguments
--overwrite

Overwrite existing entries if a conflict occurs

Default: False

-v, --verbose

Make logging output more verbose

Default: False

list

List connections

airflow connections list [-h] [--conn-id CONN_ID] [-o table,json,yaml,plain] [-v]
Named Arguments
--conn-id

If passed, only items with the specified connection ID will be displayed

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

test

Test a connection

airflow connections test [-h] [-v] conn_id
Positional Arguments
conn_id

Connection id, required to get/add/delete/test a connection

Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

dag-processor

Start a dag processor instance

airflow dag-processor [-h] [-B BUNDLE_NAME] [-D] [-l LOG_FILE] [-n NUM_RUNS] [--pid [PID]] [--stderr STDERR] [--stdout STDOUT] [-v]
Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-l, --log-file

Location of the log file

-n, --num-runs

Set the number of runs to execute before exiting

Default: -1

--pid

PID file location

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-v, --verbose

Make logging output more verbose

Default: False

dags

Manage DAGs

airflow dags [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: delete, details, list, list-import-errors, list-jobs, list-runs, next-execution, pause, report, reserialize, show, show-dependencies, state, test, trigger, unpause

Sub-commands
delete

Delete all DB records related to the specified DAG

airflow dags delete [-h] [-v] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm. Use with care!

Default: False

details

Get DAG details given a DAG id

airflow dags details [-h] [-o table,json,yaml,plain] [-v] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

list

List all the DAGs

airflow dags list [-h] [-B BUNDLE_NAME] [--columns COLUMNS] [-l] [-o table,json,yaml,plain] [-v]
Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

--columns

List of columns to render. (default: [‘dag_id’, ‘fileloc’, ‘owner’, ‘is_paused’])

Default: (‘dag_id’, ‘fileloc’, ‘owners’, ‘is_paused’, ‘bundle_name’, ‘bundle_version’)

-l, --local

Shows local parsed DAGs and their import errors, ignores content serialized in DB

Default: False

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

list-import-errors

List all the DAGs that have import errors

airflow dags list-import-errors [-h] [-B BUNDLE_NAME] [-l] [-o table,json,yaml,plain] [-v]
Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

-l, --local

Shows local parsed DAGs and their import errors, ignores content serialized in DB

Default: False

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

list-jobs

List the jobs

airflow dags list-jobs [-h] [-d DAG_ID] [--limit LIMIT] [-o table,json,yaml,plain] [--state running,success,restarting,failed] [-v]
Named Arguments
-d, --dag-id

The id of the dag

--limit

Return a limited number of records

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

--state

Possible choices: running, success, restarting, failed

Only list the jobs corresponding to the state

-v, --verbose

Make logging output more verbose

Default: False

list-runs

List DAG runs given a DAG id. If the state option is given, it will only search for the DAG runs with the given state. If the no_backfill option is given, it will filter out all backfill DAG runs for the given dag id. If start_date is given, it will filter out all the DAG runs that were executed before this date. If end_date is given, it will filter out all the DAG runs that were executed after this date.

airflow dags list-runs [-h] [-e END_DATE] [--no-backfill] [-o table,json,yaml,plain] [-s START_DATE] [--state queued,running,success,failed] [-v] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-e, --end-date

Override end_date YYYY-MM-DD

--no-backfill

Filter out all backfill DAG runs for the given dag id

Default: False

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-s, --start-date

Override start_date YYYY-MM-DD

--state

Possible choices: queued, running, success, failed

Only list the DAG runs corresponding to the state

-v, --verbose

Make logging output more verbose

Default: False

next-execution

Get the next logical datetimes of a DAG. It returns one execution unless the num-executions option is given.

airflow dags next-execution [-h] [-n NUM_EXECUTIONS] [-v] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-n, --num-executions

The number of next logical date times to show

Default: 1

-v, --verbose

Make logging output more verbose

Default: False

pause

Pause one or more DAGs. This command halts the execution of the specified DAGs, disabling further task scheduling. Use --treat-dag-id-as-regex to target multiple DAGs by treating the --dag-id as a regex pattern.

airflow dags pause [-h] [-o table,json,yaml,plain] [--treat-dag-id-as-regex] [-v] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

--treat-dag-id-as-regex

If set, dag_id will be treated as a regex instead of an exact string

Default: False

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm. Use with care!

Default: False
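
For example, to pause every DAG whose id matches a hypothetical etl_.* pattern without being prompted:

airflow dags pause --treat-dag-id-as-regex -y 'etl_.*'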

report

Show DagBag loading report

airflow dags report [-h] [-B BUNDLE_NAME] [-o table,json,yaml,plain] [-v]
Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

reserialize

Reserialize DAGs in the metadata DB. This can be particularly useful if your serialized DAGs become out of sync with the Airflow version you are using.

airflow dags reserialize [-h] [-B BUNDLE_NAME] [-v]
Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

-v, --verbose

Make logging output more verbose

Default: False

show

The --imgcat option only works in iTerm.

For more information, see: https://www.iterm2.com/documentation-images.html

The --save option saves the result to the indicated file.

The file format is determined by the file extension. For more information about supported formats, see: https://www.graphviz.org/doc/info/output.html

If you want to create a PNG file then you should execute the following command: airflow dags show <DAG_ID> --save output.png

If you want to create a DOT file then you should execute the following command: airflow dags show <DAG_ID> --save output.dot

airflow dags show [-h] [--imgcat] [-s SAVE] [-v] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
--imgcat

Displays graph using the imgcat tool.

Default: False

-s, --save

Saves the result to the indicated file.

-v, --verbose

Make logging output more verbose

Default: False

show-dependencies

The --imgcat option only works in iTerm.

For more information, see: https://www.iterm2.com/documentation-images.html

The --save option saves the result to the indicated file.

The file format is determined by the file extension. For more information about supported formats, see: https://www.graphviz.org/doc/info/output.html

If you want to create a PNG file then you should execute the following command: airflow dags show-dependencies --save output.png

If you want to create a DOT file then you should execute the following command: airflow dags show-dependencies --save output.dot

airflow dags show-dependencies [-h] [--imgcat] [-s SAVE] [-v]
Named Arguments
--imgcat

Displays graph using the imgcat tool.

Default: False

-s, --save

Saves the result to the indicated file.

-v, --verbose

Make logging output more verbose

Default: False

state

Get the status of a dag run

airflow dags state [-h] [-v] dag_id logical_date_or_run_id
Positional Arguments
dag_id

The id of the dag

logical_date_or_run_id

The logical date of the DAG or run_id of the DAGRun

Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

test

Execute one single DagRun for a given DAG and logical date.

You can test a DAG in three ways:

1. Using the default bundle:

airflow dags test <DAG_ID>

2. Using a specific bundle if multiple DAG bundles are configured:

airflow dags test <DAG_ID> --bundle-name <BUNDLE_NAME> (or -B <BUNDLE_NAME>)

3. Using a specific DAG file:

airflow dags test <DAG_ID> --dagfile-path <PATH> (or -f <PATH>)

The --imgcat-dagrun option only works in iTerm.

For more information, see: https://www.iterm2.com/documentation-images.html

If --save-dagrun is used, then, after completing the backfill, it saves the diagram for the current DAG Run to the indicated file. The file format is determined by the file extension. For more information about supported formats, see: https://www.graphviz.org/doc/info/output.html

If you want to create a PNG file then you should execute the following command: airflow dags test <DAG_ID> <LOGICAL_DATE> --save-dagrun output.png

If you want to create a DOT file then you should execute the following command: airflow dags test <DAG_ID> <LOGICAL_DATE> --save-dagrun output.dot

airflow dags test [-h] [-B BUNDLE_NAME] [-c CONF] [-f DAGFILE_PATH] [--imgcat-dagrun] [--mark-success-pattern MARK_SUCCESS_PATTERN] [--save-dagrun SAVE_DAGRUN] [--show-dagrun] [--use-executor] [-v] dag_id [logical_date]
Positional Arguments
dag_id

The id of the dag

logical_date

The logical date of the DAG (optional)

Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

-f, --dagfile-path

Path to the dag file. Can be absolute or relative to current directory

--imgcat-dagrun

After completing the dag run, prints a diagram on the screen for the current DAG Run using the imgcat tool.

Default: False

--mark-success-pattern

Don’t run task_ids matching the regex <MARK_SUCCESS_PATTERN>, mark them as successful instead. Can be used to skip e.g. dependency check sensors or cleanup steps in local testing.

--save-dagrun

After completing the backfill, saves the diagram for current DAG Run to the indicated file.

--show-dagrun

After completing the backfill, shows the diagram for current DAG Run.

The diagram is in DOT language

Default: False

--use-executor

Use an executor to test the DAG. By default it runs the DAG without an executor. If set, it uses the executor configured in the environment.

Default: False

-v, --verbose

Make logging output more verbose

Default: False

trigger

Trigger a new DAG run. If the DAG is paused, the dagrun state will remain queued, and the tasks won’t run.

airflow dags trigger [-h] [-c CONF] [-l LOGICAL_DATE] [--no-replace-microseconds] [-o table,json,yaml,plain] [-r RUN_ID] [-v] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-c, --conf

JSON string that gets pickled into the DagRun’s conf attribute

-l, --logical-date

The logical date of the DAG

--no-replace-microseconds

Whether microseconds should be zeroed

Default: True

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-r, --run-id

Helps to identify this run

-v, --verbose

Make logging output more verbose

Default: False
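
For example, to trigger a hypothetical DAG example_dag with a custom run id and a JSON conf:

airflow dags trigger --run-id manual_test_1 --conf '{"param": "value"}' example_dag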

unpause

Resume one or more DAGs. This command restores the execution of the specified DAGs, enabling further task scheduling. Use --treat-dag-id-as-regex to target multiple DAGs by treating the --dag-id as a regex pattern.

airflow dags unpause [-h] [-o table,json,yaml,plain] [--treat-dag-id-as-regex] [-v] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

--treat-dag-id-as-regex

If set, dag_id will be treated as a regex instead of an exact string

Default: False

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm. Use with care!

Default: False

db

Database operations

airflow db [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: check, check-migrations, clean, downgrade, drop-archived, export-archived, migrate, reset, shell

Sub-commands
check

Check if the database can be reached

airflow db check [-h] [--retry RETRY] [--retry-delay RETRY_DELAY] [-v]
Named Arguments
--retry

Retry database check upon failure

Default: 0

--retry-delay

Wait time between retries in seconds

Default: 1

-v, --verbose

Make logging output more verbose

Default: False

check-migrations

Check if migrations have finished (or continually check until timeout)

airflow db check-migrations [-h] [-t MIGRATION_WAIT_TIMEOUT] [-v]
Named Arguments
-t, --migration-wait-timeout

Timeout to wait for the db to migrate

Default: 60

-v, --verbose

Make logging output more verbose

Default: False

clean

Purge old records in metastore tables

airflow db clean [-h] --clean-before-timestamp CLEAN_BEFORE_TIMESTAMP [--dry-run] [--skip-archive] [-t TABLES] [-v] [-y]
Named Arguments
--clean-before-timestamp

The date or timestamp before which data should be purged. If no timezone info is supplied then dates are assumed to be in the Airflow default timezone. Example: ‘2022-01-01 00:00:00+01:00’

--dry-run

Perform a dry run

Default: False

--skip-archive

Don’t preserve purged records in an archive table.

Default: False

-t, --tables

Table names to perform maintenance on (use comma-separated list). Options: [‘_xcom_archive’, ‘asset_event’, ‘callback_request’, ‘celery_taskmeta’, ‘celery_tasksetmeta’, ‘dag’, ‘dag_run’, ‘dag_version’, ‘deadline’, ‘import_error’, ‘job’, ‘log’, ‘sla_miss’, ‘task_instance’, ‘task_instance_history’, ‘task_reschedule’, ‘trigger’, ‘xcom’]

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm. Use with care!

Default: False
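
For example, a dry run that would purge dag_run and task_instance records older than a given timestamp (the timestamp is illustrative; table names come from the list above):

airflow db clean --clean-before-timestamp '2025-01-01 00:00:00+00:00' --tables dag_run,task_instance --dry-run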

downgrade

Downgrade the schema of the metadata database. You must provide either --to-revision or --to-version. To print but not execute commands, use the option --show-sql-only. If using options --from-revision or --from-version, you must also use --show-sql-only, because if actually running migrations, we should only migrate from the current Alembic revision.

airflow db downgrade [-h] [--from-revision FROM_REVISION] [--from-version FROM_VERSION] [-s] [-r TO_REVISION] [-n TO_VERSION] [-v] [-y]
Named Arguments
--from-revision

(Optional) If generating sql, may supply a ‘from’ Alembic revision

--from-version

(Optional) If generating sql, may supply a ‘from’ version

-s, --show-sql-only

Don’t actually run migrations; just print out sql scripts for offline migration. Required if using either --from-revision or --from-version.

Default: False

-r, --to-revision

The Alembic revision to downgrade to. Note: must provide either --to-revision or --to-version.

-n, --to-version

(Optional) If provided, only run migrations up to this version.

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm. Use with care!

Default: False

drop-archived

Drop archived tables created through the db clean command

airflow db drop-archived [-h] [-t TABLES] [-y]
Named Arguments
-t, --tables

Table names to perform maintenance on (use comma-separated list). Options: [‘_xcom_archive’, ‘asset_event’, ‘callback_request’, ‘celery_taskmeta’, ‘celery_tasksetmeta’, ‘dag’, ‘dag_run’, ‘dag_version’, ‘deadline’, ‘import_error’, ‘job’, ‘log’, ‘sla_miss’, ‘task_instance’, ‘task_instance_history’, ‘task_reschedule’, ‘trigger’, ‘xcom’]

-y, --yes

Do not prompt to confirm. Use with care!

Default: False

export-archived

Export archived data from the archive tables

airflow db export-archived [-h] [--drop-archives] [--export-format {csv}] --output-path DIRPATH [-t TABLES] [-y]
Named Arguments
--drop-archives

Drop the archive tables after exporting. Use with caution.

Default: False

--export-format

Possible choices: csv

The file format to export the cleaned data

Default: “csv”

--output-path

The path to the output directory to export the cleaned data. This directory must exist.

-t, --tables

Table names to perform maintenance on (use comma-separated list). Options: [‘_xcom_archive’, ‘asset_event’, ‘callback_request’, ‘celery_taskmeta’, ‘celery_tasksetmeta’, ‘dag’, ‘dag_run’, ‘dag_version’, ‘deadline’, ‘import_error’, ‘job’, ‘log’, ‘sla_miss’, ‘task_instance’, ‘task_instance_history’, ‘task_reschedule’, ‘trigger’, ‘xcom’]

-y, --yes

Do not prompt to confirm. Use with care!

Default: False

migrate

Migrate the schema of the metadata database. Create the database if it does not exist. To print but not execute commands, use the option --show-sql-only. If using options --from-revision or --from-version, you must also use --show-sql-only, because if actually running migrations, we should only migrate from the current Alembic revision.

airflow db migrate [-h] [--from-revision FROM_REVISION] [--from-version FROM_VERSION] [-s] [-r TO_REVISION] [-n TO_VERSION] [-v]
Named Arguments
--from-revision

(Optional) If generating sql, may supply a ‘from’ Alembic revision

--from-version

(Optional) If generating sql, may supply a ‘from’ version

-s, --show-sql-only

Don’t actually run migrations; just print out sql scripts for offline migration. Required if using either --from-revision or --from-version.

Default: False

-r, --to-revision

(Optional) If provided, only run migrations up to and including this Alembic revision.

-n, --to-version

(Optional) The airflow version to upgrade to. Note: must provide either --to-revision or --to-version.

-v, --verbose

Make logging output more verbose

Default: False

reset

Burn down and rebuild the metadata database

airflow db reset [-h] [-s] [-v] [-y]
Named Arguments
-s, --skip-init

Only remove tables; do not perform db init.

Default: False

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm. Use with care!

Default: False

shell

Runs a shell to access the database

airflow db shell [-h] [-v]
Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

info

Show information about current Airflow and environment

airflow info [-h] [--anonymize] [--file-io] [-o table,json,yaml,plain] [-v]
Named Arguments
--anonymize

Minimize any personal identifiable information. Use it when sharing output with others.

Default: False

--file-io

Send output to the file.io service and return a link.

Default: False

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

jobs

Manage jobs

airflow jobs [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: check

Sub-commands
check

Checks if job(s) are still alive

airflow jobs check [-h] [--allow-multiple] [--hostname HOSTNAME] [--job-type {SchedulerJob,TriggererJob,DagProcessorJob}] [--limit LIMIT] [--local] [-v]
Named Arguments
--allow-multiple

If passed, this command will be successful even if multiple matching alive jobs are found.

Default: False

--hostname

The hostname of job(s) that will be checked.

--job-type

Possible choices: SchedulerJob, TriggererJob, DagProcessorJob

The type of job(s) that will be checked.

--limit

The number of recent jobs that will be checked. To disable the limit, set it to 0.

Default: 1

--local

If passed, this command will only show jobs from the local host (those with a hostname matching what hostname_callable returns).

Default: False

-v, --verbose

Make logging output more verbose

Default: False

Examples: To check if the local scheduler is still working properly, run:

$ airflow jobs check --job-type SchedulerJob --local

To check if any scheduler is running when you are using high availability, run:

$ airflow jobs check --job-type SchedulerJob --allow-multiple --limit 100

kerberos

Start a kerberos ticket renewer

airflow kerberos [-h] [-D] [-k [KEYTAB]] [-l LOG_FILE] [-o] [--pid [PID]] [--stderr STDERR] [--stdout STDOUT] [-v] [principal]
Positional Arguments
principal

kerberos principal

Named Arguments
-D, --daemon

Daemonize instead of running in the foreground

Default: False

-k, --keytab

keytab

Default: “airflow.keytab”

-l, --log-file

Location of the log file

-o, --one-time

Run airflow kerberos one time instead of forever

Default: False

--pid

PID file location

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-v, --verbose

Make logging output more verbose

Default: False

plugins

Dump information about loaded plugins

airflow plugins [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

pools

Manage pools

airflow pools [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: delete, export, get, import, list, set

Sub-commands
delete

Delete pool

airflow pools delete [-h] [-o table,json,yaml,plain] [-v] NAME
Positional Arguments
NAME

Pool name

Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

export

Export all pools

airflow pools export [-h] [-v] FILEPATH
Positional Arguments
FILEPATH

Export all pools to JSON file

Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

get

Get pool size

airflow pools get [-h] [-o table,json,yaml,plain] [-v] NAME
Positional Arguments
NAME

Pool name

Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

import

Import pools

airflow pools import [-h] [-v] FILEPATH
Positional Arguments
FILEPATH

Import pools from JSON file. Example format:

{"pool_1":{"slots":5,"description":"","include_deferred":true},"pool_2":{"slots":10,"description":"test","include_deferred":false}}
Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

list

List pools

airflow pools list [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

set

Configure pool

airflow pools set [-h] [--include-deferred] [-o table,json,yaml,plain] [-v] NAME slots description
Positional Arguments
NAME

Pool name

slots

Pool slots

description

Pool description

Named Arguments
--include-deferred

Include deferred tasks in calculations for Pool

Default: False

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

providers

Display providers

airflow providers [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: auth-managers, behaviours, configs, executors, get, hooks, lazy-loaded, links, list, logging, notifications, queues, secrets, triggers, widgets

Sub-commands
auth-managers

Get information about auth managers provided

airflow providers auth-managers [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

behaviours

Get information about registered connection types with custom behaviours

airflow providers behaviours [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

configs

Get information about provider configuration

airflow providers configs [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

executors

Get information about executors provided

airflow providers executors [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

get

Get detailed information about a provider

airflow providers get [-h] [--color {auto,on,off}] [-f] [-o table,json,yaml,plain] [-v] provider_name
Positional Arguments
provider_name

Provider name, required to get provider information

Named Arguments
--color

Possible choices: auto, on, off

Do emit colored output (default: auto)

Default: “auto”

-f, --full

Full information about the provider, including documentation information.

Default: False

-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

hooks

List registered provider hooks

airflow providers hooks [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

lazy-loaded

Checks that provider configuration is lazy loaded

airflow providers lazy-loaded [-h] [-v]
Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

links

List extra links registered by the providers

airflow providers links [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

list

List installed providers

airflow providers list [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

logging

Get information about task logging handlers provided

airflow providers logging [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

notifications

Get information about notifications provided

airflow providers notifications [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

queues

Get information about queues provided

airflow providers queues [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

secrets

Get information about secrets backends provided

airflow providers secrets [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

triggers

List registered provider triggers

airflow providers triggers [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

widgets

Get information about registered connection form widgets

airflow providers widgets [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

rotate-fernet-key

Rotate all encrypted connection credentials and variables; see https://airflow.apache.org/docs/apache-airflow/stable/howto/secure-connections.html#rotating-encryption-keys

airflow rotate-fernet-key [-h]

scheduler

Start a scheduler instance

airflow scheduler [-h] [-D] [-l LOG_FILE] [-n NUM_RUNS] [--pid [PID]] [-s] [--stderr STDERR] [--stdout STDOUT] [-v]
Named Arguments
-D, --daemon

Daemonize instead of running in the foreground

Default: False

-l, --log-file

Location of the log file

-n, --num-runs

Set the number of runs to execute before exiting

Default: -1

--pid

PID file location

-s, --skip-serve-logs

Don’t start the serve logs process along with the workers

Default: False

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-v, --verbose

Make logging output more verbose

Default: False

Signals:

  • SIGUSR2: Dump a snapshot of task state being tracked by the executor.

    Example:

pkill -f -USR2 "airflow scheduler"

standalone

Run an all-in-one copy of Airflow

airflow standalone [-h]

tasks

Manage tasks

airflow tasks [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: clear, failed-deps, list, render, state, states-for-dag-run, test

Sub-commands
clear

Clear a set of task instances, as if they never ran

airflow tasks clear [-h] [-B BUNDLE_NAME] [-R] [-d] [-e END_DATE] [-f] [-r] [-s START_DATE] [-t TASK_REGEX] [-u] [-v] [-y] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

-R, --dag-regex

Search dag_id as regex instead of exact string

Default: False

-d, --downstream

Include downstream tasks

Default: False

-e, --end-date

Override end_date YYYY-MM-DD

-f, --only-failed

Only failed jobs

Default: False

-r, --only-running

Only running jobs

Default: False

-s, --start-date

Override start_date YYYY-MM-DD

-t, --task-regex

The regex to filter specific task_ids (optional)

-u, --upstream

Include upstream tasks

Default: False

-v, --verbose

Make logging output more verbose

Default: False

-y, --yes

Do not prompt to confirm. Use with care!

Default: False

failed-deps

Returns the unmet dependencies for a task instance from the perspective of the scheduler. In other words, why a task instance doesn’t get scheduled and then queued by the scheduler, and then run by an executor.

airflow tasks failed-deps [-h] [-B BUNDLE_NAME] [--map-index MAP_INDEX] [-v] dag_id task_id logical_date_or_run_id
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

logical_date_or_run_id

The logical date of the DAG or run_id of the DAGRun

Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

--map-index

Mapped task index

Default: -1

-v, --verbose

Make logging output more verbose

Default: False

list

List the tasks within a DAG

airflow tasks list [-h] [-B BUNDLE_NAME] [-v] dag_id
Positional Arguments
dag_id

The id of the dag

Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

-v, --verbose

Make logging output more verbose

Default: False

render

Render a task instance’s template(s)

airflow tasks render [-h] [-B BUNDLE_NAME] [--map-index MAP_INDEX] [-v] dag_id task_id logical_date_or_run_id
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

logical_date_or_run_id

The logical date of the DAG or run_id of the DAGRun

Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

--map-index

Mapped task index

Default: -1

-v, --verbose

Make logging output more verbose

Default: False

state

Get the status of a task instance

airflow tasks state [-h] [-B BUNDLE_NAME] [--map-index MAP_INDEX] [-v] dag_id task_id logical_date_or_run_id
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

logical_date_or_run_id

The logical date of the DAG or run_id of the DAGRun

Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

--map-index

Mapped task index

Default: -1

-v, --verbose

Make logging output more verbose

Default: False

states-for-dag-run

Get the status of all task instances in a dag run

airflow tasks states-for-dag-run [-h] [-o table,json,yaml,plain] [-v] dag_id logical_date_or_run_id
Positional Arguments
dag_id

The id of the dag

logical_date_or_run_id

The logical date of the DAG or run_id of the DAGRun

Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

test

Test a task instance. This will run a task without checking for dependencies or recording its state in the database

airflow tasks test [-h] [-B BUNDLE_NAME] [--env-vars ENV_VARS] [--map-index MAP_INDEX] [-m] [-t TASK_PARAMS] [-v] dag_id task_id [logical_date_or_run_id]
Positional Arguments
dag_id

The id of the dag

task_id

The id of the task

logical_date_or_run_id

The logical date of the DAG or run_id of the DAGRun (optional)

Named Arguments
-B, --bundle-name

The name of the DAG bundle to use; may be provided more than once

--env-vars

Set env vars at both parse time and runtime for each entry supplied in a JSON dict

--map-index

Mapped task index

Default: -1

-m, --post-mortem

Open debugger on uncaught exception

Default: False

-t, --task-params

Sends a JSON params dict to the task

-v, --verbose

Make logging output more verbose

Default: False
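
For example, to test a single task of a hypothetical DAG for a given logical date, without recording state in the database:

airflow tasks test example_dag example_task 2025-01-01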

triggerer

Start a triggerer instance

airflow triggerer [-h] [--capacity CAPACITY] [-D] [-l LOG_FILE] [--pid [PID]] [-s] [--stderr STDERR] [--stdout STDOUT] [-v]
Named Arguments
--capacity

The maximum number of triggers that a Triggerer will run at one time.

-D, --daemon

Daemonize instead of running in the foreground

Default: False

-l, --log-file

Location of the log file

--pid

PID file location

-s, --skip-serve-logs

Don’t start the serve logs process along with the workers

Default: False

--stderr

Redirect stderr to this file

--stdout

Redirect stdout to this file

-v, --verbose

Make logging output more verbose

Default: False

variables

Manage variables

airflow variables [-h] COMMAND ...
Positional Arguments
COMMAND

Possible choices: delete, export, get, import, list, set

Sub-commands
delete

Delete variable

airflow variables delete [-h] [-v] key
Positional Arguments
key

Variable key

Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

export

All variables can be exported to STDOUT using the following command: airflow variables export -

airflow variables export [-h] [-v] file
Positional Arguments
file

Export all variables to JSON file

Named Arguments
-v, --verbose

Make logging output more verbose

Default: False

get

Get variable

airflow variables get [-h] [-d VAL] [-j] [-v] key
Positional Arguments
key

Variable key

Named Arguments
-d, --default

Default value returned if variable does not exist

-j, --json

Deserialize JSON variable

Default: False

-v, --verbose

Make logging output more verbose

Default: False

import

Import variables

airflow variables import [-h] [-a {overwrite,fail,skip}] [-v] file
Positional Arguments
file

Import variables from JSON file

Named Arguments
-a, --action-on-existing-key

Possible choices: overwrite, fail, skip

Action to take if we encounter a variable key that already exists.

Default: “overwrite”

-v, --verbose

Make logging output more verbose

Default: False

list

List variables

airflow variables list [-h] [-o table,json,yaml,plain] [-v]
Named Arguments
-o, --output

Possible choices: table, json, yaml, plain

Output format. Allowed values: json, yaml, plain, table (default: table)

Default: “table”

-v, --verbose

Make logging output more verbose

Default: False

set

Set variable

airflow variables set [-h] [--description DESCRIPTION] [-j] [-v] key VALUE
Positional Arguments
key

Variable key

VALUE

Variable value

Named Arguments
--description

Variable description, optional when setting a variable

-j, --json

Serialize JSON variable

Default: False

-v, --verbose

Make logging output more verbose

Default: False
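
For example, to store a JSON-serialized variable and read it back (key and value are illustrative):

airflow variables set -j my_config '{"retries": 3}'

airflow variables get -j my_config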

version

Show the version

airflow version [-h]

Environment Variables

AIRFLOW__{SECTION}__{KEY}

Sets options in the Airflow configuration. This takes priority over the value in the airflow.cfg file.

Replace the {SECTION} placeholder with any section and the {KEY} placeholder with any key in that specified section.

For example, if you want to set the dags_folder option in the [core] section, then you should set the AIRFLOW__CORE__DAGS_FOLDER environment variable.

For more information, see: Setting Configuration Options.
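
For example, a minimal shell sketch setting the dags_folder option mentioned above (the path is illustrative):

export AIRFLOW__CORE__DAGS_FOLDER=/opt/airflow/dags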

AIRFLOW__{SECTION}__{KEY}_CMD

For any specific key in a section in Airflow, execute the command the key is pointing to. The result of the command is used as the value of the AIRFLOW__{SECTION}__{KEY} environment variable.

This is only supported by the following config options (a usage sketch follows the list):

  • sql_alchemy_conn in the [database] section

  • fernet_key in the [core] section

  • broker_url in the [celery] section

  • flower_basic_auth in the [celery] section

  • result_backend in the [celery] section

  • password in the [atlas] section

  • smtp_password in the [smtp] section

  • secret_key in the [api] section
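
For example, a sketch that keeps the database connection string out of airflow.cfg by reading it from a file (the secrets path is hypothetical):

export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_CMD='cat /run/secrets/sql_alchemy_conn'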

AIRFLOW__{SECTION}__{KEY}_SECRET

For any specific key in a section in Airflow, retrieve the secret from the configured secrets backend. The returned value will be used as the value of the AIRFLOW__{SECTION}__{KEY} environment variable.

See Secrets Backends for more information on available secrets backends.

This form of environment variable configuration is only supported for the same subset of config options as AIRFLOW__{SECTION}__{KEY}_CMD.
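
For example, assuming a secrets backend is configured and stores the connection string under a key named sql_alchemy_conn (the key name here is hypothetical), a sketch would be:

export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_SECRET=sql_alchemy_conn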

AIRFLOW_CONFIG

The path to the Airflow configuration file.

AIRFLOW_CONN_{CONN_ID}

Defines a new connection with the name {CONN_ID} using the URI value.

For example, if you want to create a connection named PROXY_POSTGRES_TCP, you can create a key AIRFLOW_CONN_PROXY_POSTGRES_TCP with the connection URI as the value.

For more information, see: Storing connections in environment variables.
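
For example, a sketch defining the connection named above as a URI (host and credentials are illustrative):

export AIRFLOW_CONN_PROXY_POSTGRES_TCP='postgresql://user:password@proxy-host:5432/mydb'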

AIRFLOW_HOME

The root directory for the Airflow content. This is the default parent directory for Airflow assets such as dags and logs.

AIRFLOW_VAR_{KEY}

Defines an Airflow variable. Replace the {KEY} placeholder with the variable name.

For more information, see: Managing Variables.
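
For example, the following sketch defines a variable named my_key with an illustrative value:

export AIRFLOW_VAR_MY_KEY=my_value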
