Apache Airflow - OpenApi Client for Python
To facilitate management, Apache Airflow supports a range of REST API endpoints across its objects. This section provides an overview of the API design, methods, and supported use cases.
Most of the endpoints accept JSON as input and return JSON responses. This means that you must usually add the following headers to your request:

```
Content-Type: application/json
Accept: application/json
```
The term resource refers to a single type of object in the Airflow metadata. An API is broken up by its endpoint's corresponding resource. The name of a resource is typically plural and expressed in camelCase. Example: dagRuns.
Resource names are used as part of endpoint URLs, as well as in API parameters and responses.
The platform supports Create, Read, Update, and Delete operations on most resources. You can review the standards for these operations and their standard parameters below.
Some endpoints have special behavior as exceptions.
To create a resource, you typically submit an HTTP POST request with the resource's required metadata in the request body. The response returns a 201 Created response code upon success with the resource's metadata, including its internal id, in the response body.
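As an illustration, here is a minimal sketch that creates a pool with the requests library; the host, bearer token, and payload values are placeholders, and the /api/v2/pools endpoint is taken from the endpoint list below:

```python
import requests

# Placeholder host and token; the payload fields mirror the pool example shown later.
response = requests.post(
    "http://localhost/api/v2/pools",
    json={"name": "my_pool", "slots": 5},
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json",
        "Authorization": "Bearer YOUR_ACCESS_TOKEN",
    },
)
print(response.status_code)  # 201 on success
print(response.json())       # the created resource's metadata
```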
The HTTP GET request can be used to read a resource or to list a number of resources.

A resource's id can be submitted in the request parameters to read a specific resource. The response usually returns a 200 OK response code upon success, with the resource's metadata in the response body.
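For example, reading a single connection by its id could look like the following sketch (placeholder host, connection id, and token):

```python
import requests

headers = {"Accept": "application/json", "Authorization": "Bearer YOUR_ACCESS_TOKEN"}

# Read one specific resource by its id (hypothetical connection id).
response = requests.get("http://localhost/api/v2/connections/my_connection_id", headers=headers)
print(response.status_code)  # 200 on success
print(response.json())       # the resource's metadata
```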
If a GET request does not include a specific resource id, it is treated as a list request. The response usually returns a 200 OK response code upon success, with an object containing a list of resources' metadata in the response body.
When reading resources, some common query parameters are usually available, e.g.:

/api/v2/connections?limit=25&offset=25
Query Parameter | Type | Description |
---|---|---|
limit | integer | Maximum number of objects to fetch. Usually 25 by default |
offset | integer | Offset after which to start returning objects. For use with limit query parameter. |
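As a sketch, the paginated list request from the example URL above could be issued like this (placeholder host and token):

```python
import requests

headers = {"Accept": "application/json", "Authorization": "Bearer YOUR_ACCESS_TOKEN"}

# Fetch the second page of connections, 25 objects at a time.
page = requests.get(
    "http://localhost/api/v2/connections",
    params={"limit": 25, "offset": 25},
    headers=headers,
)
print(page.status_code)  # 200 on success
print(page.json())       # an object containing a list of resources' metadata
```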
Updating a resource requires the resource id, and is typically done using an HTTP PATCH request, with the fields to modify in the request body. The response usually returns a 200 OK response code upon success, with information about the modified resource in the response body.
Deleting a resource requires the resource id and is typically executed via an HTTP DELETE request. The response usually returns a 204 No Content response code upon success.
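For example, deleting a connection might look like this sketch (placeholder host, connection id, and token):

```python
import requests

# Delete a specific resource by its id (hypothetical connection id).
response = requests.delete(
    "http://localhost/api/v2/connections/my_connection_id",
    headers={"Authorization": "Bearer YOUR_ACCESS_TOKEN"},
)
print(response.status_code)  # 204 No Content on success
```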
Resource names are plural and expressed in camelCase.
Names are consistent between URL parameter name and field name.
Field names are in snake_case.
{\"name\": \"string\",\"slots\": 0,\"occupied_slots\": 0,\"used_slots\": 0,\"queued_slots\": 0,\"open_slots\": 0}
Update mask is available as a query parameter in patch endpoints. It is used to notify the API which fields you want to update. Using update_mask makes it easier to update objects by helping the server know which fields to update in an object instead of updating all fields. The update request ignores any fields that aren't specified in the field mask, leaving them with their current values.
Example:
```python
import json

import requests

resource = requests.get("/resource/my-id").json()
resource["my_field"] = "new-value"
requests.patch("/resource/my-id?update_mask=my_field", data=json.dumps(resource))
```
- API versioning is not synchronized to specific releases of Apache Airflow.
- APIs are designed to be backward compatible.
- Any changes to the API will first go through a deprecation phase.
You can use a third party client, such as curl, HTTPie, Postman or the Insomnia REST client, to test the Apache Airflow API.

Note that you will need to pass authentication credentials, for example a Bearer token if your Airflow deployment supports it.
For example, here is how to pause a DAG with curl, using a Bearer token:

```bash
curl -X PATCH 'https://example.com/api/v2/dags/{dag_id}?update_mask=is_paused' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer YOUR_ACCESS_TOKEN' \
  -d '{"is_paused": true}'
```
Using a graphical tool such as Postman or Insomnia, it is possible to import the API specifications directly:

- Download the API specification by clicking the Download button at the top of this document.
- Import the JSON specification in the graphical tool of your choice.
  - In Postman, you can click the import button at the top.
  - With Insomnia, you can just drag-and-drop the file on the UI.

Note that with Postman, you can also generate code snippets by selecting a request and clicking on the Code button.
Cross-origin resource sharing (CORS) is a browser security feature that restricts HTTP requests that are initiated from scripts running in the browser.

For details on enabling/configuring CORS, see Enabling CORS.
To be able to meet the requirements of many organizations, Airflow supports many authentication methods, and it is even possible to add your own method.
The default is to deny all requests.
For details on configuring the authentication, seeAPI Authorization.
We follow the error response format proposed in RFC 7807, also known as Problem Details for HTTP APIs. As with our normal API responses, your client must be prepared to gracefully handle additional members of the response.
401 Unauthorized: This indicates that the request has not been applied because it lacks valid authentication credentials for the target resource. Please check that you have valid credentials.

403 Forbidden: This response means that the server understood the request but refuses to authorize it because it lacks sufficient rights to the resource. It happens when you do not have the necessary permission to execute the action you performed. You need to get the appropriate permissions in order to resolve this error.

400 Bad Request: This response means that the server cannot or will not process the request due to something that is perceived to be a client error (e.g., malformed request syntax, invalid request message framing, or deceptive request routing). To resolve this, please ensure that your syntax is correct.

404 Not Found: This client error response indicates that the server cannot find the requested resource.

405 Method Not Allowed: Indicates that the request method is known by the server but is not supported by the target resource.

406 Not Acceptable: The target resource does not have a current representation that would be acceptable to the user agent, according to the proactive negotiation header fields received in the request, and the server is unwilling to supply a default representation.

409 Conflict: The request could not be completed due to a conflict with the current state of the target resource, e.g. the resource it tries to create already exists.

500 Internal Server Error: This means that the server encountered an unexpected condition that prevented it from fulfilling the request.
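When using the Python client described below, such errors surface as ApiException. The following sketch extends the getting-started example further down to show how the error details can be inspected (placeholder host and token; attribute names follow the typical OpenAPI Generator Python client):

```python
import airflow_client.client
from airflow_client.client.rest import ApiException

configuration = airflow_client.client.Configuration(host="http://localhost")
configuration.access_token = "YOUR_ACCESS_TOKEN"  # placeholder

with airflow_client.client.ApiClient(configuration) as api_client:
    dag_api = airflow_client.client.DAGApi(api_client)
    try:
        dag_api.get_dag("this-dag-does-not-exist")
    except ApiException as e:
        # The exception carries the HTTP status code and the raw problem-details
        # body returned by the server.
        print(e.status)  # e.g. 404
        print(e.body)    # RFC 7807 style payload, possibly with additional members
```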
This Python package is automatically generated by the OpenAPI Generator project:
- API version: 2.9.0
- Package version: 2.9.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen
For more information, please visit https://airflow.apache.org
Python >=3.9
You can install the client using standard Python installation tools. It is hosted in PyPI with the apache-airflow-client package id, so the easiest way to get the latest version is to run:
pip install apache-airflow-client
If the Python package is hosted on a repository, you can install it directly using:
pip install git+https://github.com/apache/airflow-client-python.git
Then import the package:
import airflow_client.client
Before attempting the following examples, ensure you have an account with API access. As an example, you can create an account for usage with the API as follows using the Airflow CLI:

airflow users create -u admin-api -e admin-api@example.com -f admin-api -l admin-api -p $PASSWORD -r Admin
Please follow the installation procedure and then run the following:
```python
import os

import airflow_client.client
import requests
from airflow_client.client.rest import ApiException
from pprint import pprint
from pydantic import BaseModel


# What we expect back from auth/token
class AirflowAccessTokenResponse(BaseModel):
    access_token: str


# An optional helper function to retrieve an access token
def get_airflow_client_access_token(
    host: str,
    username: str,
    password: str,
) -> str:
    url = f"{host}/auth/token"
    payload = {
        "username": username,
        "password": password,
    }
    headers = {"Content-Type": "application/json"}
    response = requests.post(url, json=payload, headers=headers)
    if response.status_code != 201:
        raise RuntimeError(f"Failed to get access token: {response.status_code} {response.text}")
    response_success = AirflowAccessTokenResponse(**response.json())
    return response_success.access_token


# Defining the host is optional and defaults to http://localhost
# See configuration.py for a list of all supported configuration parameters.
host = "http://localhost"
configuration = airflow_client.client.Configuration(host=host)

# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.
configuration.access_token = get_airflow_client_access_token(
    host=host,
    username="admin-api",
    password=os.environ["PASSWORD"],
)

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = airflow_client.client.AssetApi(api_client)
    create_asset_events_body = airflow_client.client.CreateAssetEventsBody()  # CreateAssetEventsBody |

    try:
        # Create Asset Event
        api_response = api_instance.create_asset_event(create_asset_events_body)
        print("The response of AssetApi->create_asset_event:\n")
        pprint(api_response)
    except ApiException as e:
        print("Exception when calling AssetApi->create_asset_event: %s\n" % e)
```
All URIs are relative to http://localhost
Class | Method | HTTP request | Description |
---|---|---|---|
AssetApi | create_asset_event | POST /api/v2/assets/events | Create Asset Event |
AssetApi | delete_asset_queued_events | DELETE /api/v2/assets/{asset_id}/queuedEvents | Delete Asset Queued Events |
AssetApi | delete_dag_asset_queued_event | DELETE /api/v2/dags/{dag_id}/assets/{asset_id}/queuedEvents | Delete Dag Asset Queued Event |
AssetApi | delete_dag_asset_queued_events | DELETE /api/v2/dags/{dag_id}/assets/queuedEvents | Delete Dag Asset Queued Events |
AssetApi | get_asset | GET /api/v2/assets/{asset_id} | Get Asset |
AssetApi | get_asset_alias | GET /api/v2/assets/aliases/{asset_alias_id} | Get Asset Alias |
AssetApi | get_asset_aliases | GET /api/v2/assets/aliases | Get Asset Aliases |
AssetApi | get_asset_events | GET /api/v2/assets/events | Get Asset Events |
AssetApi | get_asset_queued_events | GET /api/v2/assets/{asset_id}/queuedEvents | Get Asset Queued Events |
AssetApi | get_assets | GET /api/v2/assets | Get Assets |
AssetApi | get_dag_asset_queued_event | GET /api/v2/dags/{dag_id}/assets/{asset_id}/queuedEvents | Get Dag Asset Queued Event |
AssetApi | get_dag_asset_queued_events | GET /api/v2/dags/{dag_id}/assets/queuedEvents | Get Dag Asset Queued Events |
AssetApi | materialize_asset | POST /api/v2/assets/{asset_id}/materialize | Materialize Asset |
BackfillApi | cancel_backfill | PUT /api/v2/backfills/{backfill_id}/cancel | Cancel Backfill |
BackfillApi | create_backfill | POST /api/v2/backfills | Create Backfill |
BackfillApi | create_backfill_dry_run | POST /api/v2/backfills/dry_run | Create Backfill Dry Run |
BackfillApi | get_backfill | GET /api/v2/backfills/{backfill_id} | Get Backfill |
BackfillApi | list_backfills | GET /api/v2/backfills | List Backfills |
BackfillApi | pause_backfill | PUT /api/v2/backfills/{backfill_id}/pause | Pause Backfill |
BackfillApi | unpause_backfill | PUT /api/v2/backfills/{backfill_id}/unpause | Unpause Backfill |
ConfigApi | get_config | GET /api/v2/config | Get Config |
ConfigApi | get_config_value | GET /api/v2/config/section/{section}/option/{option} | Get Config Value |
ConnectionApi | bulk_connections | PATCH /api/v2/connections | Bulk Connections |
ConnectionApi | create_default_connections | POST /api/v2/connections/defaults | Create Default Connections |
ConnectionApi | delete_connection | DELETE /api/v2/connections/{connection_id} | Delete Connection |
ConnectionApi | get_connection | GET /api/v2/connections/{connection_id} | Get Connection |
ConnectionApi | get_connections | GET /api/v2/connections | Get Connections |
ConnectionApi | patch_connection | PATCH /api/v2/connections/{connection_id} | Patch Connection |
ConnectionApi | post_connection | POST /api/v2/connections | Post Connection |
ConnectionApi | test_connection | POST /api/v2/connections/test | Test Connection |
DAGApi | delete_dag | DELETE /api/v2/dags/{dag_id} | Delete Dag |
DAGApi | get_dag | GET /api/v2/dags/{dag_id} | Get Dag |
DAGApi | get_dag_details | GET /api/v2/dags/{dag_id}/details | Get Dag Details |
DAGApi | get_dag_tags | GET /api/v2/dagTags | Get Dag Tags |
DAGApi | get_dags | GET /api/v2/dags | Get Dags |
DAGApi | patch_dag | PATCH /api/v2/dags/{dag_id} | Patch Dag |
DAGApi | patch_dags | PATCH /api/v2/dags | Patch Dags |
DAGParsingApi | reparse_dag_file | PUT /api/v2/parseDagFile/{file_token} | Reparse Dag File |
DagReportApi | get_dag_reports | GET /api/v2/dagReports | Get Dag Reports |
DagRunApi | clear_dag_run | POST /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/clear | Clear Dag Run |
DagRunApi | delete_dag_run | DELETE /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} | Delete Dag Run |
DagRunApi | get_dag_run | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} | Get Dag Run |
DagRunApi | get_dag_runs | GET /api/v2/dags/{dag_id}/dagRuns | Get Dag Runs |
DagRunApi | get_list_dag_runs_batch | POST /api/v2/dags/{dag_id}/dagRuns/list | Get List Dag Runs Batch |
DagRunApi | get_upstream_asset_events | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamAssetEvents | Get Upstream Asset Events |
DagRunApi | patch_dag_run | PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} | Patch Dag Run |
DagRunApi | trigger_dag_run | POST /api/v2/dags/{dag_id}/dagRuns | Trigger Dag Run |
DagSourceApi | get_dag_source | GET /api/v2/dagSources/{dag_id} | Get Dag Source |
DagStatsApi | get_dag_stats | GET /api/v2/dagStats | Get Dag Stats |
DagVersionApi | get_dag_version | GET /api/v2/dags/{dag_id}/dagVersions/{version_number} | Get Dag Version |
DagVersionApi | get_dag_versions | GET /api/v2/dags/{dag_id}/dagVersions | Get Dag Versions |
DagWarningApi | list_dag_warnings | GET /api/v2/dagWarnings | List Dag Warnings |
EventLogApi | get_event_log | GET /api/v2/eventLogs/{event_log_id} | Get Event Log |
EventLogApi | get_event_logs | GET /api/v2/eventLogs | Get Event Logs |
ExtraLinksApi | get_extra_links | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/links | Get Extra Links |
ImportErrorApi | get_import_error | GET /api/v2/importErrors/{import_error_id} | Get Import Error |
ImportErrorApi | get_import_errors | GET /api/v2/importErrors | Get Import Errors |
JobApi | get_jobs | GET /api/v2/jobs | Get Jobs |
LoginApi | login | GET /api/v2/auth/login | Login |
LoginApi | logout | GET /api/v2/auth/logout | Logout |
MonitorApi | get_health | GET /api/v2/monitor/health | Get Health |
PluginApi | get_plugins | GET /api/v2/plugins | Get Plugins |
PoolApi | bulk_pools | PATCH /api/v2/pools | Bulk Pools |
PoolApi | delete_pool | DELETE /api/v2/pools/{pool_name} | Delete Pool |
PoolApi | get_pool | GET /api/v2/pools/{pool_name} | Get Pool |
PoolApi | get_pools | GET /api/v2/pools | Get Pools |
PoolApi | patch_pool | PATCH /api/v2/pools/{pool_name} | Patch Pool |
PoolApi | post_pool | POST /api/v2/pools | Post Pool |
ProviderApi | get_providers | GET /api/v2/providers | Get Providers |
TaskApi | get_task | GET /api/v2/dags/{dag_id}/tasks/{task_id} | Get Task |
TaskApi | get_tasks | GET /api/v2/dags/{dag_id}/tasks | Get Tasks |
TaskInstanceApi | get_extra_links | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/links | Get Extra Links |
TaskInstanceApi | get_log | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{try_number} | Get Log |
TaskInstanceApi | get_mapped_task_instance | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index} | Get Mapped Task Instance |
TaskInstanceApi | get_mapped_task_instance_tries | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/tries | Get Mapped Task Instance Tries |
TaskInstanceApi | get_mapped_task_instance_try_details | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/tries/{task_try_number} | Get Mapped Task Instance Try Details |
TaskInstanceApi | get_mapped_task_instances | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/listMapped | Get Mapped Task Instances |
TaskInstanceApi | get_task_instance | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} | Get Task Instance |
TaskInstanceApi | get_task_instance_dependencies | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/dependencies | Get Task Instance Dependencies |
TaskInstanceApi | get_task_instance_dependencies_by_map_index | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/dependencies | Get Task Instance Dependencies |
TaskInstanceApi | get_task_instance_tries | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries | Get Task Instance Tries |
TaskInstanceApi | get_task_instance_try_details | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries/{task_try_number} | Get Task Instance Try Details |
TaskInstanceApi | get_task_instances | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances | Get Task Instances |
TaskInstanceApi | get_task_instances_batch | POST /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/list | Get Task Instances Batch |
TaskInstanceApi | patch_task_instance | PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} | Patch Task Instance |
TaskInstanceApi | patch_task_instance_by_map_index | PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index} | Patch Task Instance |
TaskInstanceApi | patch_task_instance_dry_run | PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/dry_run | Patch Task Instance Dry Run |
TaskInstanceApi | patch_task_instance_dry_run_by_map_index | PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/dry_run | Patch Task Instance Dry Run |
TaskInstanceApi | post_clear_task_instances | POST /api/v2/dags/{dag_id}/clearTaskInstances | Post Clear Task Instances |
VariableApi | bulk_variables | PATCH /api/v2/variables | Bulk Variables |
VariableApi | delete_variable | DELETE /api/v2/variables/{variable_key} | Delete Variable |
VariableApi | get_variable | GET /api/v2/variables/{variable_key} | Get Variable |
VariableApi | get_variables | GET /api/v2/variables | Get Variables |
VariableApi | patch_variable | PATCH /api/v2/variables/{variable_key} | Patch Variable |
VariableApi | post_variable | POST /api/v2/variables | Post Variable |
VersionApi | get_version | GET /api/v2/version | Get Version |
XComApi | create_xcom_entry | POST /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries | Create Xcom Entry |
XComApi | get_xcom_entries | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries | Get Xcom Entries |
XComApi | get_xcom_entry | GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key} | Get Xcom Entry |
XComApi | update_xcom_entry | PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key} | Update Xcom Entry |
- AppBuilderMenuItemResponse
- AppBuilderViewResponse
- AssetAliasCollectionResponse
- AssetAliasResponse
- AssetCollectionResponse
- AssetEventCollectionResponse
- AssetEventResponse
- AssetResponse
- BackfillCollectionResponse
- BackfillPostBody
- BackfillResponse
- BaseInfoResponse
- BulkAction
- BulkActionNotOnExistence
- BulkActionOnExistence
- BulkActionResponse
- BulkBodyConnectionBody
- BulkBodyConnectionBodyActionsInner
- BulkBodyPoolBody
- BulkBodyPoolBodyActionsInner
- BulkBodyVariableBody
- BulkBodyVariableBodyActionsInner
- BulkCreateActionConnectionBody
- BulkCreateActionPoolBody
- BulkCreateActionVariableBody
- BulkDeleteActionConnectionBody
- BulkDeleteActionPoolBody
- BulkDeleteActionVariableBody
- BulkResponse
- BulkUpdateActionConnectionBody
- BulkUpdateActionPoolBody
- BulkUpdateActionVariableBody
- ClearTaskInstancesBody
- ClearTaskInstancesBodyTaskIdsInner
- Config
- ConfigOption
- ConfigSection
- ConnectionBody
- ConnectionCollectionResponse
- ConnectionResponse
- ConnectionTestResponse
- Content
- CreateAssetEventsBody
- DAGCollectionResponse
- DAGDetailsResponse
- DAGPatchBody
- DAGResponse
- DAGRunClearBody
- DAGRunCollectionResponse
- DAGRunPatchBody
- DAGRunPatchStates
- DAGRunResponse
- DAGRunsBatchBody
- DAGSourceResponse
- DAGTagCollectionResponse
- DAGVersionCollectionResponse
- DAGWarningCollectionResponse
- DAGWarningResponse
- DagProcessorInfoResponse
- DagRunAssetReference
- DagRunState
- DagRunTriggeredByType
- DagRunType
- DagScheduleAssetReference
- DagStatsCollectionResponse
- DagStatsResponse
- DagStatsStateResponse
- DagTagResponse
- DagVersionResponse
- DagWarningType
- Detail
- DryRunBackfillCollectionResponse
- DryRunBackfillResponse
- EventLogCollectionResponse
- EventLogResponse
- ExtraLinkCollectionResponse
- FastAPIAppResponse
- FastAPIRootMiddlewareResponse
- HTTPExceptionResponse
- HTTPValidationError
- HealthInfoResponse
- ImportErrorCollectionResponse
- ImportErrorResponse
- JobCollectionResponse
- JobResponse
- PatchTaskInstanceBody
- PluginCollectionResponse
- PluginResponse
- PoolBody
- PoolCollectionResponse
- PoolPatchBody
- PoolResponse
- ProviderCollectionResponse
- ProviderResponse
- QueuedEventCollectionResponse
- QueuedEventResponse
- ReprocessBehavior
- ResponseClearDagRun
- ResponseGetXcomEntry
- SchedulerInfoResponse
- StructuredLogMessage
- TaskCollectionResponse
- TaskDependencyCollectionResponse
- TaskDependencyResponse
- TaskInstanceCollectionResponse
- TaskInstanceHistoryCollectionResponse
- TaskInstanceHistoryResponse
- TaskInstanceResponse
- TaskInstanceState
- TaskInstancesBatchBody
- TaskInstancesLogResponse
- TaskOutletAssetReference
- TaskResponse
- TimeDelta
- TriggerDAGRunPostBody
- TriggerResponse
- TriggererInfoResponse
- ValidationError
- ValidationErrorLocInner
- Value
- VariableBody
- VariableCollectionResponse
- VariableResponse
- VersionInfo
- XComCollectionResponse
- XComCreateBody
- XComResponse
- XComResponseNative
- XComResponseString
- XComUpdateBody
By default the generated client supports the following authentication schemes:
- Basic
- GoogleOpenID
- Kerberos
- OAuth2PasswordBearer
However, you can generate the client and documentation with your own schemes by adding them in the security section of the OpenAPI specification. You can do it with the Breeze CLI by adding the --security-schemes option to the breeze release-management prepare-python-client command.
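For example, a sketch of configuring HTTP basic authentication with the generated client (whether basic auth is accepted depends on how your Airflow API server is configured; credentials are placeholders):

```python
import airflow_client.client

# Basic auth credentials are passed on the Configuration object (placeholders).
configuration = airflow_client.client.Configuration(
    host="http://localhost",
    username="admin-api",
    password="YOUR_PASSWORD",
)

with airflow_client.client.ApiClient(configuration) as api_client:
    version_api = airflow_client.client.VersionApi(api_client)
    # VersionApi.get_version is listed in the endpoint table above.
    print(version_api.get_version())
```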
You can run basic smoke tests to check if the client is working properly - we have a simple test script that uses the API to run the tests. To do that, you need to:
- install the apache-airflow-client package as described above
- install the rich Python package
- download the test_python_client.py file
- make sure you have a test Airflow installation running. Do not experiment with your production deployment
- configure your Airflow webserver to enable basic authentication. In the [api] section of your airflow.cfg set:
```ini
[api]
auth_backend = airflow.providers.fab.auth_manager.api.auth.backend.session,airflow.providers.fab.auth_manager.api.auth.backend.basic_auth
```
You can also set it by environment variable:

export AIRFLOW__API__AUTH_BACKENDS=airflow.providers.fab.auth_manager.api.auth.backend.session,airflow.providers.fab.auth_manager.api.auth.backend.basic_auth
- configure your Airflow webserver to load example dags. In the [core] section of your airflow.cfg set:
```ini
[core]
load_examples = True
```
You can also set it by environment variable:

export AIRFLOW__CORE__LOAD_EXAMPLES=True
- optionally expose configuration (NOTE: this is a dangerous setting). The script will happily run with the default setting, but if you want to see the configuration, you need to expose it. In the [api] section of your airflow.cfg set:
```ini
[api]
expose_config = True
```
You can also set it by environment variable:

export AIRFLOW__API__EXPOSE_CONFIG=True
- configure your host/ip/user/password in the test_python_client.py file
```python
import airflow_client.client

# get the access token from Airflow API Server via /auth/token
configuration = airflow_client.client.Configuration(host="http://localhost:8080", access_token=access_token)
```
Run the scheduler (or the dag file processor, if you have set up a standalone dag file processor) for a few parsing loops (you can pass the --num-runs parameter to it or keep it running in the background). The script relies on example DAGs being serialized to the DB, and this only happens when the scheduler runs with core/load_examples set to True.

Run the webserver - reachable at the host/port for the test script you want to run. Make sure it had enough time to initialize.
Run python test_python_client.py and you should see colored output showing attempts to connect and status.
If the OpenAPI document is large, imports in client.apis and client.models may fail with a RecursionError indicating the maximum recursion limit has been exceeded. In that case, there are a couple of solutions:

Solution 1: Use specific imports for apis and models, like:
```python
from airflow_client.client.api.default_api import DefaultApi
from airflow_client.client.model.pet import Pet
```
Solution 2: Before importing the package, adjust the maximum recursion limit as shown below:

```python
import sys

sys.setrecursionlimit(1500)
import airflow_client.client
from airflow_client.client.api import *
from airflow_client.client.models import *
```