astronomer/astronomer-cosmos
Run your dbt Core projects as Apache Airflow® DAGs and Task Groups with a few lines of code. Benefits include:
- Run dbt projects against Airflow connections instead of dbt profiles
- Native support for installing and running dbt in a virtual environment to avoid dependency conflicts with Airflow
- Run tests immediately after a model is done to catch issues early
- Utilize Airflow's data-aware scheduling to run models immediately after upstream ingestion
- Turn each dbt model into a task/task group complete with retries, alerting, etc.
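The virtual-environment support mentioned above can be configured through Cosmos's execution settings. The sketch below is illustrative, not authoritative: the `ExecutionMode.VIRTUALENV` setting and the `py_requirements` operator argument reflect the Cosmos API as documented, but the connection id, paths, and dbt adapter package are placeholder assumptions; check the Getting Started guide for your installed version.

```python
"""
Illustrative sketch: run dbt inside an isolated virtualenv via Cosmos,
so dbt's dependencies never clash with Airflow's.
"""
from datetime import datetime
from pathlib import Path

from cosmos import DbtDag, ExecutionConfig, ProfileConfig, ProjectConfig
from cosmos.constants import ExecutionMode
from cosmos.profiles import PostgresUserPasswordProfileMapping

virtualenv_dag = DbtDag(
    project_config=ProjectConfig(Path(__file__).parent / "dbt" / "jaffle_shop"),
    profile_config=ProfileConfig(
        profile_name="default",
        target_name="dev",
        profile_mapping=PostgresUserPasswordProfileMapping(
            conn_id="airflow_db",  # placeholder Airflow connection id
            profile_args={"schema": "public"},
        ),
    ),
    # each dbt command runs inside a freshly created virtual environment
    execution_config=ExecutionConfig(execution_mode=ExecutionMode.VIRTUALENV),
    # assumed adapter package; pick the one matching your warehouse
    operator_args={"py_requirements": ["dbt-postgres"]},
    schedule_interval="@daily",
    start_date=datetime(2023, 1, 1),
    catchup=False,
    dag_id="virtualenv_cosmos_dag",
)
```

This trades some startup time per task for full dependency isolation between dbt and the Airflow scheduler environment.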
Check out the Getting Started guide in our docs. See more examples at /dev/dags and in the cosmos-demo repo.
You can render a Cosmos Airflow DAG using the DbtDag class. Here's an example with the jaffle_shop project:
From astronomer-cosmos/dev/dags/basic_cosmos_dag.py (lines 1 to 42 at 24aa38e):

```python
"""
An example DAG that uses Cosmos to render a dbt project.
"""

import os
from datetime import datetime
from pathlib import Path

from cosmos import DbtDag, ProjectConfig, ProfileConfig
from cosmos.profiles import PostgresUserPasswordProfileMapping

DEFAULT_DBT_ROOT_PATH = Path(__file__).parent / "dbt"
DBT_ROOT_PATH = Path(os.getenv("DBT_ROOT_PATH", DEFAULT_DBT_ROOT_PATH))

profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=PostgresUserPasswordProfileMapping(
        conn_id="airflow_db",
        profile_args={"schema": "public"},
    ),
)

# [START local_example]
basic_cosmos_dag = DbtDag(
    # dbt/cosmos-specific parameters
    project_config=ProjectConfig(
        DBT_ROOT_PATH / "jaffle_shop",
    ),
    profile_config=profile_config,
    operator_args={
        "install_deps": True,  # install any necessary dependencies before running any dbt command
        "full_refresh": True,  # used only in dbt commands that support this flag
    },
    # normal dag parameters
    schedule_interval="@daily",
    start_date=datetime(2023, 1, 1),
    catchup=False,
    dag_id="basic_cosmos_dag",
    default_args={"retries": 2},
)
# [END local_example]
```
This will generate an Airflow DAG that looks like this:
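Besides standalone DAGs, Cosmos also provides a DbtTaskGroup class for embedding a dbt project as a task group inside an existing DAG, e.g. directly downstream of an ingestion step. The sketch below is a minimal illustration: the `extract_data` task, connection id, and project path are placeholder assumptions, not part of the jaffle_shop example above.

```python
"""
Illustrative sketch: embed a dbt project in an existing DAG with DbtTaskGroup.
"""
from datetime import datetime
from pathlib import Path

from airflow.decorators import dag
from airflow.operators.empty import EmptyOperator
from cosmos import DbtTaskGroup, ProfileConfig, ProjectConfig
from cosmos.profiles import PostgresUserPasswordProfileMapping

profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=PostgresUserPasswordProfileMapping(
        conn_id="airflow_db",  # placeholder Airflow connection id
        profile_args={"schema": "public"},
    ),
)

@dag(schedule_interval="@daily", start_date=datetime(2023, 1, 1), catchup=False)
def ingest_and_transform():
    # placeholder for an upstream ingestion step
    extract = EmptyOperator(task_id="extract_data")

    # renders the dbt project as a nested task group of model/test tasks
    transform = DbtTaskGroup(
        group_id="transform",
        project_config=ProjectConfig(Path(__file__).parent / "dbt" / "jaffle_shop"),
        profile_config=profile_config,
    )

    extract >> transform

ingest_and_transform()
```

Combined with Airflow's data-aware scheduling, this pattern lets the dbt models run immediately after the upstream ingestion task completes.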
- Join us on the Airflow Slack at #airflow-dbt
We follow Semantic Versioning for releases. Check CHANGELOG.rst for the latest changes.
All contributions, bug reports, bug fixes, documentation improvements, and enhancements are welcome.
A detailed overview of how to contribute can be found in the Contributing Guide.
As contributors and maintainers of this project, you are expected to abide by the Contributor Code of Conduct.
The application and this website collect telemetry to support the project's development; end users can disable it.
Read the Privacy Notice to learn more.
Check the project's Security Policy to learn how to report security vulnerabilities in Astronomer Cosmos and how security issues reported to the Astronomer Cosmos security team are handled.