A framework for generating Apache Airflow DAGs from other workflows.

### Principles

`cosmos` provides a framework for generating Apache Airflow DAGs from other workflows. Every provider comes with two main components:

- `extractors`: extract the workflow from the provider and convert it into `Task` and `Group` objects.
- `operators`: execute the tasks in the workflow once it has been converted into a DAG.

`cosmos` is not opinionated: it does not enforce any particular rendering method, but instead provides the tools to render a workflow as an Airflow DAG, a task group, or individual tasks.
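
To make the extractor/operator split concrete, here is a minimal, hypothetical sketch of the kind of objects an extractor hands to the rendering layer. The `Task` and `Group` dataclasses, their fields, the `extract_example_project` helper, and the dbt commands below are illustrative assumptions for this README, not the actual `cosmos` API.

```python
# Illustrative sketch only: the class names, fields, and helper here are
# assumptions made for this example, not the real cosmos interfaces.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class Task:
    """One unit of work extracted from the upstream workflow (e.g. a dbt model)."""

    task_id: str
    command: str
    upstream_ids: list[str] = field(default_factory=list)


@dataclass
class Group:
    """A named collection of tasks, e.g. an entire dbt project."""

    group_id: str
    tasks: list[Task] = field(default_factory=list)


def extract_example_project() -> Group:
    """Pretend extractor: walk a (fake) dbt project and emit Task/Group objects."""
    staging = Task(task_id="stg_orders", command="dbt run --select stg_orders")
    mart = Task(
        task_id="orders",
        command="dbt run --select orders",
        upstream_ids=["stg_orders"],
    )
    return Group(group_id="my_project", tasks=[staging, mart])


if __name__ == "__main__":
    group = extract_example_project()
    for task in group.tasks:
        print(f"{task.task_id} <- {task.upstream_ids}")
```

In this picture, rendering is just a mapping from these objects onto Airflow constructs: operators execute the individual `Task`s, and a `Group` becomes a DAG or a task group.
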

### Example Usage

Imagine we have a dbt project located at `./dbt/my_project`.

```python
from airflow import DAG

from cosmos.providers.dbt import DbtDag, DbtTaskGroup, DbtTask

# Below are three alternative ways to render the same dbt project.

# render as a DAG
dag = DbtDag(
    project_dir="./dbt/my_project",
    dag_id="my_dag",
    schedule_interval="@daily",
    default_args={"owner": "airflow"},
)

# render as a task group inside an existing DAG
with DAG("my_dag", default_args={"owner": "airflow"}) as dag:
    task_group = DbtTaskGroup(
        project_dir="./dbt/my_project",
        task_group_id="my_task_group",
    )

# render as an individual task inside an existing DAG
with DAG("my_dag", default_args={"owner": "airflow"}) as dag:
    task = DbtTask(
        project_dir="./dbt/my_project",
        task_id="my_task",
    )
```

### Development

We use pre-commit to run a number of checks on the code before committing. To install pre-commit, run: