# trigger-dag-run-params-demo

Apache Airflow demo project that sets up 3 DAGs to show how to pass parameters from a DAG to a triggered DAG (see the sketch after this list):

1. Wrapper DAG: triggers the sync DAGs using the `TriggerDagRunOperator`, passing config parameters to each run.
2. Sync DAG: an example DAG that syncs data from one data source to another.
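As a rough illustration of the pattern, here is a minimal sketch, assuming Airflow 2.4+ (for the `schedule` argument); the DAG ids, task ids, and `conf` keys below are hypothetical and not taken from this repo. The wrapper DAG hands a `conf` dictionary to `TriggerDagRunOperator`, and the triggered DAG reads it back from `dag_run.conf`:

```python
# Minimal sketch of the trigger-with-params pattern (Airflow 2.4+).
# DAG ids, task ids, and conf keys are illustrative, not from this repo.
import pendulum

from airflow.decorators import dag, task
from airflow.operators.trigger_dagrun import TriggerDagRunOperator


@dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1, tz="UTC"), catchup=False)
def wrapper_dag():
    # Trigger the sync DAG and pass it parameters via `conf`.
    TriggerDagRunOperator(
        task_id="trigger_sync",
        trigger_dag_id="sync_dag",
        conf={"source": "orders_db", "target": "warehouse"},
    )


@dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1, tz="UTC"), catchup=False)
def sync_dag():
    @task
    def sync(**context):
        # The triggered run receives the wrapper's parameters in dag_run.conf.
        conf = context["dag_run"].conf or {}
        print(f"Syncing from {conf.get('source')} to {conf.get('target')}")

    sync()


wrapper_dag()
sync_dag()
```

In templated operator fields, the same values can also be read with Jinja, e.g. `{{ dag_run.conf['source'] }}`.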

This project was generated with `astro dev init` using the Astronomer CLI. This README describes the contents of the project, as well as how to run Apache Airflow on your local machine.

[Click here](https://igna.hashnode.dev/passing-params-from-an-apache-airflow-dag-to-triggered-dags-using-triggerdagrunoperator) to visit the blog post associated with this repository.

## Project Contents

The project contains the following files and folders:

- `dags`: This folder contains the Python files for your Airflow DAGs.
- `Dockerfile`: This file contains a versioned Astro Runtime Docker image that provides a differentiated Airflow experience. If you want to execute other commands or overrides at runtime, specify them here.
- `include`: This folder contains any additional files that you want to include as part of your project. It is empty by default.
- `packages.txt`: Install OS-level packages needed for your project by adding them to this file. It is empty by default.
- `requirements.txt`: Install Python packages needed for your project by adding them to this file. It is empty by default.
- `plugins`: Add custom or community plugins for your project to this folder. It is empty by default.
- `airflow_settings.yaml`: Use this local-only file to specify Airflow Connections, Variables, and Pools instead of entering them in the Airflow UI as you develop DAGs in this project (see the sketch after this list).
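Once a Connection or Variable is defined there (or in the UI), a DAG can look it up by id. A small hedged sketch, assuming a Variable `sync_batch_size` and a Connection `warehouse_db` were declared (both ids are hypothetical):

```python
# Assumes a Variable "sync_batch_size" and a Connection "warehouse_db"
# were declared in airflow_settings.yaml (or the Airflow UI).
from airflow.hooks.base import BaseHook
from airflow.models import Variable

batch_size = Variable.get("sync_batch_size", default_var=100)
conn = BaseHook.get_connection("warehouse_db")  # exposes host, login, password, etc.
```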

## Deploy Your Project Locally

1. Start Airflow on your local machine by running `astro dev start`.

This command will spin up 3 Docker containers on your machine, each for a different Airflow component:

- Postgres: Airflow's Metadata Database
- Webserver: The Airflow component responsible for rendering the Airflow UI
- Scheduler: The Airflow component responsible for monitoring and triggering tasks

2. Verify that all 3 Docker containers were created by running `docker ps`.

Note: Running `astro dev start` will start your project with the Airflow Webserver exposed at port 8080 and Postgres exposed at port 5432. If you already have either of those ports allocated, you can either stop your existing Docker containers or change the port.

3. Access the Airflow UI for your local Airflow project. To do so, go to [localhost:8080](http://localhost:8080/) and log in with `admin` for both your Username and Password.

You should also be able to access your Postgres database at `localhost:5432/postgres`.