https://github.com/questdb/data-orchestration-and-scheduling-samples
Data Orchestration and Scheduling Samples with QuestDB using Bash, Airflow, and Dagster
- Host: GitHub
- URL: https://github.com/questdb/data-orchestration-and-scheduling-samples
- Owner: questdb
- License: apache-2.0
- Created: 2025-01-16T08:49:44.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-02-11T10:44:01.000Z (11 months ago)
- Last Synced: 2025-06-06T22:11:16.719Z (8 months ago)
- Language: Python
- Size: 26.4 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Data Orchestration and Scheduling Samples
This repository contains examples of data orchestration and scheduling using different tools:
- **Airflow**
- **Dagster**
- **Bash and Cron**
Each folder contains an example implementation of a data pipeline (see the Python sketch after the list) designed to:
1. Export a partition from QuestDB.
2. Convert the partition to Parquet format.
3. Upload the Parquet file to S3.
4. Delete the local partition folder.
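The sketch below shows one way those four steps could look in plain Python. It assumes QuestDB's HTTP endpoint on `localhost:9000` and a table partitioned by day; the table name, timestamp column, S3 bucket, and working directory are hypothetical, and the repository's own scripts may instead operate on detached partition folders on disk rather than an HTTP export.

```python
"""Minimal sketch of the four pipeline steps (hypothetical names throughout)."""
import io
import shutil
from pathlib import Path

import boto3
import pandas as pd
import requests

QUESTDB_HTTP = "http://localhost:9000"
TABLE = "trades"                       # hypothetical table name
PARTITION = "2025-01-15"               # daily partition to archive
S3_BUCKET = "my-questdb-archive"       # hypothetical bucket
WORK_DIR = Path("/tmp/questdb_export") / PARTITION


def export_partition() -> pd.DataFrame:
    """Step 1: pull the partition's rows as CSV via QuestDB's /exp endpoint."""
    query = f"SELECT * FROM {TABLE} WHERE timestamp IN '{PARTITION}'"
    resp = requests.get(f"{QUESTDB_HTTP}/exp", params={"query": query}, timeout=60)
    resp.raise_for_status()
    return pd.read_csv(io.StringIO(resp.text))


def to_parquet(df: pd.DataFrame) -> Path:
    """Step 2: write the exported rows to a local Parquet file (needs pyarrow)."""
    WORK_DIR.mkdir(parents=True, exist_ok=True)
    path = WORK_DIR / f"{TABLE}_{PARTITION}.parquet"
    df.to_parquet(path, index=False)
    return path


def upload_to_s3(path: Path) -> None:
    """Step 3: upload the Parquet file to S3."""
    boto3.client("s3").upload_file(str(path), S3_BUCKET, f"{TABLE}/{path.name}")


def cleanup() -> None:
    """Step 4: remove the local working folder once the upload has succeeded."""
    shutil.rmtree(WORK_DIR, ignore_errors=True)


if __name__ == "__main__":
    parquet_path = to_parquet(export_partition())
    upload_to_s3(parquet_path)
    cleanup()
```

Retries, alerting, and scheduling are deliberately left out here; that is exactly what the orchestrators in the folders below add.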
## Folder Structure
- **airflow**: Example implementation using Apache Airflow (see the DAG skeleton after this list).
- **dagster**: Example implementation using Dagster.
- **bash_cron**: Example implementation using a simple Bash script and Cron.
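As a rough illustration of how the four steps map onto an orchestrator, here is a minimal Airflow 2.x DAG skeleton using the TaskFlow API. The task bodies are placeholders, and the real DAG in the `airflow/` folder may be organized quite differently.

```python
"""Skeleton DAG: one run per day, archiving the partition for the logical date."""
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def questdb_partition_archive():
    @task
    def export(ds=None) -> str:
        # Step 1: export the partition matching the logical date `ds` (placeholder).
        return f"/tmp/questdb_export/{ds}.csv"

    @task
    def convert(csv_path: str) -> str:
        # Step 2: convert the exported data to Parquet (placeholder).
        return csv_path.replace(".csv", ".parquet")

    @task
    def upload(parquet_path: str) -> str:
        # Step 3: upload the Parquet file to S3 (placeholder).
        return parquet_path

    @task
    def cleanup(parquet_path: str) -> None:
        # Step 4: remove the local working files (placeholder).
        pass

    cleanup(upload(convert(export())))


questdb_partition_archive()
```

The Dagster and Bash + Cron variants express the same chain of steps as software-defined assets and as a shell script scheduled by a crontab entry, respectively.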
## Getting Started
1. Navigate to the folder of your preferred tool.
2. Follow the specific instructions in its `README.md` file to set up and run the example.
## Goals
This repository aims to demonstrate:
- How to use modern orchestration tools for real-world ETL scenarios.
- How Airflow, Dagster, and a lightweight Bash + Cron approach compare in flexibility and configuration.
- How to decide which orchestration tool is suitable for your workflows.
Feel free to explore, adapt, and contribute to these examples!