Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/qubole/afctl
afctl helps to manage and deploy Apache Airflow projects faster and smoother.
- Host: GitHub
- URL: https://github.com/qubole/afctl
- Owner: qubole
- License: apache-2.0
- Created: 2019-12-24T08:47:50.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2022-09-16T18:15:44.000Z (about 2 years ago)
- Last Synced: 2024-04-28T03:59:09.159Z (6 months ago)
- Topics: airflow, cli, deployment, docker, management
- Language: Python
- Homepage:
- Size: 76.2 KB
- Stars: 130
- Watchers: 14
- Forks: 7
- Open Issues: 10
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-apache-airflow - afctl - A CLI tool that includes everything required to create, manage and deploy airflow projects faster and smoother. (Libraries, Hooks, Utilities)
README
# afctl
This CLI tool is built to make the creation and deployment of Apache Airflow (https://airflow.apache.org/) projects faster and smoother.

As of now, there is no other tool that lets the user create a boilerplate code structure for Airflow projects and makes the development and deployment of those projects seamless.

## Requirements
* Python 3.5+
* Docker

## Getting Started
### 1. Installation
Create a new Python virtualenv. You can use the following command:
```bash
python3 -m venv <path_to_venv>
```
Activate your virtualenv
```bash
source /path_to_venv/bin/activate
```

Install afctl:

```bash
pip3 install afctl
```
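Putting the steps together, an end-to-end installation might look like this (the virtualenv path `~/venvs/afctl` is just an illustrative choice):

```bash
python3 -m venv ~/venvs/afctl        # create the virtualenv (any path works)
source ~/venvs/afctl/bin/activate    # activate it
pip3 install afctl                   # install afctl into the virtualenv
afctl -h                             # confirm the CLI is available
```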
### 2. Initialize a new afctl project.

The project is created in your present working directory. Along with this, a configuration file with the same name is generated in the **/home/.afctl_configs** directory.

```bash
afctl init <project_name>
```
E.g.:
```bash
afctl init project_demo
```
* The following directory structure will be generated
```bash
.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
├── requirements.txt
└── tests
```

If you already have a git repository and want to turn it into an afctl project, run the following command:
```bash
afctl init .
```
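For instance, a sketch of that flow with a made-up repository URL:

```bash
# The repository URL is illustrative; use your own existing project
git clone https://github.com/your-user/your-airflow-repo.git
cd your-airflow-repo
afctl init .
```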
### 3. Add a new module in the project.

```bash
afctl generate module -n <module_name>
```
The following directory structure will be generated:
```bash
afctl generate module -n first_module
afctl generate module -n second_module

.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
│       ├── first_module
│       └── second_module
├── requirements.txt
└── tests
    ├── first_module
    └── second_module
```
### 4. Generate dag
```bash
afctl generate dag -n <dag_name> -m <module_name>
```
The following directory structure will be generated:
```bash
afctl generate dag -n new -m first_module

.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
│       ├── first_module
│       │   └── new_dag.py
│       └── second_module
├── requirements.txt
└── tests
    ├── first_module
    └── second_module
```
The dag file will look like this:
```python
from airflow import DAG
from datetime import datetime, timedelta

default_args = {
    'owner': 'project_demo',
    # 'depends_on_past': ,
    # 'start_date': ,
    # 'email': ,
    # 'email_on_failure': ,
    # 'email_on_retry': ,
    # 'retries': 0,
}

dag = DAG(dag_id='new', default_args=default_args, schedule_interval='@once')
```

### 5. Deploy project locally
You can add Python packages that your DAGs will require to `requirements.txt`. They will be installed automatically during deployment.
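A minimal sketch; the packages and versions below are only examples of what your DAGs might need:

```bash
# Append your DAGs' dependencies to requirements.txt (illustrative pins)
cat >> requirements.txt <<'EOF'
requests==2.22.0
pandas==0.25.3
EOF
```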
* To deploy your project, run the following command (make sure Docker is running):
```bash
afctl deploy local
```
If you do not want to see the logs, you can run:
```bash
afctl deploy local -d
```
This will run it in detached mode and won't print the logs on the console.

* You can access your Airflow webserver in the browser at `localhost:8080`.
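An optional sanity check once the deployment is up (container names come from the generated compose file, so they may differ):

```bash
docker ps                        # the Airflow containers started by the compose file should be listed
curl -I http://localhost:8080    # the Airflow webserver should respond here
```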
### 6. Deploy project on production
* Here we will be deploying our project to **Qubole**. Sign up at us.qubole.com.
* Add the git origin and access token (if you want to keep the project as a private repo on GitHub) to the configs. [See how](#manage-configurations)
* Push the project to GitHub once completed.
* Deploying to Qubole will require adding deployment configurations:
```bash
afctl config add -d qubole -n <name> -e <env> -c <cluster> -t <token>
```
This command will modify your config file. You can see your config file with the following command:
```bash
afctl config show
```
For example:
```bash
afctl config add -d qubole -n demo -e https://api.qubole.com -c airflow_1102 -t khd34djs3
```

* To deploy, run the following command:
```bash
afctl deploy qubole -n <deployment_name>
```
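Continuing the illustrative configuration added above (deployment name `demo`), the command would be:

```bash
afctl deploy qubole -n demo
```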
### The following video also walks through all the steps of deploying a project using afctl:
https://www.youtube.com/watch?v=A4rcZDGtJME&feature=youtu.be

## Manage configurations
The configuration file used for deployment contains the following information:
```yaml
global:
  airflow_version:
  git:
    origin:
    access-token:
deployment:
  qubole:
  local:
    compose:
```

* `airflow_version` can be added to the project when you initialize the project:
```bash
afctl init <project_name> -v <airflow_version>
```

* Global configs (airflow_version, origin, access-token) can all be added or updated with the following command:
```bash
afctl config global -o <origin> -t <access-token> -v <airflow_version>
```
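For example, with purely illustrative values (the origin, token, and Airflow version below are placeholders, not defaults):

```bash
afctl config global -o https://github.com/your-user/project_demo.git -t yourGithubAccessToken -v 1.10.4
```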
## Usage

Commands supported right now are:
* init
* config
* deploy
* list
* generate

To learn more, run:
```bash
afctl -h
```

### Caution
Not yet ported to Windows.

#### Credits

Docker-compose file: https://github.com/puckel/docker-airflow