Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/shegzimus/de_nasa_neow_pipeline
Airflow-powered ETL pipeline for moving Near-Earth Object data from NASA to Google Cloud
- Host: GitHub
- URL: https://github.com/shegzimus/de_nasa_neow_pipeline
- Owner: Shegzimus
- License: apache-2.0
- Created: 2024-11-20T05:41:53.000Z (about 1 month ago)
- Default Branch: main
- Last Pushed: 2024-11-27T16:39:27.000Z (about 1 month ago)
- Last Synced: 2024-11-27T17:33:58.270Z (about 1 month ago)
- Topics: airflow-dag, airflow-operator, airflow-providers, bigquery, celery-redis, docker, docker-compose, docker-container, google-cloud-platform, googlecloudstorage, nasa-api
- Language: Python
- Homepage:
- Size: 855 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Table of Contents
- [Table of Contents](#table-of-contents)
- [Motivation and Objectives](#motivation-and-objectives)
- [Overview](#overview)
- [Architecture](#architecture)
- [New Personal Insights](#new-personal-insights)
- [Prerequisites](#prerequisites)
- [System Configuration](#system-configuration)

## Motivation and Objectives
## Overview
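The source data comes from NASA's Near Earth Object Web Service (NeoWs). Below is a minimal sketch of the kind of feed request the extract step issues, assuming only a free api.nasa.gov key; the endpoint, parameters, and `element_count` field are NeoWs facts, while the dates are arbitrary examples:

```python
import requests

# NeoWs feed endpoint: near-earth objects grouped by closest-approach date.
# The feed accepts a window of at most 7 days.
NEOWS_FEED = "https://api.nasa.gov/neo/rest/v1/feed"

params = {
    "start_date": "2024-11-20",
    "end_date": "2024-11-26",
    "api_key": "DEMO_KEY",  # replace with your own key from https://api.nasa.gov
}

resp = requests.get(NEOWS_FEED, params=params, timeout=30)
resp.raise_for_status()
feed = resp.json()
print(feed["element_count"], "near-earth objects in the window")
```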
## Architecture
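No diagram survives here, but the repository topics (airflow-dag, googlecloudstorage, bigquery) point to an extract → Cloud Storage → BigQuery flow. The sketch below shows how such a DAG could be wired with the official Google provider operators; the `dag_id`, bucket, dataset, and file names are placeholders, not values taken from this repository:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.transfers.local_to_gcs import (
    LocalFilesystemToGCSOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)


def extract_neows(**context):
    """Placeholder: fetch the NeoWs feed and write NDJSON to /tmp/neows.json."""


with DAG(
    dag_id="nasa_neow_pipeline",  # placeholder name
    start_date=datetime(2024, 11, 20),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_neows", python_callable=extract_neows)

    upload = LocalFilesystemToGCSOperator(
        task_id="upload_to_gcs",
        src="/tmp/neows.json",
        dst="raw/neows.json",
        bucket="your-gcs-bucket",  # placeholder bucket
    )

    load = GCSToBigQueryOperator(
        task_id="load_to_bigquery",
        bucket="your-gcs-bucket",
        source_objects=["raw/neows.json"],
        destination_project_dataset_table="your_project.your_dataset.neows",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )

    extract >> upload >> load
```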
## New Personal Insights
## Prerequisites
## System Configuration
1. Clone the repository
```bash
git clone https://github.com/Shegzimus/DE_NASA_NeoW_Pipeline
```

2. Create a virtual environment on your local machine

```bash
python3 -m venv venv
```

3. Activate the virtual environment

```bash
source venv/bin/activate
```

4. Install dependencies

```bash
pip install -r airflow/requirements.txt
```

5. Create a directory to store your Google credentials

```bash
cd airflow && mkdir -p .google   # the .google directory will hold your Google Cloud credential file(s)
```
9. Build the Docker Image
```bash
docker build -t nasa-neow-airflow .   # tag name is illustrative; match the image name your docker-compose.yaml expects
```

10. Start the Docker containers

```bash
docker-compose up -d
```

11. Launch the Airflow web UI

```bash
open http://localhost:8083   # macOS; on Linux use xdg-open, or paste the URL into a browser
```
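Once the containers are healthy, you can also trigger a run without the UI. This is a minimal sketch against Airflow's stable REST API, assuming basic-auth is enabled with the default airflow/airflow credentials from the stock docker-compose setup and a hypothetical `nasa_neow_pipeline` dag_id:

```python
import requests

BASE_URL = "http://localhost:8083/api/v1"
DAG_ID = "nasa_neow_pipeline"  # hypothetical; list real DAGs with `airflow dags list`

resp = requests.post(
    f"{BASE_URL}/dags/{DAG_ID}/dagRuns",
    auth=("airflow", "airflow"),  # default credentials; change if you customized them
    json={"conf": {}},
)
resp.raise_for_status()
print("Triggered run:", resp.json()["dag_run_id"])
```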