Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ajschofield/de-project-bentley
A project demonstrating an ETL pipeline primarily using AWS infrastructure into a data warehouse
aws eventbridge lambda northcoders python rds-postgres step-functions terraform
- Host: GitHub
- URL: https://github.com/ajschofield/de-project-bentley
- Owner: ajschofield
- Archived: true
- Created: 2024-08-12T08:36:45.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-09-03T15:25:03.000Z (5 months ago)
- Last Synced: 2024-09-28T13:21:31.142Z (5 months ago)
- Topics: aws, eventbridge, lambda, northcoders, python, rds-postgres, step-functions, terraform
- Language: Python
- Homepage:
- Size: 372 KB
- Stars: 2
- Watchers: 1
- Forks: 4
- Open Issues: 4
Metadata Files:
- Readme: README.md
README
> [!NOTE]
> Considering that myself and my team have graduated from the Northcoders Data Engineering course, this project will be archived and made read-only.
> I will be continuing this project solo, which you can find [here](https://github.com/ajschofield/ETL-Project), where I will be adding more features
> over time.

# ToteSys - Data Engineering Project
[![Python](https://img.shields.io/badge/Python-FFD43B?style=for-the-badge&logo=python&logoColor=blue)](https://www.python.org/)
[![AWS](https://img.shields.io/badge/Amazon_AWS-FF9900?style=for-the-badge&logo=amazonaws&logoColor=white)](https://aws.amazon.com/)
[![Terraform](https://img.shields.io/badge/Terraform-7B42BC?style=for-the-badge&logo=terraform&logoColor=white)](https://www.terraform.io/)
[![Postgresql](https://img.shields.io/badge/PostgreSQL-316192?style=for-the-badge&logo=postgresql&logoColor=white)](https://www.postgresql.org/)
[![GitHub Actions](https://img.shields.io/badge/GitHub_Actions-2088FF?style=for-the-badge&logo=github-actions&logoColor=white)](https://github.com/features/actions)
[![Terraform Main Deployment Workflow Status](https://img.shields.io/github/actions/workflow/status/ajschofield/de-project-bentley/deploy.yml?branch=main&style=flat-square&label=deploy)](https://github.com/ajschofield/de-project-bentley/actions/workflows/deploy.yml?query=branch%3Amain)
[![Production Environment Status](https://img.shields.io/github/deployments/ajschofield/de-project-bentley/production?style=flat-square&label=env)](https://github.com/ajschofield/de-project-bentley/deployments/production)

# Contributors
- Ellie Symonds
- Lianmei Manon-og
- Tolu Ajibade
- Joslin Rashleigh
- Anzelika Belotelova
- Alex Schofield
# Summary
The project aims to implement a data platform that can extract data from an
operational database, archive it in a data lake, and make it easily accessible
within a remodelled OLAP data warehouse.

The solution showcases our skills in:
- Python
- PostgreSQL
- Database modelling
- Amazon Web Services (AWS)
- Agile methodologies

# Main Objectives
Our goal is to create a reliable ETL (Extract, Transform, Load) pipeline that can:

1. Extract the data from the `totesys` operational database
2. Store the data in AWS S3 buckets, which will form our data lake
3. Transform the data into a suitable schema for the data warehouse
4. Load the transformed data into the data warehouse hosted on AWS

# Key Features
We aim for the project to have the following features, some prioritised more highly than others:

- Automated data ingestion from the `totesys` database
- Data storage for ingested and processed data in S3 buckets
- Data transformation for data warehouse schema
- Automated data loading into the data warehouse schema
- Logging and monitoring with CloudWatch
- Notifications for errors and successful runs (e.g. successful ingestion)
- Visualisation of warehouse data
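As one illustration of the transform step (objective 3 above), the sketch below reshapes raw operational rows into a denormalised dimension table for the warehouse schema. The table and column names (`staff`, `department`, `dim_staff`) are assumptions modelled on a typical star-schema remodelling, not taken from this repository's code:

```python
def transform_staff(raw_staff: list[dict], departments: list[dict]) -> list[dict]:
    """Join raw staff rows with their departments to build a flat dim_staff table.

    Hypothetical example: column names are assumed, not from the repo.
    """
    # Index departments by primary key for O(1) lookup during the join
    dept_by_id = {d["department_id"]: d for d in departments}

    dim_rows = []
    for row in raw_staff:
        dept = dept_by_id.get(row["department_id"], {})
        dim_rows.append(
            {
                "staff_id": row["staff_id"],
                "first_name": row["first_name"],
                "last_name": row["last_name"],
                # Denormalised columns pulled in from the department table
                "department_name": dept.get("department_name"),
                "location": dept.get("location"),
                "email_address": row["email_address"],
            }
        )
    return dim_rows
```

In a pipeline like the one described, a Lambda would typically read the raw rows from the ingestion S3 bucket, apply a transform of this shape per table, and write the result to the processed bucket for loading.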