Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/sudoblark/sudoblark.terraform.modularised-demo
An example Terraform setup using modularised components to fulfill a use-case - repo managed by sudoblark.terraform.github
- Host: GitHub
- URL: https://github.com/sudoblark/sudoblark.terraform.modularised-demo
- Owner: sudoblark
- License: bsd-3-clause
- Created: 2024-10-04T13:27:09.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2024-12-03T15:12:11.000Z (about 2 months ago)
- Last Synced: 2024-12-03T16:23:06.077Z (about 2 months ago)
- Topics: aws, demo, iac, terraform
- Language: HCL
- Homepage:
- Size: 399 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
- Codeowners: .github/CODEOWNERS
Awesome Lists containing this project
README
sudoblark.terraform.modularised-demo
An example Terraform setup using modularised components to fulfill a use-case - repo managed by sudoblark.terraform.github
## About The Project
This repo is simply a demo of how a modularised Terraform setup may be utilised
in a micro-repo fashion, i.e. one repo per business case. Its counterpart may be considered to be [sudoblark.terraform.github](https://github.com/sudoblark/sudoblark.terraform.github),
which is an example mono-repo to manage all aspects of a single SaaS product in one place.

For now, the repo is intended to be used in workshops/conferences to demonstrate a data-structure-driven approach
to Terraform.

### Built With
#### Infrastructure
* [Terraform v1.5.1](https://github.com/hashicorp/terraform/releases/tag/v1.5.1)
* [tfenv](https://github.com/tfutils/tfenv)
* [awscli](https://aws.amazon.com/cli/)

#### Application code
* [Python 3.10](https://peps.python.org/pep-0619/)
* [Black](https://black.readthedocs.io/en/stable/)
* [Flake8](https://flake8.pycqa.org/en/latest/index.html)

## Getting Started
Below we outline how to interact with both the Infrastructure and Application code bases.
The repo structure is relatively simple:
- `application` is the top-level folder for application code. Subfolders in here should be laid out
  such that Python apps follow their respective best practices, and we
  have a single source of truth for state machine JSON etc.
- `infrastructure` contains both:
  - `example-account` folders, one per account, which instantiate modules
  - a `modules` folder to act as a top-level for re-usable Terraform modules

This repo is intended to be used for demonstration purposes when delivering
conferences, but is also made public such that conference attendees may
query it in their own time as well.

### Prerequisites
Note: the below instructions are for macOS only; alterations may be required
to get this working on other operating systems.

* tfenv
```sh
git clone https://github.com/tfutils/tfenv.git ~/.tfenv
echo 'export PATH="$HOME/.tfenv/bin:$PATH"' >> ~/.bash_profile
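# Reload your shell profile so tfenv is on PATH, then install and select the
# Terraform version this demo targets (v1.5.1, per "Built With" above):
source ~/.bash_profile
tfenv install 1.5.1
tfenv use 1.5.1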
```

* awscli
```sh
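# Note: the URL below fetches the Linux x86_64 build of the AWS CLI; on macOS
# you may prefer the official .pkg installer linked from the awscli page above.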
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
./aws/install
```

* Virtual environment with pre-commit installed
```sh
python3 -m venv venv
source venv/bin/activate
pip install pre-commit
```

* Poetry
```sh
pip install -U pip setuptools
pip install poetry
```
### Pre-commit hooks

Various pre-commit hooks are in place in order to ensure consistency across the codebases. You may run these yourself as follows:
```sh
source venv/bin/activate
pip install pre-commit
pre-commit run --all-files
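# Optionally install the hooks into .git/hooks so they also run on every commit:
pre-commit install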
```

## Architecture
The below sections outline architecture diagrams and explanations in order to better understand just _what_ this
demo repo does, both in terms of infrastructure and application workflows.

### Infrastructure
```mermaid
architecture-beta
group demo(cloud)[AWS Account]

service unzipLambda(server)[Unzip lambda] in demo
service rawBucket(database)[raw bucket] in demo
service processedBucket(database)[processed bucket] in demo
service bucketNotification(disk)[Bucket notification] in demo

rawBucket:R -- L:bucketNotification
bucketNotification:R -- L:unzipLambda
unzipLambda:R -- L:processedBucket
```

Note: `s3_files` module, as you can see, is not needed for ETL. It is instead included
as it's one of the simplest data-driven modules you can have, and therefore serves
as a simple, but useful, example in a conference/workshop setting.

### Workflows
```mermaid
---
title: Demo workflow
---
flowchart TD
start((.ZIP uploaded to s3://raw/dogs/landing prefix))
bucketNotification(Bucket notification)

subgraph UnzipLambda
lambdaStart(For each file in .ZIP)
startPartition(Grabs year/month/day from filename)
uploads(Uploads to s3://processed/dogs/daily/)
end
sns[\Send failure notification to topic/]
endNode((Processed complete))

start -- triggers --> bucketNotification
bucketNotification -- triggers --> UnzipLambda

lambdaStart --> startPartition
startPartition --> uploads
uploads -- If all succeed --> endNode
uploads -- If fail --> sns
sns --> endNode
```

## Usage
Below we will outline intended use-cases for the repository.
Note: This section assumes you've installed pre-requisites as
per above.

### Deploying Terraform
The `main.tf` file in `example-account` is left deliberately blank, such
that this may be instantiated in any AWS Infrastructure required for
demonstration or learning purposes.

Simply:
1. Navigate to the instantiation folder:
```sh
cd infrastructure/example-account
```

2. Ensure your shell is authenticated to an appropriate profile for AWS
```sh
export AWS_DEFAULT_PROFILE=
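# Optional sanity check that the profile resolves to valid credentials:
aws sts get-caller-identity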
```

3. ZIP the lambda (Note: in a production environment this would usually be done via CI/CD)
```sh
cd application/unzip-lambda/unzip_lambda
zip -r lambda.zip lambda_function.py
mkdir src
mv lambda.zip src
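# Note: the cd above assumes you start from the repo root. Before the next step,
# change back to the Terraform instantiation folder (path assumed from the repo layout):
cd ../../../infrastructure/example-account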
```

4. Init, plan and then apply.
```sh
terraform init
terraform plan
terraform apply
```

5. Simply tear-down when required
```sh
terraform destroy
```

### Processing dummy files
Files under `application/dummy_uploads` contain the contents of the ZIP file our unzip lambda
unzips from the `raw` to `processed` bucket.

File names are in the format `YYYYmmdd.csv`. Each file corresponds to dogs viewed that day, with rows
being in the format of:

```
dog_name, breed, location
```
For example, we may have a file named 20241008.csv with a single row:

```
Cerberus, Molossus, Hades
```

This indicates that on the 8th of October 2024, we spotted Cerberus doing a valiant job guarding the gates to Hades.
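If you want to add an extra dummy file of your own, any CSV matching this naming convention and row format should work. A quick sketch, with an arbitrary example row and the `application/dummy_uploads` path taken from the repo layout above:

```sh
# One viewing of Fenrir on 2024-10-09 (arbitrary example data)
echo "Fenrir, Wolf, Asgard" > application/dummy_uploads/20241009.csv
```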
To upload these dummy files after the solution has been deployed into AWS:
1. First ZIP the files:
```sh
cd application/dummy_uploads
zip -r uploads.zip .
```

2. Then, upload the ZIP to the `dev-raw` bucket at the `dogs/landing/` prefix:
```sh
aws s3 rm s3://dev-demo-raw/dogs/landing/uploads.zip
aws s3 cp ./application/dummy_uploads/uploads.zip s3://dev-demo-raw/dogs/landing/
```
3. This should then trigger an s3 bucket notification to run the lambda.
4. This lambda, in turn, should unzip the files and re-upload them into the `dev-processed` bucket,
at the `dogs/daily` root with a prefix determined by date, i.e. for 20241008.csv we'd expect
an upload at `dogs/daily/_year=2024/_month=10/_day=08/viewings`, which you can verify as shown below.
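To check the output once the lambda has run, you can list the processed bucket. A minimal sketch, assuming the processed bucket follows the same `dev-demo-*` naming pattern as the raw bucket used above; adjust the bucket name to match your own instantiation:

```sh
# List everything the unzip lambda has written under the daily prefix
aws s3 ls s3://dev-demo-processed/dogs/daily/ --recursive
```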