Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/nicor88/dbt-serverless
Run dbt serverless in the Cloud (AWS)
aws cloud dbt ecs fargate serverless terraform
JSON representation
- Host: GitHub
- URL: https://github.com/nicor88/dbt-serverless
- Owner: nicor88
- License: mit
- Created: 2019-08-28T10:07:14.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-01-20T17:52:52.000Z (almost 5 years ago)
- Last Synced: 2024-12-02T19:08:29.314Z (25 days ago)
- Topics: aws, cloud, dbt, ecs, fargate, serverless, terraform
- Language: HCL
- Size: 2.72 MB
- Stars: 39
- Watchers: 4
- Forks: 13
- Open Issues: 2
- Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# dbt-serverless
Run dbt serverless in the Cloud (AWS)

## Requirements
* aws credentials configured in `~/.aws/credentials`
* aws cli (`pip install awscli`)
* terraform

## Deploy
The infrastructure is based on Terraform.
I set up a Terraform backend to keep the Terraform state. The backend is based on an S3 bucket that was created manually.
You can create an S3 bucket by simply running:

```sh
aws s3api create-bucket --bucket nicor88-eu-west-1-terraform --region eu-west-1 --create-bucket-configuration LocationConstraint=eu-west-1
```
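For reference, the backend configuration in `infrastructure/provider.tf` looks roughly like the sketch below; the `key` value here is an assumption, and the bucket name must match the bucket created above:

```hcl
# Sketch of an S3 backend block; the key below is an assumption,
# and the bucket must be the one created in the previous step.
terraform {
  backend "s3" {
    bucket = "nicor88-eu-west-1-terraform"
    key    = "dbt-serverless/terraform.tfstate"
    region = "eu-west-1"
  }
}
```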
Remember to change the name of the S3 bucket inside `infrastructure/provider.tf` before running the following commands:
```sh
export AWS_PROFILE=your_profile
make infra-plan
make infra-apply
```

After the infrastructure is created correctly, you can push a new image to the ECR repository by running:
```sh
make push-to-ecr AWS_ACCOUNT_ID=your_account_id
```
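The target of this push is the ECR repository created by the Terraform code. As a rough sketch (the resource label and repository name below are assumptions, not taken from this project), such a repository can be declared as:

```hcl
# Hypothetical ECR repository; the resource label and name are assumptions.
resource "aws_ecr_repository" "dbt" {
  name = "dbt-serverless"
}
```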
### Note

Currently Aurora Postgres is only accessible inside the VPC.
I created a Network Load Balancer to connect to the DB from everywhere, but you need to get the private IP of the Aurora endpoint.
You can simply run:

```sh
nslookup your_aurora_endpoint
# the endpoint is returned from the terraform outputs
```
Then you need to replace the 2 variables:

* autora_postgres_serverless_private_ip_1
* autora_postgres_serverless_private_ip_2

and apply the changes again with the command `make infra-apply` (see the sketch below).
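For illustration only, the private IPs returned by `nslookup` could be plugged in like this; the file name and the exact place where these variables are defined are assumptions, and the IP values are placeholders:

```hcl
# terraform.tfvars (hypothetical location; adjust to wherever these variables live)
autora_postgres_serverless_private_ip_1 = "10.0.1.23" # placeholder private IP
autora_postgres_serverless_private_ip_2 = "10.0.2.47" # placeholder private IP
```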
## Infrastructure
### AWS Step Function
#### Input example
```json
{
  "commands1": [
    "dbt",
    "run",
    "--models",
    "example"
  ],
  "commands2": [
    "dbt",
    "run",
    "--models",
    "just_another_example"
  ]
}
```
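To show how such an input can drive a dbt run on Fargate, below is a rough sketch of a state machine whose task passes one of the command lists to the dbt container as an ECS RunTask override. All resource labels, the container name, cluster, task definition and network settings are assumptions and are not taken from this repository's Terraform code:

```hcl
# Sketch only: resource labels, container name, cluster, task definition and
# network settings are assumptions, not taken from this repository.
resource "aws_sfn_state_machine" "dbt_run" {
  name     = "dbt-run"
  role_arn = aws_iam_role.step_function.arn # assumed IAM role defined elsewhere

  definition = jsonencode({
    StartAt = "RunDbtCommands1"
    States = {
      RunDbtCommands1 = {
        Type     = "Task"
        Resource = "arn:aws:states:::ecs:runTask.sync"
        Parameters = {
          Cluster        = aws_ecs_cluster.dbt.arn         # assumed ECS cluster
          TaskDefinition = aws_ecs_task_definition.dbt.arn # assumed dbt task definition
          LaunchType     = "FARGATE"
          NetworkConfiguration = {
            AwsvpcConfiguration = {
              Subnets        = ["subnet_id_1", "subnet_id_2"]
              AssignPublicIp = "ENABLED"
            }
          }
          Overrides = {
            # "Command.$" takes the dbt command list from the execution input,
            # e.g. the "commands1" array shown in the input example above.
            ContainerOverrides = [
              {
                Name        = "dbt"
                "Command.$" = "$.commands1"
              }
            ]
          }
        }
        End = true
      }
    }
  })
}
```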
## Airflow operator
It's also possible to invoke ECS Fargate containers to run dbt from Airflow.
Here is an example of how to call a DbtOperator from Airflow:
```python
dbt_run_example = DbtOperator(
    dag=dag,
    task_id='dbt_example',
    command='run',
    target='dev',
    dbt_models='my_example',
    subnets=['subnet_id_1', 'subnet_id_2'],
    security_groups=['sg_1']
)
```