Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Fully functional DAG using Airflow 2 and minikube (locally) to help monitor GCP billing.
- Host: GitHub
- URL: https://github.com/tomgorb/some-data-monitoring
- Owner: tomgorb
- Created: 2024-07-24T08:58:18.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2024-07-24T09:51:26.000Z (4 months ago)
- Last Synced: 2024-09-25T23:01:29.362Z (about 2 months ago)
- Topics: airflow2, bigquery, gcp, minikube
- Language: Python
- Homepage:
- Size: 8.79 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# SOME-DATA-MONITORING β
### BIGQUERY
- Query data from `region-eu.INFORMATION_SCHEMA.JOBS_BY_PROJECT` (a query sketch follows this list)
- OUTPUT: VIEW `some_data_monitoring.BigQuery` (BigQuery.sql)
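For illustration, here is a minimal Python sketch of such a query using the `google-cloud-bigquery` client. The project ID and the columns selected are assumptions for the sketch; the repository's actual query lives in BigQuery.sql.
```python
from google.cloud import bigquery

# Hypothetical sketch: pull recent job metadata from the EU region's
# INFORMATION_SCHEMA. The project ID and column choice are assumptions.
client = bigquery.Client(project="my-gcp-project")  # assumed project ID
sql = """
    SELECT creation_time, user_email, job_type, total_bytes_billed
    FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
"""
for row in client.query(sql).result():
    print(row.user_email, row.job_type, row.total_bytes_billed)
```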
### GCP
> You must have completed [Set up Cloud Billing data export to BigQuery](https://cloud.google.com/billing/docs/how-to/export-data-bigquery-setup) first.
- Get *preprocessed* data from the Billing export (a reading sketch follows this list).
- FREQUENCY: Once an hour.
- OUTPUT: `some_data_monitoring.billing`
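As a companion sketch, reading the standard billing export table might look like this in Python; the dataset name and billing-account suffix are placeholders, and the repository's actual preprocessing is not shown on this page.
```python
from google.cloud import bigquery

# Hypothetical sketch: aggregate daily cost from the standard Cloud Billing
# export table. Dataset name and table suffix below are placeholders.
client = bigquery.Client()
sql = """
    SELECT DATE(usage_start_time) AS day, SUM(cost) AS total_cost
    FROM `my_billing_dataset.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
    GROUP BY day
    ORDER BY day DESC
    LIMIT 30
"""
for row in client.query(sql).result():
    print(row.day, row.total_cost)
```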
## HOW TO
### LOCALLY
To test the code locally, you need to export this environment variable:
```shell
export GCP_SA=$(cat secret/some-data-monitoring.json)
```
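Presumably the tasks build their GCP credentials from this variable; here is a minimal sketch of how that might look, assuming the key JSON is parsed with the standard `google-auth` library (only the variable name `GCP_SA` comes from the README):
```python
import json
import os

from google.cloud import bigquery
from google.oauth2 import service_account

# Hypothetical sketch: turn the GCP_SA environment variable (the service
# account key JSON exported above) into credentials for a BigQuery client.
info = json.loads(os.environ["GCP_SA"])
credentials = service_account.Credentials.from_service_account_info(info)
client = bigquery.Client(credentials=credentials, project=info["project_id"])
```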
You have to create a local `venv` environment in which to run the code:
```shell
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install -r src/requirements.txt
```
Then you can run the following command:
```shell
$ python src/[TASK].py --conf conf/some-data-monitoring.yaml
```
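The README does not show what a task looks like inside. A plausible minimal skeleton for a `src/[TASK].py` entry point, assuming the tasks parse `--conf` as YAML (only the flag name comes from the command above; everything else is hypothetical):
```python
import argparse

import yaml  # PyYAML, assumed to be listed in src/requirements.txt


def main() -> None:
    # Hypothetical sketch of a task entry point: parse the --conf flag
    # shown above and load the YAML configuration it points to.
    parser = argparse.ArgumentParser()
    parser.add_argument("--conf", required=True, help="path to the YAML config")
    args = parser.parse_args()
    with open(args.conf) as f:
        conf = yaml.safe_load(f)
    print(conf)  # placeholder for the task's actual monitoring logic


if __name__ == "__main__":
    main()
```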
### END TO END ON MINIKUBE
Make sure you have Airflow 2 (`pip install "apache-airflow[kubernetes]"`) and Minikube ([minikube_latest_amd64.deb](https://storage.googleapis.com/minikube/releases/latest/minikube_latest_amd64.deb)) installed on your machine.
Start your Airflow webserver and scheduler:
```shell
$ airflow webserver -p PORT
$ airflow scheduler
```
From the project root directory, start minikube, create the namespace, ConfigMap, and Secret, deploy the DAG, and build the code:
```shell
$ ./dev/deploy_local.sh full
```
Open the Airflow webserver at http://127.0.0.1:PORT/home.
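The DAG itself is not shown on this page. As a rough illustration of the pattern the project describes (an Airflow 2 DAG running tasks on minikube), here is a hedged sketch using `KubernetesPodOperator`; the namespace, image, schedule, and task name are all assumptions, not the repository's actual configuration:
```python
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

# Hypothetical sketch of a monitoring DAG: run a task image on the local
# minikube cluster once an hour. Every name below is an assumption.
with DAG(
    dag_id="some_data_monitoring",
    start_date=datetime(2024, 7, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    billing = KubernetesPodOperator(
        task_id="billing",
        name="billing",
        namespace="some-data-monitoring",     # assumed namespace
        image="some-data-monitoring:latest",  # assumed image from deploy_local.sh
        cmds=["python", "src/billing.py",     # assumed task module
              "--conf", "conf/some-data-monitoring.yaml"],
    )
```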