https://github.com/bufferapp/dbt-bigquery-auditlog
Get an overview of all cost and performance data for your BigQuery project
- Host: GitHub
- URL: https://github.com/bufferapp/dbt-bigquery-auditlog
- Owner: bufferapp
- License: MIT
- Created: 2020-01-24T09:29:39.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2020-06-11T08:31:46.000Z (over 5 years ago)
- Last Synced: 2025-05-26T05:13:59.193Z (8 months ago)
- Size: 8.79 KB
- Stars: 9
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# BigQuery Auditlog
Get an overview of all cost and performance data for your BigQuery project.
## Getting Started
To use the logs from BigQuery, we need to set up the [logging exports in GCP](https://cloud.google.com/bigquery/docs/reference/auditlogs/#overview):
1. Go to Stackdriver Logging _Logs Viewer_
2. Filter by `protoPayload.serviceName="bigquery.googleapis.com"` and Submit Filter
3. Click _Create Sink_, give it a name, select BigQuery as the destination, and choose the output dataset. Also check the _Use Partitioned Tables_ option to improve performance and reduce costs in the long term.
Alternatively, you can create the logging sink with the Google Cloud SDK by running this command (replace the placeholders with your sink name, project ID, and destination dataset):
```bash
gcloud beta logging sinks create <SINK_NAME> bigquery.googleapis.com/projects/<PROJECT_ID>/datasets/<DATASET_ID> --log-filter='protoPayload.serviceName="bigquery.googleapis.com"'
```
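When the sink is created from the command line, its writer identity usually still needs permission to create tables in the destination dataset. The snippet below is a minimal sketch, reusing the placeholder names from the command above; a project-level `roles/bigquery.dataEditor` grant is shown for brevity, but a dataset-level grant works as well.
```bash
# Look up the service account the sink writes as (placeholders as above).
WRITER_IDENTITY="$(gcloud logging sinks describe <SINK_NAME> --format='value(writerIdentity)')"

# Allow that identity to create and write tables in BigQuery.
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="${WRITER_IDENTITY}" \
  --role='roles/bigquery.dataEditor'

# After some BigQuery activity, the exported audit log tables should show up here.
bq ls <PROJECT_ID>:<DATASET_ID>
```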
### Installation
Include the following in your `packages.yml` file:
```yml
packages:
  - git: "https://github.com/bufferapp/dbt-bigquery-auditlog.git"
    revision: 1.0.1
```
Run `dbt deps` to install the package.
Add the source tables to your `dbt_project.yml`, pointing each variable at the corresponding exported audit log table (`TABLE` below is a placeholder):
```yml
vars:
  "bigquery_auditlog_dataset.data_access": TABLE
  "bigquery_auditlog_dataset.activity": TABLE
```
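With the variables in place, the package's models can be built like any other dbt models. The selector below is a sketch that assumes the package is named `bigquery_auditlog` in its own `dbt_project.yml`; adjust the name if it differs.
```bash
# Install the package and build only its models (package name assumed).
dbt deps
dbt run --models "bigquery_auditlog.*"
```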