Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/entur/terraform-aiven-kafka-connect-bigquery-sink
Terraform module for BigQuery sink connector on Aiven KafkaConnect cluster
aiven bigquery kafka-connect sink-connector terraform terraform-modules
- Host: GitHub
- URL: https://github.com/entur/terraform-aiven-kafka-connect-bigquery-sink
- Owner: entur
- Created: 2022-10-10T12:07:04.000Z (over 2 years ago)
- Default Branch: master
- Last Pushed: 2023-10-03T12:28:21.000Z (over 1 year ago)
- Last Synced: 2024-11-17T02:13:42.812Z (3 months ago)
- Topics: aiven, bigquery, kafka-connect, sink-connector, terraform, terraform-modules
- Language: HCL
- Homepage:
- Size: 43.9 KB
- Stars: 0
- Watchers: 7
- Forks: 0
- Open Issues: 2
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Codeowners: .github/CODEOWNERS
README
# Google BigQuery sink connector module
A Terraform module for provisioning a Google BigQuery sink connector onto an Aiven-managed Kafka Connect cluster.
This module depends on the [Aiven kafka init module](https://github.com/entur/terraform-aiven-kafka-connect-init) to access
basic information about Aiven's Kafka Connect cluster.

[Module](modules/bigquery-sink)
[Example](examples/minimal/main.tf)
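As a rough sketch, wiring the module into a configuration might look like the following (the input names `dataset_id` and `service_account_id` are illustrative assumptions; consult the module's README for the actual inputs):

```
module "bigquery-sink" {
  source = "github.com/entur/terraform-aiven-kafka-connect-bigquery-sink//modules/bigquery-sink?ref=v0.2.1"

  # Illustrative inputs only -- the real variable names are listed
  # in modules/bigquery-sink/README.md.
  dataset_id         = "my_dataset"
  service_account_id = "connector-sa@my-project.iam.gserviceaccount.com"
}
```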
## Getting started
Access to the Aiven Terraform provider requires an API authentication token, which can be generated
in the [Aiven console](https://console.gcp.aiven.io/profile/auth).
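For instance, the token might be declared and wired to the provider like this (a sketch; the variable name `aiven_api_token` is an assumption, not taken from this module):

```
# Assumed variable name for illustration; mark it sensitive so the
# token is not echoed in plan output.
variable "aiven_api_token" {
  type      = string
  sensitive = true
}

provider "aiven" {
  api_token = var.aiven_api_token
}
```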
The Aiven authentication token can be provided as an environment variable with the `TF_VAR_` prefix or in a `.tfvars` file;
otherwise it is fetched from Harness Secrets Manager if you are provisioning from Harness.

### BigQuery configuration
To sink data into BigQuery you need a BigQuery project and a dataset created beforehand, as well as a service account
with BigQuery Editor access so that tables can be created inside that dataset.

1. When `service_account_id` is provided, each connector adds a key to that service account and passes the key as
   JSON to the connector for authentication. The key is destroyed along with the connector.
2. When `key_file` is provided, the connector does not create a new key but uses the provided one.
3. When both are provided, option 1 applies.
4. When neither is provided, the connector does not sink any data.

### Example using the latest release
```
module "bigquery-sink" {
  source = "github.com/entur/terraform-aiven-kafka-connect-bigquery-sink//modules/bigquery-sink?ref=v0.2.1"
  ...
}
```

See the [`README.md`](modules/bigquery-sink/README.md) under the module's subfolder for a list of supported inputs and
outputs. For examples showing how they're implemented, check the [examples](examples) subfolder.

### Version constraints
You can control the version of a module dependency by adding `?ref=TAG` at the end of the source argument, as shown in
the example above. This is highly recommended. You can find a list of available
versions [here](https://github.com/entur/terraform-aiven-kafka-connect-bigquery-sink/releases).

Dependency automation tools such as Renovate Bot are able to discover new releases and suggest updates automatically.

## Reference(s)
- [Aiven's Google BigQuery sink connector](https://docs.aiven.io/docs/products/kafka/kafka-connect/howto/gcp-bigquery-sink)
- [Git repo for Kafka Connect BigQuery Connector](https://github.com/confluentinc/kafka-connect-bigquery)